cs.LO
arXiv:2501.12007v1 Announce Type: cross Abstract: We introduce a quantum analogue of classical first-order logic (FO) and develop a theory of quantum first-order logic as a basis for productive discussion of the power of logical expressiveness in quantum computing. The purpose of this work is to logically express "quantum computation" by introducing specially featured quantum connectives and quantum quantifiers that quantify over fixed-dimensional quantum states. Our approach is founded on the recently introduced recursion-theoretic schematic definitions of time-bounded quantum functions, which map finite-dimensional Hilbert spaces to themselves. The quantum first-order logic (QFO) of this work therefore looks quite different from the well-known older concept of quantum logic based on lattice theory. We demonstrate that quantum first-order logics can express bounded-error quantum logarithmic-time computability through the use of new "functional" quantum variables. In contrast, additionally including a quantum transitive closure operator lets us characterize quantum logarithmic-space computability. The same computability can also be achieved with different "functional" quantum variables.
arXiv:2403.15987v2 Announce Type: replace-cross Abstract: We define term rewriting systems on the vertices and faces of nestohedra, and show that the former are confluent and terminating. While the associated posets on vertices generalize Barnard--McConville's flip order for graph-associahedra, the preorders on faces generalize the facial weak order for permutahedra and the generalized Tamari order for associahedra. Moreover, we define and study contextual families of nestohedra, whose local confluence diagrams satisfy a certain uniformity condition. Among them are associahedra and operahedra, whose associated proofs of confluence for their rewriting systems reproduce proofs of categorical coherence theorems for monoidal categories and categorified operads.
arXiv:2501.11789v1 Announce Type: new Abstract: Nielsen transformations form the basis of a simple and widely used procedure for solving word equations. We make progress on the problem of determining when this procedure terminates in the presence of length constraints. To do this, we introduce extended word equations, a mathematical model of a word equation with partial information about length constraints. We then define extended Nielsen transformations, which adapt Nielsen transformations to the setting of extended word equations. We provide a partial characterization of when repeatedly applying extended Nielsen transformations to an extended word equation is guaranteed to terminate.
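As an illustration of the procedure this abstract builds on, here is a minimal Python sketch of classical Nielsen transformations on a word equation. The paper's extended variant additionally tracks partial length-constraint information, which this sketch omits; variable and constant names here are illustrative.

```python
# Illustrative sketch of classical Nielsen transformations on a word
# equation (NOT the paper's "extended" variant, which also carries
# partial length-constraint information). Variables are uppercase
# letters, constants lowercase; an equation is a pair of strings.

def nielsen_successors(lhs, rhs):
    """Yield the equations reachable by one Nielsen transformation."""
    if not lhs or not rhs:
        return
    a, b = lhs[0], rhs[0]
    if a == b:                       # identical first symbols: cancel them
        yield lhs[1:], rhs[1:]
        return
    for x, other in ((a, b), (b, a)):
        if x.isupper():              # x is a variable
            # Guess x = empty word: erase x everywhere.
            yield (lhs.replace(x, ""), rhs.replace(x, ""))
            # Guess x starts with `other`: substitute x -> other x,
            # after which the leading symbols can cancel.
            yield (lhs.replace(x, other + x), rhs.replace(x, other + x))

# Example: the equation Xa = aX branches into a = a (via X = empty)
# and aXa = aaX (via X -> aX).
succ = set(nielsen_successors("Xa", "aX"))
print(succ)
```

Termination questions arise precisely because the substitution x -> other x can grow the equation, which is what the paper's partial characterization addresses.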
arXiv:2501.11768v1 Announce Type: cross Abstract: This paper develops the model theory of normal modal logics based on partial "possibilities" instead of total "worlds," following Humberstone (1981) instead of Kripke (1963). Possibility semantics can be seen as extending to modal logic the semantics for classical logic used in weak forcing in set theory, or as semanticizing a negative translation of classical modal logic into intuitionistic modal logic. Thus, possibility frames are based on posets with accessibility relations, like intuitionistic modal frames, but with the constraint that the interpretation of every formula is a regular open set in the Alexandrov topology on the poset. The standard world frames for modal logic are the special case of possibility frames wherein the poset is discrete. We develop the beginnings of duality theory, definability/correspondence theory, and completeness theory for possibility frames.
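The regular-open constraint mentioned in the abstract can be checked by brute force on a small finite poset. The sketch below assumes the convention that opens are down-sets of the poset (conventions on the direction of the refinement order vary by presentation); the 3-element poset is a toy example.

```python
# Finite check of the "regular open" condition from possibility
# semantics: with opens taken as down-sets of a poset, a set U is
# regular open iff U = int(cl(U)), which unfolds to
#   U = {x : every y <= x has some z <= y with z in U}.
# The poset below (two minimal elements 0, 1 under a top element 2)
# is an illustrative choice.

from itertools import combinations

P = {0, 1, 2}
leq = {(x, x) for x in P} | {(0, 2), (1, 2)}   # 0 <= 2 and 1 <= 2

def below(x):
    """All y with y <= x."""
    return {y for y in P if (y, x) in leq}

def regular_open(U):
    return U == {x for x in P
                 if all(any(z in U for z in below(y)) for y in below(x))}

regopens = [set(c) for r in range(len(P) + 1)
            for c in combinations(sorted(P), r) if regular_open(set(c))]
print(regopens)
```

Note that {0, 1} is a down-set (hence open) but not regular open: its closure contains the top element, whose interior then covers the whole poset. This is the kind of constraint the frames in the paper impose on the interpretation of every formula.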
arXiv:2212.01679v5 Announce Type: replace Abstract: We show that the problem of whether a query is equivalent to a query of tree-width $k$ is decidable, for the class of Unions of Conjunctive Regular Path Queries with two-way navigation (UC2RPQs). A previous result by Barcel\'o, Romero, and Vardi [SIAM Journal on Computing, 2016] showed decidability for the case $k=1$; here we extend this result, showing that decidability in fact holds for any arbitrary $k\geq 1$. The algorithm is in 2ExpSpace, but for the restricted yet practically relevant case where all regular expressions of the query are of the form $a^*$ or $(a_1 + \dotsb + a_n)$, we show that the complexity of the problem drops to $\Pi^P_2$. We also investigate the related problem of approximating a UC2RPQ by queries of small tree-width. We exhibit an algorithm which, for any fixed number $k$, builds the maximal under-approximation of tree-width $k$ of a UC2RPQ. The maximal under-approximation of tree-width $k$ of a query $q$ is a query $q'$ of tree-width $k$ which is contained in $q$ in a maximal and unique way, that is, such that for every query $q''$ of tree-width $k$, if $q''$ is contained in $q$ then $q''$ is also contained in $q'$. Our approach is shown to be robust, in the sense that it also allows testing equivalence with queries of a given path-width, it covers the previously known result for $k=1$, and it allows testing whether a (one-way) UCRPQ is equivalent to a UCRPQ of a given tree-width (or path-width).
arXiv:2501.08550v2 Announce Type: replace Abstract: Modern blockchains increasingly consist of multiple clients that implement a single blockchain protocol. If there is a semantic mismatch between the protocol implementations, the blockchain can permanently split and introduce new attack vectors. Current ad-hoc test suites for client implementations are not sufficient to ensure a high degree of protocol conformance. As an alternative, we present a framework that performs protocol conformance testing using a formal model of the protocol and an implementation running inside a deterministic blockchain simulator. Our framework consists of two complementary workflows that use the components as trace generators and checkers. Our insight is that both workflows are needed to detect all types of violations. We have applied and demonstrated the utility of our framework on an industrial-strength consensus protocol.
arXiv:2501.10889v1 Announce Type: new Abstract: Deductive verification has become a mature paradigm for the verification of industrial software. Applying deductive verification, however, requires that every function in the code base is annotated with a function contract specifying its behaviour. This introduces a large overhead of manual work. To address this challenge, we introduce the AutoDeduct toolchain, built on top of the Frama-C framework. It implements a combination of techniques to automatically infer contracts for functions in C programs, in the syntax of ACSL, the specification language of Frama-C. Contract inference in AutoDeduct is implemented as two plugins for Frama-C, each inferring different types of annotations. We assume that programs have an entry-point function already equipped with a contract, which is used in conjunction with the program source code to infer contracts for the helper functions, so that the entry-point contract can be verified. The current release of AutoDeduct is the first public prototype, which we evaluate on an example adapted from industrial software.
arXiv:2501.11641v1 Announce Type: new Abstract: We introduce and study UCPDL+, a family of expressive logics rooted in Propositional Dynamic Logic (PDL) with converse (CPDL) and universal modality (UCPDL). In terms of expressive power, UCPDL+ strictly contains PDL extended with intersection and converse (a.k.a. ICPDL), as well as Conjunctive Queries (CQ), Conjunctive Regular Path Queries (CRPQ), and some known extensions thereof (Regular Queries and CQPDL). Further, it is equivalent to the extension of the unary-negation fragment of first-order logic (UNFO) with unary transitive closure, which we denote by UNFO*, which in turn strictly contains a previously studied extension of UNFO with regular expressions known as UNFO^reg. We investigate the expressive power, indistinguishability via bisimulations, satisfiability, and model checking for UCPDL+ and CPDL+. We argue that natural subclasses of CPDL+ can be defined in terms of the tree-width of the underlying graphs of the formulas. We show that the class of CPDL+ formulas of tree-width 2 is equivalent to ICPDL, and that it also coincides with CPDL+ formulas of tree-width 1. However, beyond tree-width 2, incrementing the tree-width strictly increases the expressive power. We characterize the expressive power for every class of fixed tree-width formulas in terms of a bisimulation game with pebbles. Based on this characterization, we show that CPDL+ has a tree-like model property. We prove that the satisfiability problem for UCPDL+ is decidable in 2ExpTime, coinciding with the complexity of ICPDL. As a consequence, the satisfiability problem for UNFO* is shown to be 2ExpTime-complete as well. We also exhibit classes for which satisfiability is reduced to ExpTime. Finally, we establish that the model checking problem for fixed tree-width formulas is in PTime, in contrast to the full class CPDL+.
arXiv:2501.10482v1 Announce Type: cross Abstract: Random fuzzy variables jointly model imprecision (via their ``fuzzy part'') and randomness. Statistical samples of such objects are widely used, and their direct, numerically effective generation is therefore necessary. Usually, these samples consist of triangular or trapezoidal fuzzy numbers. In this paper, we describe theoretical results and simulation algorithms for another family of fuzzy numbers -- LR fuzzy numbers with interval-valued cores. Starting from a simulation perspective on piecewise linear LR fuzzy numbers with interval-valued cores, we then consider their limiting behavior. This leads us to a numerically efficient algorithm for simulating a sample consisting of such fuzzy values.
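For intuition, a piecewise linear LR fuzzy number with an interval-valued core is a trapezoidal fuzzy number: a core interval [c1, c2] at membership 1 with linear sides dropping to 0 over left/right spreads. The sketch below simulates such a sample and evaluates the membership function; the distributions are illustrative choices, not the paper's algorithm.

```python
# Minimal sketch (NOT the paper's algorithm) of simulating piecewise
# linear LR fuzzy numbers with interval-valued cores, i.e. trapezoidal
# fuzzy numbers (c1, c2, l, r): core [c1, c2], left spread l, right
# spread r. The sampling distributions below are illustrative.

import random

def simulate_trapezoidal(n, rng=random.Random(0)):
    """Return n trapezoids as (c1, c2, l, r) tuples."""
    sample = []
    for _ in range(n):
        c1 = rng.gauss(0.0, 1.0)            # left end of the core
        c2 = c1 + rng.expovariate(1.0)      # core width > 0
        l = rng.expovariate(2.0)            # left spread > 0
        r = rng.expovariate(2.0)            # right spread > 0
        sample.append((c1, c2, l, r))
    return sample

def membership(x, c1, c2, l, r):
    """Piecewise linear membership function of the trapezoid."""
    if c1 <= x <= c2:
        return 1.0
    if c1 - l < x < c1:
        return (x - (c1 - l)) / l
    if c2 < x < c2 + r:
        return ((c2 + r) - x) / r
    return 0.0

trap = simulate_trapezoidal(5)[0]
print(trap, membership(trap[0], *trap))
```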
arXiv:2501.11620v1 Announce Type: cross Abstract: We define a naturality construction for the operations of weak {\omega}-categories, as a meta-operation in a dependent type theory. Our construction has a geometrical motivation as a local tensor product, and we realise it as a globular analogue of Reynolds parametricity. Our construction operates as a "power tool" to support construction of terms with geometrical structure, and we use it to define composition operations for cylinders and cones in {\omega}-categories. The machinery can generate terms of high complexity, and we have implemented our construction in a proof assistant, which verifies that the generated terms have the correct type. All our results can be exported to homotopy type theory, allowing the explicit computation of complex path type inhabitants.
arXiv:2501.11467v1 Announce Type: new Abstract: The possibility of errors in human-engineered formal verification software, such as model checkers, poses a serious threat to the purpose of these tools. An established approach to mitigate this problem are certificates -- lightweight, easy-to-check proofs of the verification results. In this paper, we develop novel certificates for model checking of Markov decision processes (MDPs) with quantitative reachability and expected reward properties. Our approach is conceptually simple and relies almost exclusively on elementary fixed point theory. Our certificates work for arbitrary finite MDPs and can be readily computed with little overhead using standard algorithms. We formalize the soundness of our certificates in Isabelle/HOL and provide a formally verified certificate checker. Moreover, we augment existing algorithms in the probabilistic model checker Storm with the ability to produce certificates and demonstrate practical applicability by conducting the first formal certification of the reference results in the Quantitative Verification Benchmark Set.
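The elementary fixed-point idea behind such certificates fits in a few lines: for maximal reachability probabilities, any vector that equals 1 on goal states and dominates one Bellman step elsewhere is an inductive upper bound on the true value, since the value is the least fixed point of the Bellman operator (Knaster-Tarski). The MDP encoding below is an assumption of this sketch, not Storm's certificate format.

```python
# Sketch of a certificate check for an UPPER bound on maximal
# reachability in a finite MDP. mdp[s][a] is a list of
# (probability, successor) pairs; this encoding is an illustrative
# assumption, not the format used by Storm.

def check_upper_certificate(mdp, goal, u, eps=1e-9):
    """True iff u = 1 on goal states and u dominates one Bellman step
    elsewhere, making u an inductive upper bound on max reachability."""
    for s, actions in mdp.items():
        if s in goal:
            if u[s] != 1.0:
                return False
            continue
        bellman = max(sum(p * u[t] for p, t in dist)
                      for dist in actions.values())
        if u[s] + eps < bellman:        # u must dominate the Bellman step
            return False
    return True

# Toy MDP: from s0, action 'a' reaches the goal s1 with probability 0.5
# and the sink s2 otherwise.
mdp = {"s0": {"a": [(0.5, "s1"), (0.5, "s2")]},
       "s1": {"done": [(1.0, "s1")]},
       "s2": {"stuck": [(1.0, "s2")]}}
print(check_upper_certificate(mdp, {"s1"}, {"s0": 0.5, "s1": 1.0, "s2": 0.0}))
print(check_upper_certificate(mdp, {"s1"}, {"s0": 0.4, "s1": 1.0, "s2": 0.0}))
```

The check is local and linear in the size of the MDP, which is what makes such certificates cheap to verify independently of how the model checker produced them.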
arXiv:2208.14739v5 Announce Type: replace Abstract: The class of Basic Feasible Functionals BFF is the second-order counterpart of the class of first-order functions computable in polynomial time. We present several implicit characterizations of BFF based on a typed programming language of terms. These terms may perform calls to non-recursive imperative procedures. The type discipline has two layers: the terms follow a standard simply-typed discipline and the procedures follow a standard tier-based type discipline. BFF consists exactly of the second-order functionals that are computed by typable and terminating programs. The completeness of this characterization surprisingly still holds in the absence of lambda-abstraction. Moreover, the termination requirement can be specified as a completeness-preserving instance, which can be decided in time quadratic in the size of the program. As typing is decidable in polynomial time, we obtain the first tractable (i.e., decidable in polynomial time), sound, complete, and implicit characterization of BFF, thus solving a problem that had been open for more than 20 years.
arXiv:2306.09138v4 Announce Type: replace Abstract: The need to manage inconsistency in Description Logic Knowledge Bases (KBs) has come to the fore with the growing importance of the Semantic Web, where information comes from different sources that constantly change their content and may contain contradictory descriptions when considered either alone or together. Classical reasoning algorithms do not handle inconsistent KBs, forcing the KB to be debugged in order to remove the inconsistency. In this paper, we exploit an existing probabilistic semantics called DISPONTE to overcome this problem and allow queries even over inconsistent KBs. We implemented our approach in the reasoners TRILL and BUNDLE and empirically tested the validity of our proposal. Moreover, we formally compare the presented approach to the repair semantics, one of the most established semantics for DL reasoning tasks.
arXiv:2401.12638v4 Announce Type: replace Abstract: Adhesive and quasiadhesive categories provide a general framework for the study of algebraic graph rewriting systems. In a quasiadhesive category any two regular subobjects have a join which is again a regular subobject. Vice versa, if regular monos are adhesive, then the existence of a regular join for any pair of regular subobjects entails quasiadhesivity. It is also known that (quasi)adhesive categories can be embedded in a Grothendieck topos via a functor preserving pullbacks and pushouts along (regular) monomorphisms. In this paper we extend these results to $\mathcal{M}, \mathcal{N}$-adhesive categories, a concept recently introduced to generalize the notion of (quasi)adhesivity. We introduce the notion of $\mathcal{N}$-adhesive morphism, which allows us to express $\mathcal{M}, \mathcal{N}$-adhesivity as a condition on the subobjects' posets. Moreover, $\mathcal{N}$-adhesive morphisms allow us to show how an $\mathcal{M},\mathcal{N}$-adhesive category can be embedded into a Grothendieck topos, preserving pullbacks and $\mathcal{M}, \mathcal{N}$-pushouts.
arXiv:2411.16348v2 Announce Type: replace Abstract: Formal verification techniques based on computer algebra have proven highly effective for circuit verification. The circuit, given as an and-inverter graph, is encoded as a set of polynomials that automatically generates a Gr\"obner basis with respect to a lexicographic term ordering. Correctness of the circuit can be derived by computing the polynomial remainder of the specification. However, the main obstacle is the monomial blow-up during the rewriting of the specification, which leads to the development of dedicated heuristics to overcome this issue. In this paper, we investigate an orthogonal approach and focus the computational effort on rewriting the Gr\"obner basis itself. Our goal is to ensure the basis contains linear polynomials that can be effectively used to rewrite the linearized specification. We first prove the soundness and completeness of this technique and then demonstrate its practical application. Our implementation of this method shows promising results on benchmarks related to multiplier verification.
arXiv:2406.04936v5 Announce Type: replace-cross Abstract: We explore a kind of first-order predicate logic with intended semantics in the reals. Compared to other approaches in the literature, we work predominantly in the multiplicative reals $[0,\infty]$, showing they support three generations of connectives, which we call non-linear, linear additive, and linear multiplicative. Means and harmonic means emerge as natural candidates for bounded existential and universal quantifiers, and in fact we see they behave as expected in relation to the other logical connectives. We explain this through the well-known observation that min/max and arithmetic mean/harmonic mean sit at opposite ends of a spectrum, that of p-means. We give syntax and semantics for this quantitative predicate logic, and as example applications, we show how softmax is the quantitative semantics of argmax, and R\'enyi entropy/Hill numbers are additive/multiplicative semantics of the same formula. Indeed, the additive reals also fit into the story by exploiting the Napierian duality $-\log \dashv 1/\exp$, which highlights a formal distinction between 'additive' and 'multiplicative' quantities. Finally, we describe two attempts at a categorical semantics via enriched hyperdoctrines. We discuss why hyperdoctrines are in fact probably inadequate for this kind of logic.
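The p-mean spectrum and the softmax-as-argmax reading mentioned in the abstract are easy to compute numerically. The sketch below is illustrative only; the logical semantics itself is in the paper.

```python
# Numerical illustration of the p-mean spectrum (min and max as the
# limits p -> -inf and p -> +inf, with harmonic and arithmetic means
# at p = -1 and p = 1) and of softmax as a smooth counterpart of argmax.

import math

def p_mean(xs, p):
    """Power mean M_p of positive reals; p=1 arithmetic, p=-1 harmonic."""
    return (sum(x ** p for x in xs) / len(xs)) ** (1 / p)

def softmax(xs, beta=1.0):
    """Softmax weights; as beta grows they concentrate on the argmax."""
    ws = [math.exp(beta * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws]

xs = [1.0, 2.0, 4.0]
print(p_mean(xs, 1))                      # arithmetic mean: 7/3
print(p_mean(xs, -1))                     # harmonic mean: 12/7
print(p_mean(xs, 50), p_mean(xs, -50))    # close to max = 4 and min = 1
print(softmax(xs, beta=20))               # weight concentrates on index 2
```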
arXiv:2501.10852v1 Announce Type: new Abstract: The formalisation of mathematics is starting to become routine, but the value of this technology to the work of mathematicians remains to be shown. There are few examples of using proof assistants to verify brand-new work. This paper reports the formalisation of a major new result (arXiv:2303.09521) about Ramsey numbers that was announced in 2023. One unexpected finding was the heavy need for computer algebra techniques.
arXiv:2501.10802v1 Announce Type: new Abstract: Authenticated data structures allow untrusted third parties to carry out operations which produce proofs that can be used to verify an operation's output. Such data structures are challenging to develop and implement correctly. This paper gives a formal proof of security and correctness for a library that generates authenticated versions of data structures automatically. The proof is based on a new relational separation logic for reasoning about programs that use collision-resistant cryptographic hash functions. This logic provides a basis for constructing two semantic models of a type system, which are used to justify how the library makes use of type abstraction to enforce security and correctness. Using these models, we also prove the correctness of several optimizations to the library and then show how optimized, hand-written implementations of authenticated data structures can be soundly linked with automatically generated code. All of the results in this paper have been mechanized in the Coq proof assistant using the Iris framework.
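A classic instance of the idea (not the paper's library, which is mechanized in Coq with Iris) is a Merkle tree: an untrusted party returns a value together with a proof of sibling hashes, and the verifier recomputes a trusted root using a collision-resistant hash. A minimal sketch, assuming a power-of-two number of leaves:

```python
# Merkle-tree membership proofs with SHA-256: a small, classic example
# of an authenticated data structure (illustrative; NOT the paper's
# library). Assumes the number of leaves is a power of two.

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(b"leaf:" + x) for x in leaves]
    while len(level) > 1:
        level = [h(b"node:" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def prove(leaves, i):
    """Sibling hashes from leaf i up to the root."""
    level, proof = [h(b"leaf:" + x) for x in leaves], []
    while len(level) > 1:
        proof.append(level[i ^ 1])            # sibling at this level
        level = [h(b"node:" + level[j] + level[j + 1])
                 for j in range(0, len(level), 2)]
        i //= 2
    return proof

def verify(root, leaf, i, proof):
    acc = h(b"leaf:" + leaf)
    for sib in proof:
        acc = h(b"node:" + (acc + sib if i % 2 == 0 else sib + acc))
        i //= 2
    return acc == root

leaves = [b"a", b"b", b"c", b"d"]
root = merkle_root(leaves)
print(verify(root, b"c", 2, prove(leaves, 2)))     # True
print(verify(root, b"x", 2, prove(leaves, 2)))     # False
```

The distinct "leaf:"/"node:" prefixes are a standard domain-separation measure so that a leaf hash cannot be confused with an internal node hash.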
arXiv:2501.11799v1 Announce Type: new Abstract: In a multi-agent system, one may choose to govern the behaviour of an agent by imposing norms, which act as guidelines for how agents should act either all of the time or in given situations. However, imposing multiple norms on one or more agents may result in situations where these norms conflict over how the agent should behave. In any system with normative conflicts (such as safe reinforcement models or systems which monitor safety protocols), one must decide which norms should be followed such that the most important and most relevant norms are maintained. We introduce a new method for resolving normative conflicts through argumentation and graph colouring which is compatible with a variety of normative conflict resolution policies. We prove that this method always creates an admissible set of arguments under argumentation semantics, meaning that it produces coherent outputs. We also introduce more robust variants of this method, each building upon their predecessor to create a superior output, and we include further mathematical proof of their coherence. Our most advanced variant uses the existing concept of curtailment, where one norm may supersede another without fully eliminating it. The methods we introduce are all compatible with various pre-existing policies for resolving normative conflicts. Empirical evaluations are also performed to compare our algorithms to each other and to others in existing literature.
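A simplified sketch of the graph-based flavour of this approach (not the paper's algorithm): when conflicts are symmetric, any maximal independent set of the conflict graph is conflict-free and in conflict with every excluded norm, i.e. a stable (hence admissible) extension of the corresponding argumentation framework, and a greedy order can encode a priority policy. The norms and priorities below are hypothetical.

```python
# Simplified sketch: resolve symmetric normative conflicts by greedily
# building a maximal independent set of the conflict graph in priority
# order. For symmetric attack relations, a maximal independent set is a
# stable (hence admissible) extension. Norm names and priorities are
# hypothetical examples, not taken from the paper.

def resolve(norms, conflicts, priority):
    """norms: iterable of norm names; conflicts: set of frozenset pairs;
    priority: norm -> number, higher = more important."""
    chosen = []
    for n in sorted(norms, key=priority, reverse=True):
        if all(frozenset((n, m)) not in conflicts for m in chosen):
            chosen.append(n)
    return chosen

norms = ["keep_distance", "reach_goal_fast", "save_energy", "stay_in_lane"]
conflicts = {frozenset(("keep_distance", "reach_goal_fast")),
             frozenset(("reach_goal_fast", "save_energy"))}
prio = {"keep_distance": 3, "stay_in_lane": 2,
        "save_energy": 1, "reach_goal_fast": 0}
print(resolve(norms, conflicts, prio.get))
```

The paper's more advanced variants go beyond this all-or-nothing picture, e.g. via curtailment, where a superseded norm is weakened rather than dropped.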
arXiv:2404.01670v3 Announce Type: replace-cross Abstract: In the product $L_1\times L_2$ of two Kripke complete consistent logics, local tabularity of $L_1$ and $L_2$ is necessary for local tabularity of $L_1\times L_2$. However, it is not sufficient: the product of two locally tabular logics may not be locally tabular. We provide extra semantic and axiomatic conditions that give criteria of local tabularity of the product of two locally tabular logics, and apply them to identify new families of locally tabular products. We show that the product of two locally tabular logics may lack the product finite model property. We give an axiomatic criterion of local tabularity for all extensions of $S4.1[2]\times S5$. Finally, we describe a new prelocally tabular extension of $S4\times S5$.