cs.LO


arXiv:2503.21906v1 Announce Type: new Abstract: Modern cyber-physical systems (CPS) can consist of various networked components and agents interacting and communicating with each other. In the context of spatially distributed CPS, these connections can be dynamically dependent on the spatial configuration of the various components and agents. In these settings, robust monitoring of the distributed components is vital to ensuring complex behaviors are achieved, and safety properties are maintained. To this end, we look at defining the automaton semantics for the Spatio-Temporal Reach and Escape Logic (STREL), a formal logic designed to express and monitor spatio-temporal requirements over mobile, spatially distributed CPS. Specifically, STREL reasons about spatio-temporal behavior over dynamic weighted graphs. While STREL is endowed with well-defined qualitative and quantitative semantics, in this paper, we propose a novel construction of (weighted) alternating finite automata from STREL specifications that efficiently encodes these semantics. Moreover, we demonstrate how this automaton semantics can be used to perform both offline and online monitoring for STREL specifications using a simulated drone swarm environment.

Anand Balakrishnan, Sheryl Paul, Simone Silvetti, Laura Nenzi, Jyotirmoy V. Deshmukh (3/31/2025)
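STREL's spatial "reach" operator can be illustrated with a minimal offline check on a single snapshot. The sketch below is a simplification for intuition only, not the paper's automaton construction: it fixes one static weighted graph (STREL reasons over dynamic graphs and adds temporal operators), and all names (`reach`, `graph`, `labels`) are invented for illustration.

```python
import heapq

def reach(graph, labels, src, phi1, phi2, d_max):
    """Simplified spatial 'reach': from src, can a node satisfying phi2
    be reached within total edge weight d_max, passing only through
    nodes satisfying phi1?  graph: node -> list of (neighbor, weight)."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        if phi2(labels[u]):
            return True  # destination found within the distance bound
        if not phi1(labels[u]):
            continue  # intermediate nodes must satisfy phi1
        for v, w in graph.get(u, []):
            nd = d + w
            if nd <= d_max and nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return False
```

For instance, in a three-node "drone" chain labelled ok - ok - goal with unit edge weights, the goal is reachable within distance 3 but not within distance 1.5.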

arXiv:2503.20849v2 Announce Type: replace Abstract: Logic programs, more specifically answer-set programs (ASP), can be annotated with probabilities on facts to express uncertainty. We address the problem of propagating weight annotations on facts (e.g., probabilities) of an ASP to its standard models, and from there to events (defined as sets of atoms) in a dataset over the program's domain. We propose a novel approach which is algebraic in the sense that it relies on an equivalence relation over the set of events. Uncertainty is then described as polynomial expressions over variables. We propagate the weight function in the space of models and events, rather than doing so within the syntax of the program. As evidence that our approach is sound, we show that certain facts behave as expected. Our approach allows us to investigate weight annotated programs and to determine how suitable a given one is for modeling a given dataset containing events.

Francisco Coelho, Bruno Dinis, Dietmar Seipel, Salvador Abreu (3/31/2025)
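As a toy illustration of propagating fact weights to models and then to events, the sketch below uses plain numeric probabilities with an independence assumption and splits the mass of each total choice uniformly among its stable models; the paper's actual approach is algebraic, working with polynomial expressions and an equivalence relation over events. `models_of` stands in for an ASP solver and is a hypothetical callback.

```python
from itertools import product

def model_weights(prob_facts, models_of):
    """Propagate probabilities on facts to models: each total choice of
    facts gets the product of p (fact included) or 1-p (fact excluded),
    and that mass is split uniformly among the choice's stable models."""
    facts = sorted(prob_facts)
    weights = {}
    for choice in product([True, False], repeat=len(facts)):
        w = 1.0
        chosen = set()
        for f, picked in zip(facts, choice):
            w *= prob_facts[f] if picked else 1.0 - prob_facts[f]
            if picked:
                chosen.add(f)
        models = models_of(chosen)  # stable models of the program + chosen facts
        for m in models:
            weights[m] = weights.get(m, 0.0) + w / len(models)
    return weights

def event_weight(weights, event):
    """Weight of an event (a set of atoms): mass of models containing it."""
    return sum(w for m, w in weights.items() if event <= set(m))
```

With a single fact a annotated 0.3 and a rule deriving b from a, the event {b} inherits exactly the weight 0.3.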

arXiv:2503.21852v1 Announce Type: new Abstract: The synthesis of reactive systems aims for the automated construction of strategies for systems that interact with their environment. Whereas the synthesis approach has the potential to change the development of reactive systems significantly by avoiding manual implementation, it still suffers from a lack of efficient synthesis algorithms for many application scenarios. The translation of the system specification into an automaton that allows for strategy construction (if a winning strategy exists) is nonelementary in the length of the specification for S1S and doubly exponential for LTL, raising the need for highly specialized algorithms. In this article, we present an approach to reducing this state-space explosion in the construction of this automaton by exploiting a monotonicity property of specifications. For this, we introduce window counting constraints that allow for step-wise refinement or abstraction of specifications. In an iterative synthesis procedure, these window counting constraints are used to construct automata representing over- or under-approximations (depending on the counting constraint) of constraint-compliant behavior. Analysis results on winning regions of previous iterations are used to reduce the size of the next automaton, leading to an overall reduction in the extent of the state-space explosion. We present implementation results of the iterated synthesis for a zero-sum game setting as a proof of concept. Furthermore, we discuss the current limitations of the approach in a zero-sum setting and sketch future work in non-zero-sum settings.

Linda Feeken, Martin Fränzle (3/31/2025)
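A window counting constraint of the kind introduced above can be checked over a finite trace with a sliding window. The reading below ("every window of length w contains at least k occurrences of an event") is an assumed concretization for illustration, not the paper's formal definition; note the monotonicity the iterative procedure exploits: increasing k only strengthens the constraint.

```python
def window_count_holds(trace, event, w, k):
    """Check that every length-w window of the trace contains
    at least k occurrences of `event` (sliding-window count)."""
    if len(trace) < w:
        return True  # no complete window exists that could be violated
    count = sum(1 for x in trace[:w] if x == event)
    if count < k:
        return False
    for i in range(w, len(trace)):
        # slide the window: add trace[i], drop trace[i - w]
        count += (trace[i] == event) - (trace[i - w] == event)
        if count < k:
            return False
    return True
```

For example, "abab" satisfies "at least one a in every window of length 2", while "abb" does not (the window "bb" has none).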

arXiv:2503.22558v1 Announce Type: new Abstract: The goal of this paper is to provide exact and terminating algorithms for the formal analysis of deterministic continuous-time control systems with affine input and polynomial state dynamics (in short, polynomial systems). We consider the following semantic properties: zeroness and equivalence, input independence, linearity, and analyticity. Our approach is based on Chen-Fliess series, which provide a unique representation of the dynamics of such systems via their formal generating series. Our starting point is Fliess' seminal work showing how the semantic properties above are mirrored by corresponding combinatorial properties on generating series. Next, we observe that the generating series of polynomial systems coincide with the class of shuffle-finite series, a nonlinear generalisation of Schützenberger's rational series which has recently been studied in the context of automata theory and enumerative combinatorics. We exploit and extend recent results in the algorithmic analysis of shuffle-finite series (such as zeroness, equivalence, and commutativity) to show that the semantic properties above can be decided exactly and in finite time for polynomial systems. Some of our analyses rely on a novel technical contribution, namely that shuffle-finite series are closed under support restrictions with commutative regular languages, a result of independent interest.

Lorenzo Clemente (3/31/2025)
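The shuffle product underlying shuffle-finite series is easy to state on words: it enumerates all interleavings of two words, with multiplicity. A naive recursive sketch (illustrative only; generating series manipulate formal sums of such words):

```python
def shuffle(u, v):
    """All interleavings (with multiplicity) of words u and v:
    the shuffle product from the theory of rational series."""
    if not u:
        return [v]
    if not v:
        return [u]
    # Either the first letter of u or the first letter of v comes first.
    return [u[0] + w for w in shuffle(u[1:], v)] + \
           [v[0] + w for w in shuffle(u, v[1:])]
```

The shuffle of words of lengths m and n always has binomial(m+n, m) terms, e.g. 6 interleavings for "ab" and "cd".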

arXiv:2503.22042v1 Announce Type: cross Abstract: Classical set theory constructs the continuum via the power set P(N), thereby postulating an uncountable totality. However, constructive and computability-based approaches reveal that no formal system with countable syntax can generate all subsets of N, nor can it capture the real line in full. In this paper, we propose fractal countability as a constructive alternative to the power set. Rather than treating countability as an absolute cardinal notion, we redefine it as a stratified, process-relative closure over definable subsets, generated by a sequence of conservative extensions to a base formal system. This yields a structured, internally growing hierarchy of constructive definability that remains within the countable realm but approximates the expressive richness of the continuum. We compare fractally countable sets to classical countability and the hyperarithmetical hierarchy, and interpret the continuum not as a completed object, but as a layered definitional horizon. This framework provides a constructive reinterpretation of power set-like operations without invoking non-effective principles.

Stanislav Semenov (3/31/2025)

arXiv:2503.16891v2 Announce Type: replace Abstract: We consider the problem of the verification of an LTL specification $\varphi$ on a system $S$ given some prior knowledge $K$, an LTL formula that $S$ is known to satisfy. The automata-theoretic approach to LTL model checking is implemented as an emptiness check of the product $S\otimes A_{\lnot\varphi}$ where $A_{\lnot\varphi}$ is an automaton for the negation of the property. We propose new operations that simplify an automaton $A_{\lnot\varphi}$ \emph{given} some knowledge automaton $A_K$, to produce an automaton $B$ that can be used instead of $A_{\lnot\varphi}$ for more efficient model checking. Our evaluation of these operations on a large benchmark derived from the MCC'22 competition shows that even with simple knowledge, half of the problems can be definitely answered without running an LTL model checker, and the remaining problems can be simplified significantly.

Alexandre Duret-Lutz (LRE), Denis Poitrenaud (UPCité, MoVe), Yann Thierry-Mieg (MoVe) (3/31/2025)

arXiv:2503.22492v1 Announce Type: cross Abstract: Recently, arXiv:2312.16035 showed that all logics based on Boolean normal monotonic three-valued schemes coincide with classical logic when defined using a strict-tolerant standard ($\mathbf{st}$). Conversely, they proved that under a tolerant-strict standard ($\mathbf{ts}$), the resulting logics are all empty. Building on these results, we show that classical logic can be obtained by closing under transitivity the union of two logics defined over (potentially different) Boolean normal monotonic schemes, using a strict-strict standard ($\mathbf{ss}$) for one and a tolerant-tolerant standard ($\mathbf{tt}$) for the other, with the first of these logics being paracomplete and the other being paraconsistent. We then identify a notion dual to transitivity that allows us to characterize the logic $\mathsf{TS}$ as the dual transitive closure of the intersection of any two logics defined over (potentially different) Boolean normal monotonic schemes, using an $\mathbf{ss}$ standard for one and a $\mathbf{tt}$ standard for the other. Finally, we expand on the abstract relations between the transitive closure and dual transitive closure operations, showing that they give rise to lattice operations that precisely capture how the logics discussed relate to one another.

Quentin Blomet, Bruno Da Ré (3/31/2025)
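The strict and tolerant standards can be made concrete with a brute-force validity check over the strong Kleene scheme, one example of a Boolean normal monotonic scheme. The formula encoding and function names below are invented for illustration; st-validity demands that whenever all premises are strictly true (value 1), the conclusion is at least tolerantly true (value >= 1/2).

```python
from itertools import product

T, N, F = 1.0, 0.5, 0.0  # the three strong Kleene truth values

def sk_eval(formula, val):
    """Evaluate a formula over the strong Kleene scheme.
    Formulas: atom name (str), ('not', f), ('and', f, g), ('or', f, g)."""
    if isinstance(formula, str):
        return val[formula]
    op = formula[0]
    if op == "not":
        return 1.0 - sk_eval(formula[1], val)
    if op == "and":
        return min(sk_eval(formula[1], val), sk_eval(formula[2], val))
    if op == "or":
        return max(sk_eval(formula[1], val), sk_eval(formula[2], val))
    raise ValueError(op)

def st_valid(premises, conclusion, atoms):
    """st-validity: strictly true premises force an at least
    tolerantly true conclusion, in every three-valued valuation."""
    for vs in product([T, N, F], repeat=len(atoms)):
        val = dict(zip(atoms, vs))
        if all(sk_eval(p, val) == T for p in premises):
            if sk_eval(conclusion, val) < N:
                return False
    return True
```

Under st, excluded middle and disjunctive syllogism come out valid, recovering classical logic, while an atom on its own is of course not a validity.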

arXiv:2306.15516v5 Announce Type: replace Abstract: We introduce a general abstract framework for database repairs, where the repair notions are defined using formal logic. We distinguish between integrity constraints and so-called query constraints. The former are used to model consistency and desirable properties of the data (such as functional dependencies and independencies), while the latter relate two database instances according to their answers to the query constraints. The framework allows for a distinction between hard and soft queries, allowing the answers to a core set of queries to be preserved, as well as defining a distance between instances based on query answers. We illustrate how different notions of repairs from the literature can be modelled within our unifying framework. The framework generalises both set-based and cardinality-based repairs to semiring-annotated databases. Furthermore, we initiate a complexity-theoretic analysis of consistent query answering and checking existence of a repair within the framework.

Nicolas Fr\"ohlich, Arne Meier, Nina Pardal, Jonni Virtema3/31/2025

arXiv:2301.13735v2 Announce Type: replace Abstract: A class of graphs $\mathscr{C}$ is monadically stable if for any unary expansion $\widehat{\mathscr{C}}$ of $\mathscr{C}$, one cannot interpret, in first-order logic, arbitrarily long linear orders in graphs from $\widehat{\mathscr{C}}$. It is known that nowhere dense graph classes are monadically stable; these encompass most of the studied concepts of sparsity in graphs, including graph classes that exclude a fixed topological minor. On the other hand, monadic stability is a property expressed in purely model-theoretic terms and hence it is also suited for capturing structure in dense graphs. For several years, it has been suspected that one can create a structure theory for monadically stable graph classes that mirrors the theory of nowhere dense graph classes in the dense setting. In this work we provide a step in this direction by giving a characterization of monadic stability through the Flipper game: a game on a graph played by Flipper, who in each round can complement the edge relation between any pair of vertex subsets, and Connector, who in each round localizes the game to a ball of bounded radius. This is an analog of the Splitter game, which characterizes nowhere dense classes of graphs (Grohe, Kreutzer, and Siebertz, J.ACM'17). We give two different proofs of our main result. The first proof uses tools from model theory, and it exposes an additional property of monadically stable graph classes that is close in spirit to definability of types. Also, as a byproduct, we give an alternative proof of the recent result of Braunfeld and Laskowski (arXiv 2209.05120) that monadic stability for graph classes coincides with existential monadic stability. The second proof relies on the recently introduced notion of flip-wideness (Dreier, Mählmann, Siebertz, and Toruńczyk, ICALP 2023) and provides an efficient algorithm to compute Flipper's moves in a winning strategy.

Jakub Gajarský, Nikolas Mählmann, Rose McCarty, Pierre Ohlmann, Michał Pilipczuk, Wojciech Przybyszewski, Sebastian Siebertz, Marek Sokołowski, Szymon Toruńczyk (3/14/2025)
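A single Flipper move, complementing the edge relation between two vertex subsets, is a one-liner once edges are stored as unordered pairs; this minimal sketch is for intuition only:

```python
def flip(edges, A, B):
    """One Flipper move: complement the edge relation between vertex
    sets A and B.  Edges are frozenset pairs, so each unordered pair
    with one endpoint in A and the other in B is toggled exactly once."""
    toggle = {frozenset((a, b)) for a in A for b in B if a != b}
    return edges ^ toggle  # symmetric difference flips exactly those pairs
```

Note that a flip is an involution: applying the same move twice restores the original graph.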

arXiv:2503.10231v1 Announce Type: new Abstract: In this article, we present a novel method for assessing the similarity of information within knowledge bases from a logical point of view. The proposal introduces the concept of a similarity property space $\Xi_P$ for each knowledge base $K$, offering a nuanced approach to understanding and quantifying similarity. By defining the similarity knowledge space $\Xi_K$ through its properties and incorporating similarity source information, the framework reinforces the idea that similarity is deeply rooted in the characteristics of the knowledge being compared. The inclusion of super-categories within $\Xi_K$ allows for a hierarchical organization of knowledge, facilitating more sophisticated analysis and comparison. On the one hand, this provides a structured framework for organizing and understanding similarity, which can be particularly useful in complex domains. On the other hand, the finite nature of these categories might be restrictive in certain contexts, especially when dealing with evolving or highly nuanced forms of knowledge. Future research and applications of this framework focus on addressing its potential limitations, particularly in handling dynamic and highly specialized knowledge domains.

José-Luis Vilchis-Medina (ENSTA Bretagne, Lab-STICC, Lab-STICC_ROBEX) (3/14/2025)

arXiv:2403.02170v3 Announce Type: replace Abstract: The verification of Multi-Agent Systems (MAS) poses a significant challenge. Various approaches and methodologies exist to address this challenge; however, tools that support them are not always readily available. Even when such tools are accessible, they tend to be hard-coded, lacking in compositionality, and challenging to use due to a steep learning curve. In this paper, we introduce a methodology designed for the formal verification of MAS in a modular and versatile manner, along with an initial prototype that we named VITAMIN. Unlike existing verification methodologies and frameworks for MAS, VITAMIN is constructed for easy extension to accommodate various logics (for specifying the properties to verify) and models (on which to verify such properties).

Angelo Ferrando, Vadim Malvone (3/14/2025)

arXiv:2503.09831v1 Announce Type: new Abstract: It is well-known that intersection type assignment systems can be used to characterize strong normalization (SN). Typical proofs that typable lambda-terms are SN in these systems rely on semantical techniques. In this work, we study $\Lambda_\cap^e$, a variant of Coppo and Dezani's (Curry-style) intersection type system, and we propose a syntactical proof of strong normalization for it. We first design $\Lambda_\cap^i$, a Church-style version, in which terms closely correspond to typing derivations. Then we prove that typability in $\Lambda_\cap^i$ implies SN through a measure that, given a term, produces a natural number that strictly decreases under reduction. Finally, the result is extended to $\Lambda_\cap^e$, since the two systems simulate each other.

Pablo Barenbaum, Simona Ronchi Della Rocca, Cristian Sottile (3/14/2025)

arXiv:2503.10353v1 Announce Type: new Abstract: The so-called algebraic approach to the constraint satisfaction problem (CSP) has been a prevalent method in the study of the complexity of these problems since the early 2000s. The core of this approach is the notion of polymorphisms, which determine the complexity of the problem (up to log-space reductions). In the past few years, a new, more general version of the CSP emerged, the promise constraint satisfaction problem (PCSP), and the notion of polymorphisms and most of the core theses of the algebraic approach were generalised to the promise setting. Nevertheless, recent work also suggests that insights from other fields are immensely useful in the study of PCSPs, including algebraic topology. In this paper, we provide an entry point for category-theorists into the study of complexity of CSPs and PCSPs. We show that many standard CSP notions have clear and well-known categorical counterparts. For example, the algebraic structure of polymorphisms can be described as a set-functor defined as a right Kan extension. We provide purely categorical proofs of core results of the algebraic approach, including a proof that the complexity only depends on the polymorphisms. Our new proofs are substantially shorter and, from the categorical perspective, cleaner than previous proofs of the same results. Moreover, as expected, they are applicable more widely. We believe that, in particular in the case of PCSPs, category theory brings insights that can help solve some of the current challenges of the field.

Maximilian Hadek, Tomáš Jakl, Jakub Opršal (3/14/2025)
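The notion of a polymorphism at the core of the algebraic approach is simple to check by brute force: an operation preserves a relation if applying it coordinatewise to tuples of the relation never leaves the relation. The sketch below (names invented for illustration) tests the ternary majority operation on the Boolean order relation:

```python
from itertools import product

def is_polymorphism(op, arity, relation):
    """Check that the `arity`-ary operation `op` preserves `relation`:
    applying op coordinatewise to any `arity` tuples of the relation
    must yield a tuple that is again in the relation."""
    rel = set(relation)
    for rows in product(rel, repeat=arity):
        image = tuple(op(*col) for col in zip(*rows))
        if image not in rel:
            return False
    return True

def majority(x, y, z):
    """Ternary majority on {0, 1}: returns the value that appears twice."""
    return x if x == y or x == z else y
```

Majority preserves the order relation {(0,0), (0,1), (1,1)} (it is monotone), whereas the ternary XOR does not.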

arXiv:2211.14913v2 Announce Type: replace Abstract: Linear Temporal Logic (LTL) is the de-facto standard temporal logic for system specification, whose foundational properties have been studied for over five decades. Safety and cosafety properties define notable fragments of LTL, where a prefix of a trace suffices to establish whether a formula is true or not over that trace. In this paper, we study the complexity of the problems of satisfiability, validity, and realizability over infinite and finite traces for the safety and cosafety fragments of LTL. As for satisfiability and validity over infinite traces, we prove that the majority of the fragments have the same complexity as full LTL, that is, they are PSPACE-complete. The picture is radically different for realizability: we find fragments with the same expressive power whose complexity varies from 2EXPTIME-complete (as full LTL) to EXPTIME-complete. Notably, for all cosafety fragments, the complexity of the three problems does not change passing from infinite to finite traces, while for all safety fragments the complexity of satisfiability (resp., realizability) over finite traces drops to NP-complete (resp., ${\Pi}^P_2$-complete).

Alessandro Artale, Luca Geatti, Nicola Gigante, Andrea Mazzullo, Angelo Montanari (3/14/2025)

arXiv:2503.09730v1 Announce Type: new Abstract: The most promising recent methods for AI reasoning require applying variants of reinforcement learning (RL) either to rolled-out trajectories from the model, even for step-wise rewards, or to large quantities of human-annotated trajectory data. The reliance on rolled-out trajectories renders the compute cost and time prohibitively high. In particular, the correctness of a reasoning trajectory can typically only be judged at its completion, leading to sparse rewards in RL or requiring expensive synthetic data generation in expert-iteration-like methods. In this work, we focus on the Automatic Theorem Proving (ATP) task and propose a novel verifier-in-the-loop design which, unlike existing approaches that leverage feedback on the entire reasoning trajectory, employs an automated verifier to give intermediate feedback at each step of the reasoning process. Using Lean as the verifier, we empirically show that step-by-step local verification produces a global improvement in the model's reasoning accuracy and efficiency.

Sara Rajaee, Kumar Pratik, Gabriele Cesa, Arash Behboodi (3/14/2025)

arXiv:2407.14105v2 Announce Type: replace Abstract: This paper studies the recursion-theoretic aspects of large-scale geometries of infinite strings, a subject initiated by Khoussainov and Takisaka (2017). We investigate several notions of quasi-isometric reductions between recursive infinite strings and prove various results on the equivalence classes of such reductions. The main result is the construction of two infinite recursive strings $\alpha$ and $\beta$ such that $\alpha$ is strictly quasi-isometrically reducible to $\beta$, but the reduction cannot be made recursive. This answers an open problem posed by Khoussainov and Takisaka.

Karen Frilya Celine, Ziyuan Gao, Sanjay Jain, Ryan Lou, Frank Stephan, Guohua Wu (3/10/2025)

arXiv:2304.06348v4 Announce Type: replace Abstract: We propose a generic framework for establishing the decidability of a wide range of logical entailment problems (briefly called querying), based on the existence of countermodels that are structurally simple, gauged by certain types of width measures (with treewidth and cliquewidth as popular examples). As an important special case of our framework, we identify logics exhibiting width-finite finitely universal model sets, warranting decidable entailment for a wide range of homomorphism-closed queries, subsuming a diverse set of practically relevant query languages. As a particularly powerful width measure, we propose to employ Blumensath's partitionwidth, which subsumes various other commonly considered width measures and exhibits highly favorable computational and structural properties. Focusing on the formalism of existential rules as a popular showcase, we explain how finite partitionwidth sets of rules subsume other known abstract decidable classes but - leveraging existing notions of stratification - also cover a wide range of new rulesets. We expose natural limitations for fitting the class of finite unification sets into our picture and suggest several options for remedy.

Thomas Feller, Tim S. Lyon, Piotr Ostropolski-Nalewaja, Sebastian Rudolph (3/10/2025)

arXiv:2404.12229v2 Announce Type: replace Abstract: In this paper we revisit the problem of computing the closure of a set of attributes given a basis of dependencies or implications. This problem is of main interest in logics, in the relational database model, in lattice theory, and in Formal Concept Analysis as well. A basis of dependencies may have different characteristics, among which being ``minimal'', e.g., the Duquenne-Guigues Basis, or being ``direct'', e.g., the Canonical Basis and the D-basis. Here we propose an extensive experimental study of the impacts of minimality and directness on the closure algorithms. The results of the experiments performed on real and synthetic datasets are analyzed in depth, and suggest a different and fresh look at computing the closure of a set of attributes w.r.t. a basis of dependencies. This paper has been submitted to the International Journal of Approximate Reasoning.

Jaume Baixeries, Amedeo Napoli (3/10/2025)
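The computation being benchmarked here, the closure of an attribute set with respect to a basis of implications, is the classic fixpoint algorithm. The naive quadratic version is sketched below; direct bases such as the D-basis, and algorithms such as LinClosure, aim to avoid its repeated scans.

```python
def closure(attrs, implications):
    """Closure of the attribute set `attrs` under a basis of
    implications, given as (premise, conclusion) pairs of sets.
    Repeatedly fire every implication whose premise is contained
    in the current set, until nothing new is added."""
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if set(premise) <= closed and not set(conclusion) <= closed:
                closed |= set(conclusion)
                changed = True
    return closed
```

With the basis a -> b, b -> c, d -> e, the closure of {a} is {a, b, c}: the third implication never fires.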

arXiv:2503.05355v1 Announce Type: new Abstract: Logic programming (LP) is typically understood through operational semantics (e.g., SLD-resolution) or model-theoretic interpretations (e.g., the least Herbrand model). This paper introduces a novel perspective on LP by defining a ``support'' relation that explicates what a program ``knows''. This interpretation is shown to express classical and intuitionistic logic, as well as an intermediate logic, depending on certain choices regarding LP and the meanings of disjunction and negation. These results are formalized using the idea of base-extension semantics within proof-theoretic semantics. Our approach offers new insights into the logical foundations of LP and has potential applications in knowledge representation, automated reasoning, and formal verification.

Alexander V. Gheorghiu (3/10/2025)
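For contrast with the support relation proposed here, the standard model-theoretic reading of a definite program, its least Herbrand model, can be computed by iterating the immediate-consequence operator T_P to a fixpoint. A minimal propositional sketch (names invented for illustration):

```python
def least_model(program):
    """Least Herbrand model of a definite propositional program via the
    immediate-consequence operator T_P iterated to a fixpoint.
    `program` is a list of (head, body) pairs, body a list of atoms."""
    model = set()
    while True:
        # T_P(model): heads of all rules whose bodies hold in `model`
        derived = {head for head, body in program
                   if all(b in model for b in body)}
        if derived <= model:
            return model  # fixpoint reached
        model |= derived
```

For the program p. q :- p. r :- q, p. s :- t., the least model is {p, q, r}; s is never derivable because t has no rule.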

arXiv:2503.04782v1 Announce Type: new Abstract: Satisfiability Modulo Linear Integer Arithmetic, SMT(LIA) for short, is pivotal across various critical domains. Previous research has primarily focused on SMT solving techniques. However, in practical applications such as software and hardware testing, there is a need to generate a diverse set of solutions for use as test inputs. We have developed the first sampling framework that integrates local search with CDCL(T) techniques, named HighDiv, capable of generating a highly diverse set of solutions for constraints under linear integer theory. Initially, in the local search phase, we introduced a novel operator called boundary-aware movement. This operator performs random moves by considering the current state's constraints on variables, thereby enhancing the diversity of variables during the search process. Furthermore, we have conducted an in-depth study of the preprocessing and variable initialization mechanisms within the framework, which significantly enhances the efficiency of subsequent local searches. Lastly, we use the solutions obtained from local search sampling as additional constraints to further explore the solution space using the stochastic CDCL(T) method. Experimental results demonstrate that HighDiv generates solutions with greater diversity compared to the state-of-the-art SMT(LIA) sampling tool, MeGASampler.

Yong Lai, Junjie Li, Chuan Luo (3/10/2025)
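The boundary-aware movement idea can be approximated as follows: before moving a variable, compute the interval of values that the linear constraints still allow (with all other variables fixed) and sample inside it, occasionally jumping to a boundary to diversify the samples. This is a guess at the mechanism from the abstract alone; the interval arithmetic, the 0.3 boundary bias, and all names are invented assumptions, not HighDiv's actual implementation.

```python
import random

def feasible_interval(var, assign, constraints):
    """Interval of values v for `var` keeping every constraint
    sum(coef * x) <= bound satisfied, the other variables being
    fixed by `assign`.  constraints: list of (coefs dict, bound)."""
    lo, hi = -10**9, 10**9
    for coefs, bound in constraints:
        c = coefs.get(var, 0)
        rest = sum(a * assign[x] for x, a in coefs.items() if x != var)
        if c > 0:
            hi = min(hi, (bound - rest) // c)      # v <= (bound - rest) / c
        elif c < 0:
            lo = max(lo, -((bound - rest) // -c))  # v >= ceil((bound - rest) / c)
    return lo, hi

def boundary_aware_move(var, assign, constraints, rng=random):
    """Move `var` to a random feasible value, sometimes to a boundary."""
    lo, hi = feasible_interval(var, assign, constraints)
    if lo > hi:
        return assign  # no feasible value; leave the assignment unchanged
    new = dict(assign)
    new[var] = rng.choice((lo, hi)) if rng.random() < 0.3 else rng.randint(lo, hi)
    return new
```

With constraints x + y <= 10 and -x <= 0, and y fixed at 4, the feasible interval for x is [0, 6], and every sampled move stays inside it.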