math.LO
12 posts

arXiv:2501.00451v1 Announce Type: new Abstract: We demonstrate that techniques of Weihrauch complexity can be used to obtain easy and elegant proofs of known and new results on initial value problems. Our main result is that solving continuous initial value problems is Weihrauch equivalent to weak König's lemma, even if only solutions with maximal domains of existence are considered. This result simultaneously generalizes negative and positive results by Aberth and by Collins and Graça, respectively. It can also be seen as a uniform version of a theorem of Simpson. Beyond known techniques, the proof exploits the fact that weak König's lemma is closed under infinite loops. One corollary of our main result is that solutions with maximal domains of existence of continuous initial value problems can be computed non-deterministically, and that for computable instances there are always solutions that are low as points in the function space. Another corollary is that when there is a fixed finite number of solutions, these solutions are all computable for computable instances and can be found uniformly by a finite mind-change computation.
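A classical textbook example (my own illustration, not taken from the paper) of why continuous initial value problems need not have unique solutions: the IVP y'(t) = 3·y(t)^(2/3), y(0) = 0 is continuous but not Lipschitz at 0, and admits both y = 0 and y = t³ as solutions. The sketch below checks both candidates numerically.

```python
# Hypothetical illustration (not from the paper): a continuous IVP with
# multiple solutions, the phenomenon behind the non-determinism results.
#   y'(t) = 3 * y(t)**(2/3),  y(0) = 0
# is solved by y(t) = 0 and by y(t) = t**3.

def rhs(y):
    # right-hand side f(y) = 3 * y^(2/3): continuous, but not Lipschitz at 0
    return 3.0 * abs(y) ** (2.0 / 3.0)

def residual(sol, ts, h=1e-6):
    # max deviation |y'(t) - f(y(t))| over sample points, y' via central differences
    return max(abs((sol(t + h) - sol(t - h)) / (2 * h) - rhs(sol(t))) for t in ts)

ts = [0.1 * k for k in range(1, 11)]
trivial = lambda t: 0.0
cubic = lambda t: t ** 3

print(residual(trivial, ts) < 1e-3)  # True: y = 0 solves the IVP
print(residual(cubic, ts) < 1e-3)    # True: y = t^3 solves it too
```

Both residuals vanish up to finite-difference error, so neither candidate can be singled out by the equation alone.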
arXiv:2501.00496v1 Announce Type: cross Abstract: This work studies the proof theory of left (right) skew monoidal closed categories and skew monoidal bi-closed categories from the perspective of non-associative Lambek calculus. Skew monoidal closed categories represent a relaxed version of monoidal closed categories, where the structural laws are not invertible; instead, they are natural transformations with a specific orientation. Uustalu et al. used sequents with stoup (the leftmost position of an antecedent that can be either empty or a single formula) to deductively model left skew monoidal closed categories, yielding results regarding proof identities and categorical coherence. However, their syntax does not work well when modeling right skew monoidal closed and skew monoidal bi-closed categories. We solve the problem by constructing cut-free sequent calculi for left skew monoidal closed and skew monoidal bi-closed categories, reminiscent of non-associative Lambek calculus, with trees as antecedents. Each calculus is respectively equivalent to the sequent calculus with stoup (for left skew monoidal categories) and the axiomatic calculus (for skew monoidal bi-closed categories). Moreover, we prove that the latter calculus is sound and complete with respect to its relational models. We also prove a correspondence between frame conditions and structural laws, providing an algebraic way to understand the relationship between the left and right skew monoidal (closed) categories.
arXiv:2405.13398v3 Announce Type: replace-cross Abstract: Epistemic logic is known as a logic that captures the knowledge and beliefs of agents, and it has undergone various developments since Hintikka (1962). In this paper, we propose a new logic, called agent-knowledge logic, obtained by taking the product of individual knowledge structures and the set of relationships among agents. This logic builds on the Facebook logic proposed by Seligman et al. (2011) and the Logic of Hide and Seek Game proposed by Li et al. (2021). We show two main results: first, this logic can embed standard epistemic logic; second, it admits a tableau-calculus proof system that terminates in finite time. We also discuss various sentences and inferences that this logic can express.
arXiv:2412.07592v2 Announce Type: replace-cross Abstract: We study the complexity of deterministic and probabilistic inversions of partial computable functions on the reals.
arXiv:2401.01096v2 Announce Type: replace-cross Abstract: We explore the theory of illfounded and cyclic proofs for the propositional {modal $\mu$-calculus}. A fine analysis of {provability} for classical and intuitionistic modal logic provides a novel bridge between finitary, cyclic and illfounded conceptions of proof and re-enforces the importance of two normal form theorems for the logic: guardedness and disjunctiveness.
arXiv:2209.11229v3 Announce Type: replace Abstract: The notions of bounded-size and quasibounded-size decompositions with bounded treedepth base classes are central to the structural theory of graph sparsity introduced by two of the authors years ago, and provide a characterization of both classes with bounded expansions and nowhere dense classes. Strong connections of this theory with model theory led to considering first-order transductions, which are logically defined graph transformations, and to initiate a comparative study of combinatorial and model theoretical properties of graph classes, with an emphasis on the model theoretical notions of dependence (or NIP) and stability. In this paper, we first prove that every hereditary class with quasibounded-size decompositions with dependent (resp.\ stable) base classes is itself dependent (resp.\ stable). This result is obtained in a more general study of ``decomposition horizons'', which are class properties compatible with quasibounded-size decompositions. We deduce that hereditary classes with quasibounded-size decompositions with bounded shrubdepth base classes are stable. In the second part of the paper, we prove the converse. Thus, we characterize stable hereditary classes of graphs as those hereditary classes that admit quasibounded-size decompositions with bounded shrubdepth base classes. This result is obtained by proving that every hereditary stable class of graphs admits almost nowhere dense quasi-bush representations, thus answering positively a conjecture of Dreier et al. These results have several consequences. For example, we show that every graph $G$ in a stable, hereditary class of graphs $\mathscr C$ has a clique or a stable set of size $\Omega_{\mathscr C,\epsilon}(|G|^{1/2-\epsilon})$, for every $\epsilon>0$, which is tight in the sense that it cannot be improved to $\Omega_{\mathscr C}(|G|^{1/2})$.
arXiv:2311.01184v2 Announce Type: replace Abstract: We investigate the correspondence between the time and space recognition complexity of languages. For this purpose, we code the long-continued computations of deterministic two-tape Turing machines by relatively short quantified Boolean formulae. A modified version of the Meyer-Stockmeyer method is used extensively in this simulation. Using this modeling, it is proved that the complexity classes Deterministic Exponential Time and Deterministic Polynomial Space coincide. It is also proved that any language recognized in polynomial time can be recognized in almost logarithmic space. Furthermore, this allows us to slightly improve the previously established lower complexity bound for decidable theories that are nontrivial relative to some equivalence relation (this relation may be equality) -- each such theory is consistent with the formula asserting that there exist two non-equivalent elements. Keywords: computational complexity, the coding of computations through formulae, exponential time, polynomial space, the lower complexity bound of language recognition
arXiv:2406.10924v2 Announce Type: replace-cross Abstract: We introduce a pebble game extended by backtracking options for one of the two players (called Prover) and reduce the provability of the pigeonhole principle for a generic predicate $R$ in the bounded arithmetic $T^2_2(R)$ to the existence of a particular kind of winning strategy (called oblivious) for Prover in the game. While the unprovability of the said principle in $T^2_2(R)$ is an immediate consequence of a celebrated theorem of Ajtai (which deals with a stronger theory $T_2(R)$), up-to-date no methods working for $T^2_2(R)$ directly (in particular without switching lemma) are known. Although the full analysis of the introduced pebble game is left open, as a first step towards resolving it, we restrict ourselves to a simplified version of the game. In this case, Prover can use only two pebbles and move in an extremely oblivious way. Besides, a series of backtracks can be made only once during a play. Under these assumptions, we show that no strategy of Prover can be winning.
arXiv:2412.14758v2 Announce Type: replace Abstract: The development of logic has largely been through the 'deductive' paradigm: conclusions are inferred from established premisses. However, the use of logic in the context of both human and machine reasoning is typically through the dual 'reductive' perspective: collections of sufficient premisses are generated from putative conclusions. We call this paradigm 'reductive logic'. This expression of logic encompasses reasoning activities as diverse as proving a formula in a formal system and seeking to meet a friend before noon on Saturday. This paper is a semantical analysis of reductive logic. In particular, we provide mathematical foundations for representing and reasoning about 'reduction operators'. Heuristically, reduction operators may be thought of as 'backwards' inference rules. In this paper, we address their mathematical representation, how they are used in the context of reductive reasoning, and, crucially, what makes them 'valid'.
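The idea of a rule read backwards, generating sufficient premisses for a putative conclusion, can be sketched with backward chaining over a toy rule base (my own illustration; the rule names and the formalism here are hypothetical and not the paper's semantics of reduction operators).

```python
# Hypothetical sketch (not the paper's formalism): reductive reasoning as
# backward chaining. Each rule, read backwards, acts as a "reduction
# operator" turning a putative conclusion into collections of sufficient
# premisses to be established in turn.

# rules: conclusion -> list of alternative premiss collections
rules = {
    'wet':      [['rain'], ['sprinkler']],  # wet follows from rain, or from sprinkler
    'slippery': [['wet']],
}
facts = {'sprinkler'}  # established premisses

def provable(goal, depth=8):
    # deductive base case: the goal is already an established premiss
    if goal in facts:
        return True
    if depth == 0:
        return False
    # reductive step: some sufficient premiss collection is itself provable
    return any(all(provable(g, depth - 1) for g in premisses)
               for premisses in rules.get(goal, []))

print(provable('slippery'), provable('rain'))  # True False
```

Search proceeds from the conclusion down to premisses, the dual of the deductive direction; the depth bound stands in for the termination and validity concerns the abstract raises.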
arXiv:2412.15736v1 Announce Type: new Abstract: This article initiates the semantic study of distribution-free normal modal logic systems, laying the semantic foundations and anticipating further research in the area. The article explores roughly the same area, though taking a different approach, with a recent article by Bezhanishvili, de Groot, Dmitrieva and Morachini, who studied a distribution-free version of Dunn's Positive Modal Logic (PML). Unlike PML, we consider logics that may drop distribution and which are equipped with both an implication connective and modal operators. We adopt a uniform relational semantics approach, relying on recent results on representation and duality for normal lattice expansions. We prove canonicity and completeness in the relational semantics of the minimal distribution-free normal modal logic, assuming just the K-axiom, as well as of its axiomatic extensions obtained by adding any of the D, T, B, S4 or S5 axioms. Adding distribution can be easily accommodated and, as a side result, we also obtain a new semantic treatment of Intuitionistic Modal Logic.
arXiv:2102.06673v3 Announce Type: replace Abstract: We investigate the proof complexity of systems based on positive branching programs, i.e. non-deterministic branching programs (NBPs) where, for any 0-transition between two nodes, there is also a 1-transition. Positive NBPs compute monotone Boolean functions, just like negation-free circuits or formulas, but constitute a positive version of (non-uniform) NL, rather than P or NC1, respectively. The proof complexity of NBPs was investigated in previous work by Buss, Das and Knop, using extension variables to represent the dag-structure, over a language of (non-deterministic) decision trees, yielding the system eLNDT. Our system eLNDT+ is obtained by restricting their system to a positive syntax, similarly to how the 'monotone sequent calculus' MLK is obtained from the usual sequent calculus LK by restricting to negation-free formulas. Our main result is that eLNDT+ polynomially simulates eLNDT over positive sequents. Our proof method is inspired by a similar result for MLK by Atserias, Galesi and Pudlák, which was recently improved to a bona fide polynomial simulation via works of Jeřábek and of Buss, Kabanets, Kolokolova and Koucký. Along the way we formalise several properties of counting functions within eLNDT+ by polynomial-size proofs and, as a case study, give explicit polynomial-size proofs of the propositional pigeonhole principle.
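The positivity condition on NBPs is concrete enough to check by hand. Below is a minimal toy encoding (my own, not the paper's eLNDT+ syntax or any standard library): each inner node queries a variable and lists successor sets for reading 0 and for reading 1, positivity means every 0-successor is also a 1-successor, and a brute-force check confirms the computed function is monotone.

```python
# Toy encoding (hypothetical, not the paper's syntax) of a non-deterministic
# branching program. Positivity -- every 0-transition is also a 1-transition,
# i.e. the 0-successor set is contained in the 1-successor set -- forces the
# computed Boolean function to be monotone.

from itertools import product

# nodes: name -> (queried variable index, successors on 0, successors on 1)
# sinks: 'acc' accepts, 'rej' rejects.  This NBP computes x0 OR x1.
NBP = {
    's': (0, {'u'}, {'u', 'acc'}),       # 0-succs subset of 1-succs: positive
    'u': (1, {'rej'}, {'rej', 'acc'}),
}

def accepts(nbp, x, node='s'):
    # non-deterministic acceptance: some path from `node` reaches 'acc'
    if node == 'acc':
        return True
    if node == 'rej':
        return False
    var, zero, one = nbp[node]
    succs = one if x[var] else zero
    return any(accepts(nbp, x, s) for s in succs)

def is_positive(nbp):
    return all(zero <= one for (_, zero, one) in nbp.values())

def is_monotone(nbp, n):
    # brute force: x <= y pointwise implies accepts(x) => accepts(y)
    return all(accepts(nbp, y) or not accepts(nbp, x)
               for x in product([0, 1], repeat=n)
               for y in product([0, 1], repeat=n)
               if all(a <= b for a, b in zip(x, y)))

print(is_positive(NBP), is_monotone(NBP, 2))  # positivity implies monotonicity here
```

Flipping any input bit from 0 to 1 only enlarges the set of available transitions, which is the intuition behind the monotonicity claim in the abstract.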
arXiv:2412.16152v1 Announce Type: cross Abstract: This paper proves a homomorphism between extensional formal semantics and distributional vector space semantics, demonstrating structural compatibility. Formal semantics models meaning as reference, using logical structures to map linguistic expressions to truth conditions, while distributional semantics represents meaning through word vectors derived from contextual usage. By constructing injective mappings that preserve semantic relationships, we show that every semantic function in an extensional model corresponds to a compatible vector space operation. This result respects compositionality and extends to function compositions, constant interpretations, and $n$-ary relations. Rather than pursuing unification, we highlight a mathematical foundation for hybrid cognitive models that integrate symbolic and sub-symbolic reasoning and semantics. These findings support multimodal language processing, aligning 'meaning as reference' (Frege, Tarski) with 'meaning as use' (Wittgenstein, Firth).
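A toy sketch of the kind of correspondence described above (the model, the predicate, and the one-hot encoding are my own hypothetical choices, not the paper's construction): individuals of a tiny extensional model are mapped injectively to vectors, a unary predicate becomes the indicator vector of its extension, and the truth condition is recovered as an inner product.

```python
# Hypothetical sketch (not the paper's construction): embedding a tiny
# extensional model into a vector space so that predicate application
# corresponds to a compatible vector operation.

domain = ['alice', 'bob', 'carol']
runs = {'alice', 'carol'}   # extension of the unary predicate "runs"

# injective mapping: each individual gets a distinct one-hot vector
vec = {e: [1.0 if j == i else 0.0 for j in range(len(domain))]
       for i, e in enumerate(domain)}

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# vector-space counterpart of the predicate: indicator vector of its extension
runs_vec = [sum(vec[e][j] for e in runs) for j in range(len(domain))]

def runs_formal(e):
    # truth condition in the extensional model: membership in the extension
    return e in runs

def runs_distributional(e):
    # the same truth condition recovered as an inner product in vector space
    return dot(runs_vec, vec[e]) > 0.5

# the mapping preserves the semantic relation: both notions agree everywhere
print(all(runs_formal(e) == runs_distributional(e) for e in domain))  # True
```

One-hot vectors are of course not distributional word vectors; the point of the sketch is only the structure-preservation claim, that each semantic function has a compatible vector-space counterpart.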