
09:00 | PRESENTER: Lorenzo Ceragioli ABSTRACT. Security Enhanced Linux (SELinux) is a security architecture for Linux implementing mandatory access control. It has been used in numerous security-critical contexts ranging from servers to mobile devices. Yet using it effectively is challenging, as SELinux security policies are difficult to write, understand, and maintain. Recently, the intermediate language CIL was introduced to foster the development of high-level policy languages and to write structured configurations. However, CIL lacks mechanisms for ensuring that the resulting configurations obey desired information flow policies. To remedy this, we propose IFCIL, a backward-compatible extension of CIL for specifying fine-grained information flow requirements for CIL configurations. Using IFCIL, administrators can express, e.g., confidentiality, integrity, and non-interference properties. We also provide a tool to statically verify these requirements. |

09:30 | Towards a General-Purpose Dynamic Information Flow Policy PRESENTER: Peixuan Li ABSTRACT. Noninterference offers a rigorous end-to-end guarantee for the secure propagation of information. However, real-world systems almost always involve security requirements that change during program execution, making noninterference inapplicable. Prior work alleviates this limitation to some extent, but even for a veteran in information flow security, understanding the subtleties in the syntax and semantics of each policy is challenging, largely due to their very different policy specification languages and, more fundamentally, semantic requirements. We take a top-down approach and present a novel information flow policy, called Dynamic Release, which allows information flow restrictions to downgrade and upgrade in arbitrary ways. Dynamic Release is formalized on a novel framework that, for the first time, allows us to compare and contrast various dynamic policies in the literature. We show that Dynamic Release generalizes declassification, erasure, endorsement and revocation. Moreover, it is the only dynamic policy that is both applicable and correct on a benchmark of tests involving dynamic policies. |

10:00 | Beware of Greeks bearing entanglement? Quantum covert channels, information flow and non-local games ABSTRACT. Can quantum entanglement increase the capacity of (classical) covert channels? To one familiar with Holevo's Theorem it is tempting to think that the answer is obviously no. However, in this work we show: quantum entanglement can in fact increase the capacity of a classical covert channel, in the presence of an active adversary; on the other hand, a zero-capacity channel is not improved by entanglement, so entanglement cannot create 'purely quantum' covert channels; the problem of determining the capacity of a given channel in the presence of entanglement is undecidable; but there is an algorithm to bound the entangled capacity of a channel from above, adapted from the semi-definite hierarchy from the theory of non-local games, whose close connection to channel capacity is at the core of all of our results. |

09:00 | More on Interpolants and Explicit Definitions for Description Logics with Nominals and/or Role Inclusions PRESENTER: Andrea Mazzullo ABSTRACT. It is known that the problems of deciding the existence of Craig interpolants and of explicit definitions of concepts are both 2ExpTime-complete for standard description logics with nominals and/or role inclusions. These complexity results depend on the presence of an ontology. In this article, we first consider the case without ontologies (or, in the case of role inclusions, ontologies only containing role inclusions) and show that both the existence of Craig interpolants and of explicit definitions of concepts become coNExpTime-complete for DLs such as ALCO and ALCH. Secondly, we make a few observations regarding the size and computation of interpolants and explicit definitions, both with ontologies and without. |

09:25 | Optimal ABox Repair w.r.t. Static EL TBoxes: from Quantified ABoxes back to ABoxes (Extended Abstract) PRESENTER: Francesco Kriegel ABSTRACT. (This is an extended abstract on our paper accepted at ESWC 2022.) |

09:50 | ABSTRACT. Forgetting is an important ontology extraction technique. A variant of forgetting which has received significant attention in the literature is deductive forgetting. While deductive forgetting is attractive because it generates the forgetting view in a language with the same complexity as the language of the original ontology, it is known to be imprecise, as it may not preserve information that requires more complex languages. We study deductive forgetting with the aim of understanding this unpreserved information. We present a system that performs deductive forgetting and produces a set of axioms Δ representing the information not preserved in the forgetting view. Our system allows a new fine-grained ontology extraction process that gives the user the option to enhance the informativeness of the deductive forgetting view by appending to it axioms from Δ. |

10:15 | PRESENTER: Jonas Karge ABSTRACT. We introduce a framework that allows for the construction of sequent systems for expressive description logics extending ALC. Our framework not only covers a wide array of common description logics, but also allows for sequent systems to be obtained for extensions of description logics with special formulae that we call role relational axioms. All sequent systems are sound, complete, and possess favorable properties such as height-preserving admissibility of common structural rules and height-preserving invertibility of rules. |

09:00 | PRESENTER: Michael Bernreiter ABSTRACT. Choice logics constitute a family of propositional logics used for the representation of preferences, with qualitative choice logic (QCL) in particular being an established formalism with numerous applications in artificial intelligence. While computational properties and applications of choice logics have been studied in the literature, only a few results are known about the proof-theoretic aspects of their use. We propose a sound and complete sequent calculus for preferred model entailment in QCL, where a formula F is entailed by a QCL-theory T if F is true in all preferred models of T. The calculus is based on labeled sequent and refutation calculi, and can be easily adapted for different purposes. For instance, using the calculus as a cornerstone, calculi for other choice logics such as conjunctive choice logic (CCL) can be obtained in a straightforward way. |

09:20 | PRESENTER: Cezary Kaliszyk ABSTRACT. Lash is a higher-order automated theorem prover created as a fork of the theorem prover Satallax. The basic underlying calculus of Satallax is a ground tableau calculus whose rules use only shallow information about the terms and formulas taking part in the rule. Lash uses new, efficient C representations of vital structures and operations. Most importantly, Lash uses a C representation of (normal) terms with perfect sharing, along with a C implementation of normalizing substitutions. We describe the ways in which Lash differs from Satallax and the performance improvement of Lash over Satallax when used with analogous flag settings. With a 10s timeout, Lash outperforms Satallax on a collection of TH0 problems from the TPTP. We conclude with ideas for continuing the development of Lash. |

09:35 | PRESENTER: Julie Cailler ABSTRACT. We describe Goéland, an automated theorem prover for first-order logic that relies on a concurrent search procedure to find tableau proofs, with concurrent processes corresponding to individual branches of the tableau. Since branch closure may require instantiating free variables shared across branches, processes communicate via channels to exchange information about substitutions used for closure. We present the proof-search procedure and its implementation, as well as experimental results obtained on problems from the TPTP library. |

09:50 | PRESENTER: Dorota Leszczyńska-Jasion ABSTRACT. We discuss the results of our work on heuristics for generating minimal synthetic tableaux. We present this proof method for classical propositional logic and its implementation in Haskell. Based on mathematical insights and exploratory data analysis, we defined a heuristic that allows one to build a tableau of optimal or nearly optimal size. The proposed heuristic was first tested on a data set of over 200 thousand short formulas (length 12), and then on a number of longer formulas (nine hundred formulas of length 23). We describe the results of the data analysis and examine some tendencies. We also confront our approach with the pigeonhole principle. |

10:10 | PRESENTER: Štěpán Holub ABSTRACT. A code X is not primitivity preserving if there is a primitive list ws ∈ lists X whose concatenation is imprimitive. We formalize a full characterization of such codes in the binary case in the proof assistant Isabelle/HOL. Part of the formalization, interesting in its own right, is a description of {x,y}-interpretations of the square xx if |y| ≤ |x|. We also provide a formalized parametric solution of the related equation x^j y^k = z^ℓ. |
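The definitions in this abstract are easy to probe concretely. Below is a small Python sketch, our own illustration and entirely unrelated to the Isabelle/HOL formalization: a standard primitivity test (a word is imprimitive exactly when it occurs inside its own square at a nontrivial position) and a binary code whose primitive list concatenates to an imprimitive word.

```python
def is_primitive(w: str) -> bool:
    """A word is primitive iff it is not a proper power z^l with l >= 2.
    Classic trick: w is imprimitive iff w occurs inside ww at a position
    strictly between 0 and len(w)."""
    return (w + w).find(w, 1) == len(w)

# A binary code {x, y} that does not preserve primitivity:
x, y = "a", "bab"
ws = [x, y]          # primitive as a list (not a repetition of a shorter list)
w = "".join(ws)      # "abab" = (ab)^2, which is imprimitive
print(is_primitive(x), is_primitive(y), is_primitive(w))  # True True False
```

Both code words are primitive, yet their concatenation is a square, so the code {a, bab} is not primitivity preserving in the sense of the abstract.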

09:00 | ABSTRACT. In 1999, Alon et al. introduced the still-active research topic of approximating the frequency moments of a data stream using randomized algorithms with minimal space usage. This includes the problem of estimating the cardinality of the stream elements (the zeroth frequency moment), as well as higher-order frequency moments, which provide information about the skew of the data stream, for example critical information for parallel processing. (The k-th frequency moment of a data stream is the sum of the k-th powers of the occurrence counts of each element in the stream.) They introduced both lower and upper bounds, which were later improved by newer publications. The algorithms have guaranteed success probabilities and accuracies without making any assumptions on the input distribution. They are an interesting use case for formal verification because their correctness proofs require a large body of deep results from algebra, analysis and probability theory. This work reports on the formal verification of three algorithms for the approximation of F_0, F_2 and F_k for k ≥ 3. The results include the identification of significantly simpler algorithms with the same runtime and space complexities as the previously known ones, as well as the development of several reusable components, such as a formalization of universal hash families, amplification methods for randomized algorithms, a model for one-pass data stream algorithms, and a generic, flexible encoding library for the verification of space complexities. |
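The parenthetical definition of the k-th frequency moment can be unpacked with a short exact computation. This brute-force sketch (our own illustration; the verified algorithms approximate these values in sublinear space) just evaluates the definition:

```python
from collections import Counter

def frequency_moment(stream, k):
    """Exact k-th frequency moment: the sum of the k-th powers of the
    occurrence counts of each element in the stream."""
    counts = Counter(stream)
    if k == 0:
        return len(counts)  # F_0 = number of distinct elements
    return sum(c ** k for c in counts.values())

stream = ["a", "b", "a", "c", "a", "b"]   # counts: a=3, b=2, c=1
print(frequency_moment(stream, 0))  # 3 (distinct elements)
print(frequency_moment(stream, 1))  # 6 (the stream length)
print(frequency_moment(stream, 2))  # 3^2 + 2^2 + 1^2 = 14
```

F_2 in particular grows with the skew of the stream: a stream dominated by one heavy element has a much larger F_2 than a uniform stream of the same length.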

09:30 | Mechanizing Soundness of Off-Policy Evaluation PRESENTER: Jared Yeager ABSTRACT. There are reinforcement learning scenarios---e.g., in medicine---where we are compelled to be as confident as possible that a policy change will result in an improvement before implementing it. In such scenarios, we can employ *off-policy evaluation* (OPE). The basic idea of OPE is to record histories of behaviors under the current policy, and then develop an estimate of the quality of a proposed new policy, seeing what the behavior would have been under the new policy. As we are evaluating the policy without actually using it, we have the "off-policy" of OPE. Applying a concentration inequality to the estimate, we derive a confidence interval for the expected quality of the new policy. If the confidence interval lies above that of the current policy, we can change policies with high confidence we will do no harm. In this work, we focus on the mathematics of this method, by mechanizing the soundness of off-policy evaluation. A natural side effect of the mechanization is both to clarify all the result's mathematical assumptions and preconditions, and to further develop HOL4's library of verified statistical mathematics, including concentration inequalities. Of more significance, the OPE method relies on importance sampling, whose soundness we prove using a measure-theoretic approach. In fact, we generalize the standard result, showing it for contexts comprising both discrete and continuous probability distributions. |
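The importance-sampling idea at the core of OPE can be sketched in a toy one-step setting. The policies, rewards, and function names below are made up for illustration; the paper's mechanization covers general histories and both discrete and continuous distributions, which this sketch does not attempt.

```python
import random

# Toy one-step bandit with two actions.
b  = {"left": 0.5, "right": 0.5}   # behavior policy: used to collect data
pi = {"left": 0.2, "right": 0.8}   # target policy: the one we want to evaluate
reward = {"left": 0.0, "right": 1.0}

def ope_estimate(n, seed=0):
    """Importance-sampling estimate of pi's expected reward, computed only
    from actions sampled under the behavior policy b."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        a = "left" if rng.random() < b["left"] else "right"
        total += (pi[a] / b[a]) * reward[a]  # reweight by the likelihood ratio
    return total / n

true_value = sum(pi[a] * reward[a] for a in pi)  # 0.8
print(round(ope_estimate(100_000), 2), true_value)
```

The estimator is unbiased because E_b[(pi(a)/b(a)) r(a)] = E_pi[r(a)]; the soundness proof mechanized in the paper establishes exactly this kind of identity in full measure-theoretic generality.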

10:00 | PRESENTER: Alley Stoughton ABSTRACT. We use the EasyCrypt proof assistant to formalize the adversarial approach to proving lower bounds for computational problems in the query model. This is done using a lower bound game between an algorithm and adversary, in which the adversary answers the algorithm's queries in a way that makes the algorithm issue at least the desired number of queries. A complementary upper bound game is used for proving upper bounds of algorithms; here the adversary incrementally and adaptively realizes an algorithm's input. We prove a natural connection between the lower and upper bound games, and apply our framework to three computational problems, including searching in an ordered list and comparison-based sorting, giving evidence for the generality of our notion of algorithm and the usefulness of our framework. |
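The lower bound game described above has a classic concrete instance: searching an ordered list. The sketch below is our own toy rendering of the adversary argument, not the EasyCrypt formalization; the adversary answers every query so that the larger half of the remaining candidates stays consistent, forcing any algorithm to issue at least ceil(log2 n) queries.

```python
import math

def adversary_queries(n):
    """Adversary argument for searching among n candidate positions.
    Whatever position the algorithm queries, the adversary answers so that
    the larger side of the remaining candidates survives. Even an optimal
    algorithm, which splits the candidates evenly, therefore needs
    ceil(log2 n) queries before a single candidate remains."""
    candidates, queries = n, 0
    while candidates > 1:
        # an even split is the algorithm's best move; the adversary keeps
        # the larger half, so at least ceil(candidates / 2) survive
        candidates = (candidates + 1) // 2
        queries += 1
    return queries

for n in (5, 8, 1000):
    print(n, adversary_queries(n), math.ceil(math.log2(n)))
```

The same shape of argument, "the adversary answers queries to keep as many inputs alive as possible", is what the framework in the talk formalizes generically.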

10:00 | There and Back Again: Combining Nonmonotonic Logical Reasoning and Deep Learning on an Assistive Robot PRESENTER: Mohan Sridharan ABSTRACT. This paper describes the development of an architecture that combines non-monotonic logical reasoning and deep learning in virtual/simulated and real/physical environments for a robot assisting in a restaurant environment. Specifically, for any given goal, the architecture uses Answer Set Prolog to represent and reason with incomplete commonsense domain knowledge, providing a sequence of actions for the robot to execute. At the same time, reasoning directs the robot's learning of deep neural network models in virtual environments for human face and hand gestures. These learned models are used by the robot to recognize and translate human gestures to goals that need to be achieved. We report insights learned from the development and evaluation of this architecture by a distributed team of researchers during the ongoing pandemic. |

11:00 | PRESENTER: Kyveli Doveri ABSTRACT. We propose a novel algorithm to decide language inclusion between (nondeterministic) Büchi automata, a PSPACE-complete problem. Our approach, like others before it, leverages a notion of quasiorder to prune the search for a counterexample by discarding candidates that are subsumed by others under the quasiorder. Discarded candidates are guaranteed not to compromise the completeness of the algorithm. The novelty of our work lies in the quasiorder used to discard candidates. We introduce FORQs (families of right quasiorders), which we obtain by adapting the notion of family of right congruences put forward by Maler and Staiger in 1993. We define a FORQ-based inclusion algorithm, which we prove correct and instantiate for a specific FORQ, called the structural FORQ, induced by the Büchi automaton to the right of the inclusion sign. The resulting implementation, called FORKLIFT, scales up better than the state of the art on a variety of benchmarks, including benchmarks from program verification and theorem proving for word combinatorics. |

11:20 | PRESENTER: Thibault Dardinier ABSTRACT. The magic wand A --* B (also called separating implication) is a separation logic connective commonly used to specify properties of partial data structures, for instance during iterative traversals. A footprint of a magic wand formula A --* B is a state that, combined with any state in which A holds, yields a state in which B holds. The key challenge of proving a magic wand (also called packaging a wand) is to find such a footprint. Existing package algorithms either have a high annotation overhead or, as we show in this paper, are unsound. We present a formal framework that precisely characterises a wide design space of possible package algorithms applicable to a large class of separation logics. We prove in Isabelle/HOL that our formal framework is sound and complete, and use it to develop a novel package algorithm that offers competitive automation and is sound. Moreover, we present a novel, restricted definition of wands and prove in Isabelle/HOL that it is possible to soundly combine fractions of such wands, which is not the case for arbitrary wands. We have implemented our techniques for the Viper language, and demonstrate that they are effective in practice. |

11:40 | PRESENTER: Yong Li ABSTRACT. The determinization of a nondeterministic Büchi automaton (NBA) is a fundamental construction in automata theory, with applications to probabilistic verification and reactive synthesis. The standard determinization constructions, such as the ones based on the Safra-Piterman approach, work on the whole NBA. In this work we propose a divide-and-conquer determinization approach. To this end, we first classify the strongly connected components (SCCs) of the given NBA as inherently weak, deterministic accepting, or nondeterministic accepting. We then present how to determinize each type of SCC independently from the others; this results in an easier handling of the determinization algorithm that takes advantage of the structure of each SCC. Once all SCCs have been determinized, we show how to compose them so as to obtain the final equivalent deterministic Emerson-Lei automaton, which can be converted into a deterministic Rabin automaton without blow-up of states and transitions. We implement our algorithm in a prototype tool named ourDC and empirically evaluate it against the state-of-the-art tools on a large set of benchmarks from the literature. The experimental results show that our prototype ourDC outperforms Spot and OWL regarding the number of states and transitions. |

12:00 | PRESENTER: Alexandre Duret-Lutz ABSTRACT. Spot is a C++17 library for LTL and ω-automata manipulation, with command-line utilities and Python bindings. This paper summarizes its evolution over the past six years, since the release of Spot 2.0, which was the first version to support ω-automata with arbitrary acceptance conditions, and the last version presented at a conference. Since then, Spot has been extended with several features such as acceptance transformations, alternating automata, games, LTL synthesis, and more. We also shed some light on the data structures used to store automata. |

12:10 | PRESENTER: Barbora Šmahlíková ABSTRACT. We present the tool Ranker for complementing Büchi automata (BAs). Ranker builds on our previous optimizations of rank-based BA complementation and pushes them even further using numerous heuristics to produce even smaller automata. Moreover, it contains novel optimizations of specialized constructions for complementing (i) inherently weak automata and (ii) semi-deterministic automata, all delivered in a robust tool. The optimizations significantly improve the usability of Ranker, as shown in an extensive experimental evaluation with real-world benchmarks, where Ranker produced in the majority of cases a strictly smaller complement than other state-of-the-art tools. |

11:00 | Cracking the Stateful Nut -- Computational Proofs of Stateful Security Protocols using the Squirrel Proof Assistant PRESENTER: Adrien Koutsos ABSTRACT. Bana and Comon have proposed a logical approach to proving protocols in the computational model, which they call the Computationally Complete Symbolic Attacker (CCSA). The proof assistant Squirrel implements a verification technique that elaborates on this approach, building on a meta-logic over the CCSA base logic. In this paper, we show that this meta-logic can naturally be extended to handle protocols with mutable states (key updates, counters, etc.) and we extend Squirrel's proof system to be able to express the complex proof arguments that are sometimes required for these protocols. Our theoretical contributions have been implemented in Squirrel and validated on a number of case studies, including a proof of the YubiKey and YubiHSM protocols. |

11:30 | Exploiting Partial Order of Keys to Verify Security of a Vehicular Group Protocol PRESENTER: Felipe Boeira ABSTRACT. Vehicular networks will enable a range of novel applications to enhance road traffic efficiency and safety, and to reduce fuel consumption. As for other cyber-physical systems, security is essential to the deployment of these applications, and standardisation efforts are ongoing. In this paper, we perform a systematic security evaluation of a vehicular platooning protocol through a thorough analysis of the protocol and security standards. We tackle the complexity of the resulting model with a proof strategy based on a relation on keys. The key relation forms a partial order, which encapsulates both secrecy and authenticity dependencies. We show that our order-aware approach makes the verification feasible and proves authenticity properties along with secrecy of all keys used throughout the protocol. |

12:00 | Symbolic protocol verification with dice: process equivalences in the presence of probabilities PRESENTER: Steve Kremer ABSTRACT. Symbolic protocol verification generally abstracts probabilities away, considering computations that succeed only with negligible probability, such as guessing random numbers or breaking an encryption scheme, as impossible. This abstraction, sometimes referred to as the perfect cryptography assumption, has proven very useful as it simplifies automation of the analysis. However, probabilities may also appear in the control flow, where they are generally not negligible. In this paper we consider a framework for symbolic protocol analysis with a probabilistic choice operator: the probabilistic applied pi calculus. We define and explore the relationships between several behavioral equivalences. In particular, we show the need for randomized schedulers and exhibit a counter-example to a result in a previous work that relied on non-randomized ones. As in other frameworks that mix both non-deterministic and probabilistic choices, schedulers may sometimes be unrealistically powerful. We therefore consider two subclasses of processes that avoid this problem. In particular, when considering purely non-deterministic protocols, as is done in classical symbolic verification, we show that a probabilistic adversary has, maybe surprisingly, a strictly superior distinguishing power for may testing, which, when the number of sessions is bounded, we show to coincide with purely possibilistic similarity. |

11:00 | Charting the Borderland – Decidability in Description Logics and Beyond ABSTRACT. Decidability of inferencing is commonly considered a very important property of logic-based knowledge representation formalisms, required for the algorithmization and automation of reasoning. Yet, oftentimes, the corresponding (un)decidability arguments are idiosyncratic and do not shed much light on the underlying principles governing the divide. In my talk, I will review generic model- and proof-theoretic criteria for decidability of satisfiability and querying in fragments of first-order logic. Description logics play a central role in these considerations: On the one hand, they can serve as a simplified “testbed” inspiring decidability criteria which can then be generalized to higher arities. On the other hand, they mark a “sweet spot,” highlighting the fact that restricting to a binary setting allows for adding modeling features that would otherwise cause undecidability. |

12:00 | PRESENTER: Alisa Kovtunova ABSTRACT. State constraints in AI Planning globally restrict the legal environment states. Standard planning languages make closed-domain and closed-world assumptions. Here we address open-world state constraints formalized by planning over a description logic (DL) ontology. Previously, this combination of DL and planning has been investigated for the lightweight DL DL-Lite. Here we propose a novel compilation scheme into standard PDDL with derived predicates, which applies to more expressive DLs and is based on the rewritability of DL queries into Datalog with stratified negation. We also provide a new rewritability result for the DL Horn-ALCHOIQ, which allows us to apply our compilation scheme to quite expressive ontologies. In contrast, we show that in the slight extension Horn-SROIQ no such compilation is possible unless the weak exponential hierarchy collapses. Finally, we show that our approach can outperform previous work on existing benchmarks for planning with DL ontologies, and is feasible on new benchmarks that take advantage of more expressive ontologies. This is an abstract of a paper accepted at AAAI-22. |

11:00 | PRESENTER: Vitor Rodrigues Greati ABSTRACT. The characterizing properties of a proof-theoretical presentation of a given logic may hang on the choice of proof formalism, on the shape of the logical rules and of the sequents manipulated by a given proof system, on the underlying notion of consequence, and even on the expressiveness of its linguistic resources and on the logical framework into which it is embedded. Standard (one-dimensional) logics determined by (non-deterministic) logical matrices are known to be axiomatizable by analytic and possibly finite proof systems as soon as they turn out to satisfy a certain constraint of sufficient expressiveness. In this paper we introduce a recipe for cooking up a two-dimensional logical matrix (or B-matrix) by the combination of two (possibly partial) non-deterministic logical matrices. We will see that such a combination may result in B-matrices satisfying the property of sufficient expressiveness even when the input matrices are not sufficiently expressive in isolation, and we will use this to show that one-dimensional logics that are not finitely axiomatizable may inhabit finitely axiomatizable two-dimensional logics, thus becoming finitely axiomatizable by the addition of an extra dimension. We will illustrate the said construction using a well-known logic of formal inconsistency called mCi. We will first prove that this logic is not finitely axiomatizable by a one-dimensional (generalized) Hilbert-style system. Then, taking advantage of a known 5-valued non-deterministic logical matrix for this logic, we will combine it with another one, conveniently chosen so as to give rise to a B-matrix that is axiomatized by a two-dimensional Hilbert-style system that is both finite and analytic. |

11:20 | ABSTRACT. The ontology of Leśniewski is commonly regarded as the most comprehensive calculus of names and the theoretical basis of mereology. However, ontology has so far not been examined by means of proof-theoretic methods. In the paper we provide a characterization of elementary ontology as a sequent calculus satisfying the desiderata usually formulated for decent systems in modern structural proof theory. In particular, the cut elimination theorem is proved, and a version of the subformula property holds for the cut-free version. |

11:40 | PRESENTER: Marianna Girlando ABSTRACT. We propose a cut-free cyclic system for Transitive Closure Logic (TCL) based on a form of hypersequents, suitable for automated reasoning via proof search. We show that previously proposed sequent systems are cut-free incomplete for basic validities from Kleene Algebra (KA) and Propositional Dynamic Logic (PDL), over standard translations. On the other hand, our system faithfully simulates known cyclic systems for KA and PDL, thereby inheriting their completeness results. A peculiarity of our system is its richer correctness criterion, requiring alternating automata for proof checking and necessitating a more intricate soundness argument than for traditional cyclic proofs. |

12:00 | ABSTRACT. I introduce substitutive sets, which are algebraic structures axiomatizing fundamental properties of variable-for-variable substitution on syntax with bindings. Substitutive sets compare favorably in some respects with the well-known foundation based on nominal sets. In particular, substitution is a more fundamental operator than the nominal swapping operator and enjoys a simpler, equationally expressed relationship with the variable-freshness predicate. Together with some natural axioms matching properties of the syntactic constructors, substitutive sets yield a truly minimalistic characterization of lambda-calculus terms as an abstract datatype, hence a recursion principle -- the first one in the literature that involves only unconditional equations on the constructors and substitution. Similarly to the case of nominal sets, an improvement of this recursion principle is possible, incorporating Barendregt’s variable convention. When interpreting syntax in semantic domains, my substitution-based recursor is easier to deploy than the nominal recursor. My results have been validated with the proof assistant Isabelle/HOL. |
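For readers who want a concrete anchor for the substitution operator that substitutive sets axiomatize, here is the standard capture-avoiding substitution on raw lambda terms. This encoding is our own illustration (names and representation are made up), not the paper's Isabelle/HOL development, which works with an abstract datatype rather than raw terms.

```python
import itertools

# Lambda terms: ('var', x) | ('app', t, u) | ('lam', x, body)

def free_vars(t):
    tag = t[0]
    if tag == 'var':
        return {t[1]}
    if tag == 'app':
        return free_vars(t[1]) | free_vars(t[2])
    return free_vars(t[2]) - {t[1]}  # 'lam' binds t[1]

def fresh(avoid):
    """First variable v0, v1, ... not in the avoid set."""
    return next(v for v in (f"v{i}" for i in itertools.count()) if v not in avoid)

def subst(t, x, s):
    """Capture-avoiding substitution t[x := s]."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == x else t
    if tag == 'app':
        return ('app', subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t                      # x is shadowed; nothing to substitute
    if y in free_vars(s):             # rename the binder to avoid capture
        z = fresh(free_vars(s) | free_vars(body) | {x})
        body = subst(body, y, ('var', z))
        y = z
    return ('lam', y, subst(body, x, s))

# (\y. x)[x := y] renames the binder instead of capturing y:
print(subst(('lam', 'y', ('var', 'x')), 'x', ('var', 'y')))
```

The renaming clause is exactly the kind of bookkeeping the abstract's algebraic axioms are designed to hide behind a clean equational interface.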

11:00 | Verifying a Sequent Calculus Prover for First-Order Logic with Functions in Isabelle/HOL PRESENTER: Asta Halkjær From ABSTRACT. We describe the design, implementation and verification of an automated theorem prover for first-order logic with functions. The proof search procedure is based on sequent calculus and we formally verify its soundness and completeness in Isabelle/HOL using an existing abstract framework for coinductive proof trees. Our analytic completeness proof covers both open and closed formulas. Since our deterministic prover considers only the subset of terms relevant to proving a given sequent, we do so as well when building a countermodel from a failed proof. Finally, we formally connect our prover with the proof system and semantics of the existing SeCaV system. In particular, the prover can generate human-readable SeCaV proofs which are also machine-verifiable proof certificates. |
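To make the flavor of sequent-calculus proof search concrete, here is a much simpler propositional analogue: a G3-style backward search where every rule is invertible, so the order of rule application does not matter. This toy encoding is our own sketch, not the verified SeCaV prover, which handles full first-order logic with functions and produces countermodels from failed branches.

```python
def prove(gamma, delta):
    """Backward proof search for the classical sequent gamma |- delta.
    Formulas: atoms as strings, or ('not', A), ('and', A, B),
    ('or', A, B), ('imp', A, B). Terminates: every rule replaces a
    formula by strict subformulas."""
    # Axiom: some atom occurs on both sides.
    for f in gamma:
        if isinstance(f, str) and f in delta:
            return True
    # Left rules.
    for i, f in enumerate(gamma):
        if not isinstance(f, str):
            rest, op = gamma[:i] + gamma[i + 1:], f[0]
            if op == 'not':
                return prove(rest, delta + [f[1]])
            if op == 'and':
                return prove(rest + [f[1], f[2]], delta)
            if op == 'or':
                return prove(rest + [f[1]], delta) and prove(rest + [f[2]], delta)
            if op == 'imp':
                return prove(rest, delta + [f[1]]) and prove(rest + [f[2]], delta)
    # Right rules.
    for i, f in enumerate(delta):
        if not isinstance(f, str):
            rest, op = delta[:i] + delta[i + 1:], f[0]
            if op == 'not':
                return prove(gamma + [f[1]], rest)
            if op == 'and':
                return prove(gamma, rest + [f[1]]) and prove(gamma, rest + [f[2]])
            if op == 'or':
                return prove(gamma, rest + [f[1], f[2]])
            if op == 'imp':
                return prove(gamma + [f[1]], rest + [f[2]])
    return False  # only atoms left and no axiom: a countermodel exists

# Peirce's law ((p -> q) -> p) -> p is classically valid:
peirce = ('imp', ('imp', ('imp', 'p', 'q'), 'p'), 'p')
print(prove([], [peirce]))               # True
print(prove([], [('imp', 'p', 'q')]))    # False
```

A failed search leaves a purely atomic sequent with disjoint sides, which immediately yields a countermodel; the talk's completeness proof generalizes this idea to open and closed first-order formulas.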

11:30 | PRESENTER: Johannes Hostert ABSTRACT. We develop and mechanize compact proofs of the undecidability of various problems for dyadic first-order logic over a small logical fragment. In this fragment, formulas are restricted to only a single binary relation, and a minimal set of logical connectives. We show that validity, satisfiability, and provability, along with finite satisfiability and finite validity are undecidable, by directly reducing from a suitable binary variant of Diophantine constraints satisfiability. Our results improve upon existing work in two ways: First, the reductions are direct and significantly more compact than existing ones. Secondly, the undecidability of the small logic fragment of dyadic first-order logic was not mechanized before. We contribute our mechanization to the Coq Library of Undecidability Proofs, utilizing its synthetic approach to computability theory. |

12:00 | ABSTRACT. The back-and-forth method is a well-known technique to establish isomorphisms of countable structures. In this proof pearl, we formalise this method abstractly in the framework of constructive type theory, emphasising the computational interpretation of the constructed isomorphisms. As prominent instances, we then deduce Cantor's and Myhill's isomorphism theorems on dense linear orders and one-one interreducible sets, respectively. By exploiting the symmetry of the abstract argument, our approach yields a particularly compact mechanisation of the method itself as well as its two instantiations, all implemented using the Coq proof assistant. As adequate for a proof pearl, we attempt to make the text and mechanisation accessible for a general mathematical audience. |

Argumentation

11:00 | Argumentation Frameworks induced by Assumption-based Argumentation: Relating Size and Complexity PRESENTER: Markus Ulbricht ABSTRACT. A key ingredient of computational argumentation in AI is the generation of arguments in favor or against claims under scrutiny. In this paper we look at the complexity of the argument generation procedure in the prominent structured formalism of assumption-based argumentation (ABA). We show several results connecting expressivity of ABA fragments and number of constructed arguments. First, for several NP-hard fragments of ABA, the number of generated arguments is not bounded polynomially. Even under equivalent rewritings of the given ABA framework there are situations where one cannot avoid an exponential blow-up. We establish a weaker notion of equivalence under which this blow-up can be avoided. As a general tool for analyzing ABA frameworks and resulting arguments and their conflicts, we extend results regarding dependency graphs of ABA frameworks, from which one can infer structural properties on the induced attacks among arguments. |

11:30 | Bipolar Argumentation Frameworks with Explicit Conclusions: Connecting Argumentation and Logic Programming PRESENTER: Fabio Cozman ABSTRACT. We introduce a formalism for bipolar argumentation frameworks that combines different proposals from the literature and results in a one-to-one correspondence with logic programming. We derive the correspondence by presenting translation algorithms from one formalism to the other and by evaluating the semantic equivalences between them. We also show that the bipolar model encapsulates distinct interpretations of the support relations studied in the literature. |

12:00 | From Weighted Conditionals to a Gradual Argumentation Semantics and back ABSTRACT. A fuzzy multipreference semantics has been recently proposed for weighted conditional knowledge bases with typicality, and used to develop a logical semantics for Multilayer Perceptrons, by regarding a deep neural network (after training) as a weighted conditional knowledge base. Based on different variants of this semantics, we propose some new gradual argumentation semantics, and relate them to the family of gradual semantics. The paper also suggests an approach for defeasible reasoning over a weighted argumentation graph, building on the proposed semantics. |

Lunches will be held in Taub lobby (CAV, CSF) and in The Grand Water Research Institute (DL, NMR, IJCAR, ITP).

Tool demonstrations for:

- STLmc: Robust STL Model Checking of Hybrid Systems using SMT (13:00-13:30)
- UCLID5: Multi-Modal Formal Modeling, Verification, and Synthesis (13:30-14:00)

14:00 | PRESENTER: Andres Noetzli ABSTRACT. In the past decade, SMT solvers have been extended to support the theory of strings and regular expressions, and have been applied successfully in a number of applications. To accommodate the expressive nature of string constraints used in practice, string solvers use a multi-layered architecture where extended operators are reduced to a set of core operators. These reductions are often costly to reason about. In this work, we propose new techniques for eagerly discovering conflicts based on equality reasoning, and lazily avoiding reductions for certain extended functions based on lightweight reasoning. We present a strategy for integrating and scheduling these techniques in a CDCL(T)-based theory solver for strings and regular expressions. We implement these techniques and the strategy in cvc5, a state-of-the-art SMT solver, and show that they lead to a significant performance improvement with respect to the state of the art. |

14:20 | PRESENTER: Shaowei Cai ABSTRACT. Satisfiability Modulo Linear Integer Arithmetic, SMT(LIA) for short, has significant applications in many domains. In this paper, we develop the first local search algorithm for SMT(LIA) by directly operating on variables, breaking through the traditional framework. We propose a local search framework by considering the distinctions between Boolean and integer variables. Moreover, we design a novel operator and scoring functions tailored for LIA, and propose a two-level operation selection heuristic. Putting these together, we develop a local search SMT(LIA) solver called LS-LIA. Experiments are carried out to evaluate LS-LIA on benchmarks from SMTLIB and two benchmark sets generated from job shop scheduling and data race detection. The results show that LS-LIA is competitive and complementary with state-of-the-art SMT solvers, and performs particularly well on those formulae with only integer variables. A simple sequential portfolio with Z3 improves the state-of-the-art on satisfiable benchmark sets of LIA and IDL benchmarks from SMT-LIB. LS-LIA also solves Job Shop Scheduling benchmarks substantially faster than traditional complete SMT solvers. |
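The core move-and-score loop behind a local search over integer assignments can be sketched as follows. This is an illustrative hill-climbing toy, not LS-LIA's actual operators or scoring functions; the constraint encoding, the unit +/-1 moves, and the plateau heuristic are all simplifying assumptions:

```python
import random

def violation(constraint, assignment):
    """How far a constraint sum(coef * var) <= bound is from holding (0 if it holds)."""
    coeffs, bound = constraint
    lhs = sum(c * assignment[v] for v, c in coeffs.items())
    return max(0, lhs - bound)

def local_search(constraints, variables, max_steps=10000, seed=0):
    """Hill-climbing over integer assignments: apply the +/-1 move that most
    reduces total violation; perturb randomly on plateaus."""
    rng = random.Random(seed)
    assignment = {v: 0 for v in variables}
    for _ in range(max_steps):
        total = sum(violation(c, assignment) for c in constraints)
        if total == 0:
            return assignment  # all constraints satisfied
        best = None
        for v in variables:
            for delta in (1, -1):
                assignment[v] += delta
                score = sum(violation(c, assignment) for c in constraints)
                assignment[v] -= delta
                if best is None or score < best[0]:
                    best = (score, v, delta)
        if best[0] < total:
            assignment[best[1]] += best[2]
        else:  # no improving single move: random perturbation
            assignment[rng.choice(variables)] += rng.choice((1, -1))
    return None

# x + y <= 5, x >= 2 (written as -x <= -2), y >= 1 (written as -y <= -1)
cs = [({'x': 1, 'y': 1}, 5), ({'x': -1}, -2), ({'y': -1}, -1)]
sol = local_search(cs, ['x', 'y'])  # e.g. {'x': 2, 'y': 1}
```

Real solvers such as LS-LIA replace the naive +/-1 moves with tailored operators and much richer scoring, but the satisfy-or-move skeleton is the same.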

14:40 | PRESENTER: Gennaro Parlato ABSTRACT. Reasoning about data structures requires powerful logics supporting the combination of structural and data properties. We define a new logic called MSO-D (Monadic Second Order logic with Data) as an extension of standard MSO on trees with predicates of the desired data logic. We also define a new class of *symbolic data tree automata* (SDTAs) to deal with data trees using a simple machine. MSO-D and SDTAs are both Turing-powerful, and their high expressiveness is necessary to deal with interesting data structures. We cope with undecidability by encoding SDTA executions as a CHC (Constrained Horn Clause) system, and solving the resulting system using off-the-shelf solvers. In particular, we identify a fragment of MSO-D whose satisfiability can be effectively reduced to the emptiness problem for SDTAs. This fragment is very expressive since it allows us to characterize a variety of data trees from the literature, capture extensions of temporal logics that involve data, games, etc. We implement this reduction in a prototype tool that combines an MSO decision procedure over trees (MONA) with a CHC engine (Z3), and use this tool to conduct several experiments, demonstrating the effectiveness of our approach across multiple problem domains. |

15:00 | PRESENTER: Joshua M. Cohen ABSTRACT. Most methods of data transmission and storage are prone to errors, leading to data loss. Forward erasure correction (FEC) is a method to allow data to be recovered in the presence of errors by encoding the data with redundant parity information determined by an error-correcting code. There are dozens of classes of such codes, many based on sophisticated mathematics, making them difficult to verify using automated tools. In this paper, we present a formal, machine-checked proof of a C implementation of FEC based on Reed-Solomon coding. The C code has been actively used in network defenses for over 25 years, but the algorithm it implements was partially unpublished, and it uses certain optimizations whose correctness was unknown even to the code's authors. We use Coq's Mathematical Components library to prove the algorithm's correctness and the Verified Software Toolchain to prove that the C program correctly implements this algorithm, connecting both using a modular, well-encapsulated structure that could easily be used to verify a high-speed, hardware version of this FEC. This is the first end-to-end, formal proof of a real-world FEC implementation; we verified all previously unknown optimizations and found a latent bug in the code. |
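For intuition about erasure coding, the simplest possible FEC scheme, a single XOR parity block, can be sketched as below. Reed-Solomon, the code verified in the paper, generalizes this idea with arithmetic over a finite field to tolerate multiple erasures; this sketch tolerates exactly one:

```python
from functools import reduce

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

def encode(blocks):
    """Append one parity block: the XOR of all data blocks."""
    return blocks + [reduce(xor_bytes, blocks)]

def recover(coded, lost_index):
    """Rebuild a single lost block as the XOR of all surviving blocks."""
    survivors = [b for i, b in enumerate(coded) if i != lost_index]
    return reduce(xor_bytes, survivors)

data = [b"abcd", b"efgh", b"ijkl"]
coded = encode(data)
restored = recover(coded, 1)  # any one lost block is recoverable
```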

15:20 | PRESENTER: Shenghao Yuan ABSTRACT. RIOT is a micro-kernel dedicated to IoT applications that adopts eBPF (extended Berkeley Packet Filters) to implement so-called femto-containers: as micro-controllers rarely feature hardware memory protection, the isolation of eBPF virtual machines (VM) is critical to ensure system integrity against potentially malicious programs. This paper proposes a methodology to directly derive the verified C implementation of an eBPF virtual machine from a Gallina specification within the Coq proof assistant. Leveraging the formal semantics of the CompCert C compiler, we obtain an end-to-end theorem stating that the C code of our VM inherits the safety and security properties of its Gallina specification. Our refinement methodology ensures that the isolation property of the specification holds in the verified C implementation. Preliminary experiments demonstrate satisfactory performance. |

15:40 | Hemiola: A DSL and Verification Tools to Guide Design and Proof of Hierarchical Cache-Coherence Protocols PRESENTER: Joonwon Choi ABSTRACT. Cache-coherence protocols have been one of the greatest challenges in formal verification of hardware, due to their central complication of executing multiple memory-access transactions concurrently within a distributed message-passing system. In this paper, we introduce Hemiola, a framework embedded in Coq that guides the user to design protocols that never experience inconsistent interleavings while handling transactions concurrently. The framework provides a DSL, where any protocol designed in the DSL always satisfies the serializability property, allowing a user to verify the protocol assuming that transactions are executed one-at-a-time. Hemiola also provides a novel invariant proof method, for protocols designed in Hemiola, that only requires considering execution histories without interleaved memory accesses. We used Hemiola to design and prove hierarchical MSI and MESI protocols as case studies. We also demonstrated that the case-study protocols are hardware-synthesizable, by using a compilation/synthesis toolchain targeting FPGAs. |

14:00 | PRESENTER: Yevgeny Kazakov ABSTRACT. We present two alternative algorithms for computing (all or some) solutions to the concept abduction problem: one algorithm is based on Reiter's hitting set tree algorithm, whereas the other relies on a SAT encoding. In contrast to previous work, the algorithms do not rely on a refutation-based calculus and, hence, can also be used with efficient reasoners for tractable DLs such as EL and its extensions. An adaptation to other forms of (logic-based) abduction, e.g., to ABox abduction, is also possible. |

14:25 | PRESENTER: Patrick Koopmann ABSTRACT. When working with description logic ontologies, understanding entailments derived by a description logic reasoner is not always straightforward. So far, the standard ontology editor Protégé offers two services to help: (black-box) justifications for OWL 2 DL ontologies, and (glass-box) proofs for lightweight OWL EL ontologies, where the latter exploits the proof facilities of the reasoner ELK. Since justifications are often insufficient in explaining inferences, there is thus little tool support for explaining inferences in more expressive DLs. In this paper, we introduce EVEE-LIBS, a Java library for computing proofs for DLs up to ALCH, and EVEE-PROTEGE, a collection of Protégé plugins for displaying those proofs in Protégé. We also give a short glimpse of the latest version of EVONNE, a more advanced standalone application for displaying and interacting with proofs computed with EVEE-LIBS. |

14:50 | PRESENTER: Yevgeny Kazakov ABSTRACT. Propositional SAT solvers have been a popular way of computing justifications for ontological entailment -- minimal subsets of axioms of the ontologies that entail a given conclusion. Most SAT encodings proposed for Description Logics (DLs) translate the inferences obtained by a consequence-based procedure to propositional Horn clauses, using which entailments from subsets of axioms can be effectively checked, and use modified SAT solvers to systematically search over these subsets. To avoid repeatedly discovering subsets whose entailment has already been checked, the modified SAT solvers add special blocking clauses that prevent generating truth assignments corresponding to these subsets, the number of which can be exponential, even if the number of justifications is small. In this paper, we propose alternative SAT encodings that avoid the generation of unnecessary blocking clauses. Unlike in previous methods, the inferences are used not only for checking entailment from subsets of axioms, but also, as part of the encoding, to ensure that the SAT solver generates truth assignments corresponding only to justifications. |

15:15 | PRESENTER: Martin Homola ABSTRACT. As abduction is getting more attention in the world of ontologies, multiple abduction solvers for DL have been developed. So far, however, there has been no attempt at a unified API that would enable the integration of any DL abduction solver into an application – much in the spirit of the well-known OWL API that is now implemented by most deductive DL reasoners. Abstracting the common functionality of multiple DL abduction solvers, we introduce the DL Abduction API, which we hope can help fill this space. |

15:30 | PRESENTER: Guendalina Righetti ABSTRACT. Weighted Threshold Operators are n-ary operators that compute a weighted sum of their arguments and verify whether it reaches a certain threshold. They have been extensively studied in the area of circuit complexity theory, as well as in the neural network community under the name of perceptrons. In Knowledge Representation, they have been introduced in the context of standard DL languages by adding a new constructor, the Tooth operator. Tooth-operators have been shown to behave like linear classification models. Thus, they can play a role in bridging symbolic and sub-symbolic reasoning approaches. In particular, tooth expressions can provide a powerful yet natural tool to represent local explanations of black-box classifiers in the context of Explainable AI. In this paper, we present the result of a user study in which we evaluated the interpretability of tooth expressions, and we compared them with Disjunctive Normal Forms (DNF). In the user study, we asked respondents with different backgrounds to perform distinct classification tasks using concepts represented either as tooth expressions or as different types of DNF formulas. We evaluated interpretability through accuracy, response time, confidence, and perceived understandability by human users. We expected tooth expressions to be generally more interpretable than DNFs. In line with our hypothesis, the study revealed that tooth expressions are generally faster to use, and that they are perceived as more understandable by users who are less familiar with logic. Our study also showed that the type of task, the type of DNF, and the background of the respondents affect the interpretability of the formalism used. |
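The perceptron-like behaviour of a weighted threshold ("tooth") operator can be illustrated with a toy sketch; the concepts, weights, and threshold below are invented for illustration and are not taken from the study:

```python
def tooth(weighted_concepts, threshold, individual):
    """Weighted threshold ('tooth') operator: an individual belongs to the
    composite concept iff the weighted sum of the member concepts it
    satisfies reaches the threshold, i.e. a linear classifier over concepts."""
    score = sum(w for concept, w in weighted_concepts if concept(individual))
    return score >= threshold

# Invented toy concepts over dict-encoded individuals:
has_wings = lambda x: x.get("wings", False)
lays_eggs = lambda x: x.get("eggs", False)
can_fly   = lambda x: x.get("fly", False)

bird_like = [(has_wings, 1.0), (lays_eggs, 0.5), (can_fly, 1.0)]  # threshold 2.0

sparrow = {"wings": True, "eggs": True, "fly": True}   # score 2.5 -> in concept
penguin = {"wings": True, "eggs": True, "fly": False}  # score 1.5 -> not in concept
```

An equivalent DNF for the same classification would need one disjunct per qualifying combination of features, which is what the study's interpretability comparison probes.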

15:45 | PRESENTER: Bernardo Alkmim ABSTRACT. In this paper, we present a new labelled Natural Deduction calculus for the logic iALC, an intuitionistic description logic with nominals originally designed to reason over laws and other normative sentences in general. Even though this logic already has a formalised Sequent Calculus system, in practice Natural Deduction is better suited to making proofs explainable, which is further aided by our use of labels. Finally, we prove soundness and normalisation for the new Natural Deduction system, and show its completeness. |

14:00 | PRESENTER: Chaitanya Mangla ABSTRACT. A strategy schedule allocates time to different proof strategies in a theorem prover. We employ Bayesian statistics to propose a strategy schedule for each proof attempt. Tested on the TPTP problem library, our method yields a time saving of more than 50%. By extending this method to optimise the fixed time allocations to each strategy, we obtain a notable increase in the number of theorems proved. |
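The flavour of posterior-guided time allocation can be illustrated with a toy Beta-Bernoulli sketch; the strategy names, the Beta(1,1) prior, and the proportional-allocation rule are assumptions for illustration, not the paper's actual model:

```python
def schedule(history, budget):
    """Split a time budget across prover strategies in proportion to a
    Beta(1,1) posterior-mean estimate of each strategy's success rate."""
    # history: strategy -> (proofs found, attempts)
    posterior = {s: (wins + 1) / (tries + 2) for s, (wins, tries) in history.items()}
    total = sum(posterior.values())
    return {s: budget * p / total for s, p in posterior.items()}

alloc = schedule({"strat_a": (8, 10), "strat_b": (2, 10)}, budget=60.0)
# posterior means 0.75 vs 0.25, so strat_a gets 45s and strat_b gets 15s
```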

14:20 | ABSTRACT. Treating a saturation-based automatic theorem prover (ATP) as a Las Vegas randomized algorithm is a way to illuminate the chaotic nature of proof search and make it amenable to study by probabilistic tools. On a series of experiments with the ATP Vampire, the paper showcases some implications of this perspective for prover evaluation. |

14:35 | PRESENTER: Christoph Weidenbach ABSTRACT. A clause C is syntactically relevant in some clause set N if it occurs in every refutation of N. A clause C is syntactically semi-relevant if it occurs in some refutation of N. While syntactic relevance coincides with satisfiability, i.e., if C is syntactically relevant then N \ {C} is satisfiable, the semantic counterpart for syntactic semi-relevance was not known so far. Using the new notion of a conflict literal, we show that for independent clause sets N a clause C is syntactically semi-relevant in the clause set N if and only if it adds to the number of conflict literals in N. A clause set is independent if no clause in the set is a consequence of other clauses from the set. Furthermore, we relate the notion of relevance to that of a minimal unsatisfiable subset (MUS) of some independent clause set N. In propositional logic, a clause C is relevant if it occurs in all MUSes of some clause set N and semi-relevant if it occurs in some MUS. For first-order logic, the characterization needs to be refined with respect to ground instances of N and C. |
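The propositional MUS characterization can be checked by brute force on a small example. This sketch is illustrative only (exponential enumeration over tuple-encoded clauses, where a positive integer n stands for the variable n and -n for its negation):

```python
from itertools import combinations, product

def satisfiable(clauses):
    """Brute-force SAT check; clauses are tuples of nonzero integer literals."""
    vars_ = sorted({abs(l) for c in clauses for l in c})
    for vals in product([False, True], repeat=len(vars_)):
        a = dict(zip(vars_, vals))
        if all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def muses(clauses):
    """All minimal unsatisfiable subsets, by size-increasing enumeration:
    an unsat subset containing no previously found MUS is itself minimal."""
    found = []
    for k in range(1, len(clauses) + 1):
        for sub in combinations(clauses, k):
            if not satisfiable(sub) and not any(set(m) <= set(sub) for m in found):
                found.append(sub)
    return found

# x, x -> y, not y, plus an unrelated clause z
N = [(1,), (-1, 2), (-2,), (3,)]
ms = muses(N)
relevant      = [c for c in N if all(c in m for m in ms)]  # in every MUS
semi_relevant = [c for c in N if any(c in m for m in ms)]  # in some MUS
```

Here the single MUS is {x, x -> y, not y}, so those three clauses are both relevant and semi-relevant, while the unrelated clause z is neither.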

14:55 | Equational Unification and Matching, and Symbolic Reachability Analysis in Maude 3.2 (System Description) PRESENTER: Santiago Escobar ABSTRACT. Equational unification and matching are fundamental mechanisms in many automated deduction applications. Supporting them efficiently for as wide a class of equational theories as possible, and in a typed manner supporting type hierarchies, benefits many applications; but this is both challenging and nontrivial. We present Maude 3.2’s efficient support of these features as well as of symbolic reachability analysis of infinite-state concurrent systems based on them. |

15:10 | PRESENTER: Temur Kutsia ABSTRACT. Anti-unification aims at computing generalizations for given terms, retaining their common structure and abstracting differences by variables. We study quantitative anti-unification where the notion of the common structure is relaxed into "proximal" up to the given degree with respect to the given fuzzy proximity relation. Proximal symbols may have different names and arities. We develop a generic set of rules for computing minimal complete sets of approximate generalizations and study their properties. Depending on the characterizations of proximities between symbols and the desired forms of solutions, these rules give rise to different versions of concrete algorithms. |
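For contrast with the paper's quantitative setting, the classical crisp case, syntactic anti-unification in the style of Plotkin's least general generalization, can be sketched as follows; the term encoding (strings for constants, tuples for applications) is an assumption of this sketch:

```python
def lgg(s, t, store=None):
    """Least general generalization of two first-order terms.
    A term is a string (constant) or a tuple (function symbol, arg1, ..., argn)."""
    if store is None:
        store = {}  # disagreement pair -> variable, reused for repeated pairs
    if s == t:
        return s
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        # Same head symbol and arity: generalize argument-wise.
        return (s[0],) + tuple(lgg(a, b, store) for a, b in zip(s[1:], t[1:]))
    # Clashing symbols: abstract the disagreement pair by a variable.
    if (s, t) not in store:
        store[(s, t)] = f"X{len(store)}"
    return store[(s, t)]

# lgg(f(a, g(b)), f(c, g(b))) = f(X0, g(b))
general = lgg(("f", "a", ("g", "b")), ("f", "c", ("g", "b")))
```

The quantitative version described in the abstract relaxes the "same head symbol and arity" test to proximity up to a given degree, which is where the different concrete algorithms arise.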

14:00 | PRESENTER: Jagadish Bapanapally ABSTRACT. This paper presents a formal proof of the Banach-Tarski theorem in ACL2(r). The Banach-Tarski theorem states that a unit ball can be partitioned into a finite number of pieces that can be rotated to form two identical copies of the ball. We have formalized 3D rotations and generated a free group of 3D rotations of rank 2. The non-denumerability of the reals has been proved in ACL2(r) and the Axiom of Choice with the strengthen option has been introduced in ACL2 version 3.1. Using the free group of rotations, the Axiom of Choice, and the proof of the non-denumerability of reals, first we show that the unit sphere can be decomposed into two sets, each equivalent to the original sphere. Then we show that the unit ball except for the origin can be decomposed into two sets each equivalent to the original ball by mapping points of the unit ball to the points on the sphere. Finally, we handle the origin by rotating it around an axis such that the origin falls inside the sphere thus creating the two copies of the unit ball. |

14:30 | Formalizing the ring of adèles of a global field ABSTRACT. The ring of adèles of a global field and its group of units, the group of idèles, are fundamental objects in modern number theory. We discuss a formalization of their definitions in the Lean 3 theorem prover. As a prerequisite, we formalized adic valuations on Dedekind domains. We present some applications, including the statement of the main theorem of global class field theory and a proof that the ideal class group of a number field is isomorphic to an explicit quotient of its idèle class group. |

15:00 | Proof Pearl: Formalizing Spreads and Packings of the Smallest Projective Space PG(3,2) using the Coq Proof Assistant ABSTRACT. We formally implement the smallest three-dimensional projective space PG(3,2) in the Coq proof assistant. This projective space features 15 points and 35 lines, related by an incidence relation. We define points and lines as two plain datatypes (one with 15 constructors for points, and one with 35 constructors for lines) and the incidence relation as a boolean function, instead of using the well-known coordinate-based approach relying on GF(2)^4. We prove that this implementation actually satisfies all the usual properties of three-dimensional projective spaces. We then use an oracle to compute some characteristic subsets of objects of PG(3,2), namely spreads and packings. We formally verify that these computed objects exactly correspond to the spreads and packings of PG(3,2). For spreads, this means identifying 56 specific sets of 5 lines among 360360 (= 15 * 14 * 13 * 12 * 11) possible ones. We then classify them, showing that the 56 spreads of PG(3,2) are all isomorphic whereas the 240 packings of PG(3,2) can be classified into two distinct classes of 120 elements. Proving these results requires partially automating the generation of some large specification files as well as some even larger proof scripts. Overall, this work can be viewed as an example of a large-scale combination of interactive and automated specifications and proofs. It is also a first step towards formalizing projective spaces of higher dimension, e.g. PG(4,2), or larger order, e.g. PG(3,3). |
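The spread count above is small enough to reproduce by brute force from the coordinate-based GF(2)^4 model that the paper deliberately avoids; this unverified sketch builds the space from nonzero 4-bit vectors and enumerates the spreads:

```python
from itertools import combinations

# Points of PG(3,2): the 15 nonzero vectors of GF(2)^4, encoded as ints 1..15.
points = range(1, 16)

# Each line is the nonzero part of a 2-dimensional subspace: {a, b, a XOR b}.
lines = {frozenset({a, b, a ^ b}) for a, b in combinations(points, 2)}

# A spread is a set of 5 pairwise disjoint lines; since 5 * 3 = 15, the
# lines are pairwise disjoint exactly when their union is all 15 points.
spreads = [s for s in combinations(lines, 5)
           if len(frozenset().union(*s)) == 15]
```

Running this yields 35 lines and 56 spreads, matching the counts formally verified in the paper.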

14:00 | Modelling Agents Roles in the Epistemic Logic L-DINF PRESENTER: Stefania Costantini ABSTRACT. In this paper, we further advance a line of work aimed at formally modelling via epistemic logic (aspects of) the group dynamics of cooperative agents. In fact, we have previously proposed and here extend a particular logical framework (the Logic of "Inferable" L-DINF), where a group of cooperative agents can jointly perform actions. That is, at least one agent of the group can perform the action, either with the approval of the group or on behalf of the group. So far, we have been able to take into consideration actions' cost and the preferences that each agent can have for what concerns performing each action. In this paper, we introduce agents' roles within a group. We choose to model roles in terms of the actions that each agent is enabled by its group to perform. We extend the semantics and the proof of strong completeness of our logic, and we show the usefulness of the new extension via a significant example. |

14:30 | ABSTRACT. This paper proposes an integration of the situation calculus with justification logic. Justification logic can be seen as a refinement of a modal logic of knowledge and belief to one in which knowledge not only holds in all possible worlds, but is also justified. The work extends Scherl and Levesque’s integration of the situation calculus with a modal logic of knowledge. We show that the solution developed here retains all of the desirable properties of the earlier solution while incorporating the enhanced expressiveness of having justifications. |

15:00 | PRESENTER: Hisashi Hayashi ABSTRACT. Data transfer among servers is crucial for distributed data mining because many databases are distributed around the world. However, as data privacy is becoming more legally and ethically protected, it is necessary to abide by the laws and respect the ethical guidelines when transferring and utilizing data. Because information affecting legal/ethical decision making is often distributed, the data-transfer plan must be updated online when new information is obtained while transferring data among servers. In this study, we propose a dynamic hierarchical task network (HTN) planning method that considers legal and ethical norms while planning multihop data transfers and data analyses/transformations. In our knowledge representation, we show that data-transfer tasks can be represented by the task-decomposition rules of total-order HTN planning, legal norms can be expressed as the preconditions of tasks and actions, and ethical norms can be expressed as the costs of tasks and actions. In the middle of the plan execution, the online planner dynamically updates the plan based on new information obtained in accordance with laws and ethical guidelines. |
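The knowledge representation described above (preconditions for legal norms, costs for ethical norms) can be illustrated with a minimal total-order HTN sketch; the data-transfer domain, task names, and state encoding are invented for illustration and are not the paper's system:

```python
def htn_plan(tasks, state, methods, actions, cost=0):
    """Total-order HTN decomposition with backtracking.
    actions: name -> (precondition, effect, cost); preconditions model
    legal norms, costs model ethical norms.
    methods: compound task name -> list of (precondition, subtask list)."""
    if not tasks:
        return [], cost
    head, *rest = tasks
    if head in actions:  # primitive task
        pre, effect, c = actions[head]
        if pre(state):
            result = htn_plan(rest, effect(state), methods, actions, cost + c)
            if result is not None:
                plan, total = result
                return [head] + plan, total
        return None
    for pre, subtasks in methods.get(head, []):  # compound task: try each method
        if pre(state):
            result = htn_plan(subtasks + rest, state, methods, actions, cost)
            if result is not None:
                return result
    return None

# Invented multihop domain: move data from A to C; the direct hop is
# cheaper but only legal when permitted in the current state.
actions = {
    "send_A_B": (lambda s: s["at"] == "A" and s["A_B_legal"],
                 lambda s: {**s, "at": "B"}, 1),
    "send_B_C": (lambda s: s["at"] == "B", lambda s: {**s, "at": "C"}, 1),
    "send_A_C": (lambda s: s["at"] == "A" and s["A_C_legal"],
                 lambda s: {**s, "at": "C"}, 3),
}
methods = {"deliver": [(lambda s: True, ["send_A_C"]),            # direct hop
                       (lambda s: True, ["send_A_B", "send_B_C"])]}  # via relay B
state = {"at": "A", "A_C_legal": False, "A_B_legal": True}
plan, total = htn_plan(["deliver"], state, methods, actions)
# The direct hop is illegal here, so the planner backtracks to the relay route.
```

Online replanning, as in the abstract, would amount to re-running the decomposition from the current state whenever new legal or ethical information arrives mid-execution.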

The FLoC Olympic Games 2022 follows the successful edition started in 2014 in conjunction with FLoC, in the spirit of the ancient Olympic Games. Every four years, as part of the Federated Logic Conference, the Games gather all the challenging disciplines from a variety of fields of computational logic in the form of solver competitions.

At this Olympic Games Session, the competition organizers of the following competitions will present their competitions to the public and give away special prizes to their successful competitors: SYNTComp, CASC-J11, termCOMP, SMT Competition, SL Competition.

17:30-17:45 - As part of the business meeting, we will have a 15-minute presentation by Pavithra Prabhakar.

__Title__: Formal Methods and Verification Programs and International Partnerships at NSF.

The talk will provide a brief overview of the funding opportunities at the US National Science Foundation related to the area of formal methods and computer aided verification. International partnerships and opportunities for participation of the broader scientific community will be highlighted.