LPAR-20: 20TH INTERNATIONAL CONFERENCE ON LOGIC FOR PROGRAMMING, ARTIFICIAL INTELLIGENCE AND REASONING
Accepted Papers with Abstracts
Petr Cintula (Institute of Computer Science, Czech Academy of Sciences)
Denisa Diaconescu (Mathematical Institute, University of Bern)
George Metcalfe (Mathematical Institute, University of Bern)
Skolemization for Substructural Logics

ABSTRACT. The usual Skolemization procedure, which removes strong quantifiers by introducing new function symbols, is in general unsound for first-order substructural logics defined over classes of complete residuated lattices. However, it is shown here (following similar ideas of Baaz and Iemhoff for first-order intermediate logics) that first-order substructural logics with a semantics satisfying certain witnessing conditions admit a "parallel" Skolemization procedure where a strong quantifier is removed by introducing a finite disjunction or conjunction (as appropriate) of formulas with multiple new function symbols. These logics typically lack equivalent prenex forms, and semantic consequence does not in general reduce to satisfiability. The Skolemization theorems presented here therefore take various forms, applying to the left or right of the consequence relation, and to all formulas or only to prenex formulas.
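
To illustrate the difference schematically (notation ours, not taken from the paper): standard Skolemization removes a strong quantifier with a single new function symbol, whereas the parallel variant introduces several new function symbols and joins the resulting instances by a disjunction (or conjunction, as appropriate):

    \forall x\,\exists y\,\varphi(x,y) \;\rightsquigarrow\; \forall x\,\varphi(x,f(x))                                              (standard)
    \forall x\,\exists y\,\varphi(x,y) \;\rightsquigarrow\; \forall x\,\bigl(\varphi(x,f_1(x)) \lor \cdots \lor \varphi(x,f_n(x))\bigr)   (parallel)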

Miika Hannula (University of Helsinki)
Reasoning about embedded dependencies using inclusion dependencies

ABSTRACT. The implication problem for the class of embedded dependencies is undecidable. However, this does not imply the absence of a proof procedure, as exemplified by the chase algorithm. In this paper we present a complete axiomatization of embedded dependencies that is based on the chase and uses inclusion dependencies and implicit existential quantification in the intermediate steps of deductions.

Miroslav Klimos (Masaryk University)
Antonin Kucera (Masaryk University)
Cobra: A Tool for Solving General Deductive Games

ABSTRACT. We propose a general framework for modelling and solving deductive games, where one player selects a secret code and the other player strives to discover this code using a minimal number of allowed experiments that reveal some partial information about the code. The framework is implemented in the software tool Cobra, and its functionality is demonstrated by producing new results about existing deductive games.

Eric Gregoire (CRIL CNRS UMR 8188)
Jean Marie Lagniez (CRIL)
On Anti-Subsumptive Knowledge Enforcement

ABSTRACT. The anti-subsumptive enforcement of a clause δ in a set of clauses ∆ consists in extracting one cardinality-maximal satisfiable subset ∆0 of ∆ ∪ {δ} that contains δ but that does not strictly subsume δ. In this paper, the computational issues of this problem are investigated in the Boolean framework. In particular, the minimal change policy that requires a minimal number of clauses to be dropped from ∆ can lead to an exponential computational blow-up. Indeed, a direct and natural approach to anti-subsumptive enforcement requires the computation of all inclusion-maximal subsets of ∆ ∪ {δ} that, at the same time, contain δ and are satisfiable with ¬δj where δj is some strict sub-clause of δ. In contrast, we propose a method that avoids the computation of this possibly exponential number of subsets of clauses. Interestingly, it requires only a single call to a Partial-Max-SAT procedure and appears tractable in many realistic situations, even for very large ∆. Moreover, the approach is easily extended to take into account a preference pre-ordering between formulas and lays the foundations for the practical enumeration of all optimal solutions to the problem of making δ subsumption-free in ∆ under a minimal change policy.
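
A minimal worked instance of these notions (our example, not from the paper), written in the abstract's notation:

    \Delta = \{p\}, \qquad \delta = p \lor q, \qquad \Delta \cup \{\delta\} \models p \ \ (\text{a strict sub-clause of } \delta),

so δ is strictly subsumed, and the unique cardinality-maximal satisfiable subset containing δ without strictly subsuming it is {p ∨ q}: the clause p has to be dropped.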

Luciano Bello (Chalmers University of Technology)
Daniel Hedin (Mälardalen University)
Andrei Sabelfeld (Chalmers University of Technology)
Value Sensitivity and Observable Abstract Values for Information Flow Control

ABSTRACT. Much progress has recently been made on information flow control, enabling the enforcement of increasingly rich policies for increasingly expressive programming languages. This has resulted in tools for mainstream programming languages such as JavaScript, Java, Caml, and Ada that enforce versatile security policies. However, a roadblock on the way to wider adoption of these tools has been their limited permissiveness (high false-positive rates). Flow-, context-, and object-sensitive techniques have been suggested to improve the precision of static information flow control, and dynamic monitors have been explored to leverage knowledge about the current run for precision.

This paper explores value sensitivity to boost the permissiveness of information flow control. We show that both dynamic and hybrid information flow mechanisms benefit from value sensitivity. Further, we introduce the concept of observable abstract values to generalize and leverage the power of value sensitivity to richer programming languages. We demonstrate the usefulness of the approach by comparing it to known disciplines for dealing with information flow in dynamic and hybrid settings.

Souheib Baarir (LRDE/EPITA)
Alexandre Duret-Lutz (LRDE/EPITA)
SAT-based Minimization of Deterministic ω-Automata

ABSTRACT. We describe a tool that inputs a deterministic ω-automaton with an arbitrary acceptance condition, and synthesizes an equivalent ω-automaton with another arbitrary acceptance condition and a given number of states, if such an automaton exists. This tool, which relies on a SAT-based encoding of the problem, can be used to provide minimal ω-automata equivalent to given properties, for different acceptance conditions.

Cezary Kaliszyk (University of Innsbruck)
Josef Urban (Radboud University)
FEMaLeCoP: Fairly Efficient Machine Learning Connection Prover

ABSTRACT. FEMaLeCoP is a connection tableau theorem prover based on leanCoP which uses an efficient implementation of internal learning-based guidance for extension steps. Despite the fact that exhaustive use of such internal guidance incurs a significant slowdown of the raw inferencing process, FEMaLeCoP trained on related proofs can prove many problems that cannot be solved by leanCoP. In particular, FEMaLeCoP adds 90 (15.7%) more MPTP2078 benchmark problems to the 574 problems that are provable by leanCoP. FEMaLeCoP is thus the first system that practically demonstrates that smart internal learning-based guidance can significantly improve the performance of current automated theorem provers running without such sophisticated guiding methods. This paper describes the system, discusses the technology developed, and evaluates the system.

Gilles Dowek (INRIA)
Ying Jiang (State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences)
Decidability, Introduction Rules and Automata

ABSTRACT. We present a method to prove uniformly the decidability of provability in several well-known inference systems. This method generalizes both cut-elimination and the construction of an automaton recognizing the provable propositions.

Adi Sosnovich (Computer Science Department, Technion, Haifa, Israel)
Orna Grumberg (Computer Science Department, Technion, Haifa, Israel)
Gabi Nakibly (National Research and Simulation Center, Rafael, Haifa, Israel)
Analyzing Internet Routing Security Using Model Checking

ABSTRACT. The goal of this work is to enhance Internet security by applying formal analysis of traffic attraction attacks on the BGP routing protocol. BGP is the sole protocol used throughout the Internet for inter-domain routing, hence its importance. In attraction attacks, an attacker sends false routing advertisements to attract extra traffic in order to increase its revenue from customers or to drop, tamper with, or snoop on the packets. Such attacks are most common in inter-domain routing. We use model checking to perform an exhaustive search for attraction attacks on BGP. This requires substantial reductions due to scalability issues of the entire Internet topology. Therefore, we propose static methods to identify and automatically reduce Internet fragments of interest, prior to using model checking. We developed a method, called BGP-SA, for BGP Security Analysis, which extracts and reduces fragments from the Internet. In order to apply model checking, we model the BGP protocol and also model an attacker with predefined capabilities. Our specifications allow us to reveal different types of attraction attacks. Using a model checking tool, we identify attacks as well as show that certain attraction scenarios are impossible on the Internet under the modeled attacker capabilities.

Alberto Lovato (Università degli studi di Verona)
Damiano Macedonio (Julia Srl, Verona, Italy)
Ciprian Spiridon (Julia Srl, Verona, Italy)
Fausto Spoto (Dipartimento di Informatica, Verona)
Michael D. Ernst (University of Washington)
Boolean Formulas for the Static Identification of Injection Attacks in Java

ABSTRACT. The most dangerous security-related software errors, according to CWE 2011, are those leading to injection attacks --- user-provided data that result in undesired database access and updates (SQL-injections), dynamic generation of web pages (cross-site scripting-injections), redirection to user-specified web pages (redirect-injections), execution of OS commands (command-injections), class loading of user-specified classes (reflection-injections), and many others. This paper describes a flow- and context-sensitive static analysis that automatically identifies if and where injections of tainted data can occur in a program. The analysis models explicit flows of tainted data. Its notion of taintedness applies also to reference (non-primitive) types dynamically allocated in the heap, and is object-sensitive and field-sensitive. The analysis works by translating the program into Boolean formulas that model all possible flows. We implemented it within the Julia analyzer for Java and Android. Julia found injection security vulnerabilities in the Internet banking service and in the customer relationship management of a large Italian bank.

An adequate compositional encoding of bigraph structure in linear logic with subexponentials

ABSTRACT. In linear logic, formulas can be split into two sets: classical (those that can be used as many times as necessary) and linear (those that are consumed and no longer available after being used). Subexponentials generalize this notion by allowing the formulas to be split into many sets, each of which can then be specified to be classical or linear. This flexibility increases the expressiveness of the logic: there already exist adequate encodings, in linear logic with subexponentials (SEL), of a number of other proof systems and of computational models such as concurrent constraint programming. Bigraphs were proposed by Milner in 2001 as a model for ubiquitous computing, subsuming models of computation such as CCS and the π-calculus and capable of modeling connectivity and locality at the same time. In this work we present an encoding of the bigraph structure in SEL, thus giving an indication of the expressive power of this logic, and at the same time providing a framework for reasoning and operating on bigraphs. Our encoding is adequate and therefore the operations of composition and juxtaposition can be performed on the logical level. Moreover, all the proof-theoretical tools of SEL become available for querying and proving properties of bigraph structures.

Vojtech Forejt (Oxford University)
Jan Krcal (Saarland University)
Jan Kretinsky (IST Austria)
Controller synthesis for MDPs and Frequency LTL\GU

ABSTRACT. Quantitative extensions of temporal logics have recently attracted significant attention. In this work, we study frequency LTL (fLTL), an extension of LTL which allows one to speak about frequencies of events along an execution. Such an extension is particularly useful for probabilistic systems that often cannot fulfil strict qualitative guarantees on their behaviour. It has recently been shown that controller synthesis for Markov decision processes and fLTL is decidable when all the bounds on frequencies are 1. As a step towards a complete quantitative solution, we show that the problem is decidable for the fragment fLTL\GU, where U does not occur in the scope of G (but F still can). Our solution is based on a novel translation of such quantitative formulae into equivalent deterministic automata.
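
For intuition (our rendering; the paper's concrete syntax may differ), a frequency-G operator G^{≥p} asserts that the long-run frequency of positions satisfying its argument is at least p. The fragment fLTL\GU then admits, for instance,

    G^{\geq 0.9}\, F\, \mathit{goal}        (F may occur under a frequency-G)

but excludes G^{\geq 0.9}\,(\mathit{req}\, U\, \mathit{grant}), since there U occurs in the scope of G.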

Uwe Egly (Vienna University of Technology)
Florian Lonsing (Vienna University of Technology)
Johannes Oetsch (Vienna University of Technology)
Automated Benchmarking of Incremental SAT and QBF Solvers

ABSTRACT. Incremental SAT and QBF solving potentially yields improvements when sequences of related formulas are solved. An application program that employs an incremental approach to a problem is usually tailored towards some specific solver and decomposes the problem into several incremental solver calls generated directly within the application. This hinders the independent comparison of different incremental solvers, particularly when the application program is not available. To remedy this situation, we present an approach to automated benchmarking of incremental SAT and QBF solvers. Given a collection of formulas in (Q)DIMACS format generated incrementally by an application program, our approach automatically translates the formulas into instructions for importing and solving a formula with an incremental SAT/QBF solver. The result of the translation is a program which replays the incremental solver calls defined by the formulas in the collection. The program can be configured to include any SAT/QBF solver which implements a minimal API similar to IPASIR, which has recently been proposed for the Incremental Library Track of the SAT Race 2015. Our approach allows incremental solvers to be evaluated independently of the application program that was used to generate the formulas. We illustrate our approach on different hardware verification problems for SAT and QBF solvers.
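
To make the replay idea concrete, the following is a minimal hand-written sketch in C (not the tool's generated code; the clauses and assumptions are invented for illustration) of a program that replays two incremental solver calls through the IPASIR interface:

    #include <stdio.h>
    #include "ipasir.h"   /* incremental SAT interface proposed for the SAT Race 2015 */

    int main(void) {
        void *s = ipasir_init();

        /* Call 1: add clauses (1 v 2) and (-1 v 2), then solve under the assumption -2. */
        ipasir_add(s, 1);  ipasir_add(s, 2); ipasir_add(s, 0);
        ipasir_add(s, -1); ipasir_add(s, 2); ipasir_add(s, 0);
        ipasir_assume(s, -2);
        printf("call 1: %s\n", ipasir_solve(s) == 10 ? "SAT" : "UNSAT");  /* 10 = SAT, 20 = UNSAT */

        /* Call 2: assumptions do not persist across calls; the clauses alone are satisfiable. */
        printf("call 2: %s\n", ipasir_solve(s) == 10 ? "SAT" : "UNSAT");

        ipasir_release(s);
        return 0;
    }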

Shoshin Nomura (Japan Advanced Institute of Science and Technology)
Katsuhiko Sano (Japan Advanced Institute of Science and Technology)
Satoshi Tojo (Japan Advanced Institute of Science and Technology)
A Labelled Sequent Calculus for Intuitionistic Public Announcement Logic

ABSTRACT. Intuitionistic Public Announcement Logic (IntPAL), proposed by Ma et al. (2014), aims at a formal expression of change of knowledge in a constructive manner. IntPAL is a combination of Public Announcement Logic (PAL) by Plaza (1989) and the intuitionistic modal logic IK by Fischer Servi (1984) and Simpson (1994); we also take IK as the basis of this paper. Meanwhile, Nomura et al. (2015) provided a cut-free labelled sequent calculus for PAL, called GPAL, based on the study of Maffezioli et al. (2010). In this paper, we introduce a labelled sequent calculus for IntPAL (we call it GIntPAL) as both an intuitionistic variant of GPAL and a public announcement extension of Simpson's labelled calculus, and show that all theorems of the Hilbert axiomatization of IntPAL are also provable in GIntPAL with the cut rule. We then prove the admissibility of the cut rule in GIntPAL and also its soundness for birelational Kripke semantics. Finally, we derive the semantic completeness of GIntPAL as a corollary of these theorems.

Patrick Baillot (CNRS and ENS Lyon)
Gilles Barthe (IMDEA Software Institute)
Ugo Dal Lago (Università di Bologna)
Implicit Computational Complexity of Subrecursive Definitions and Applications to Cryptographic Proofs

ABSTRACT. We define a call-by-value variant of Gödel's System T with references, and equip it with a linear dependent type and effect system, called dlT, which can estimate the complexity of programs as a function of the size of their inputs. We prove that the type system is intensionally sound, by showing that it over-approximates the complexity of executing programs on a variant of the CEK abstract machine. Moreover, we define a sound and complete type inference algorithm which critically exploits the subrecursive nature of dlT. Finally, we demonstrate the usefulness of dlT for analyzing the complexity of cryptographic reductions by providing an upper bound for the constructed adversary of the Goldreich-Levin theorem.

Dan Rosén (Chalmers)
Nicholas Smallbone (Chalmers University of Technology)
TIP: Tools for Inductive Provers [tool paper]

ABSTRACT. TIP is a toolbox for users and developers of inductive provers. It consists of a large number of tools which can, for example, simplify an inductive problem, monomorphise it or find counterexamples to it. We are using TIP to help maintain a set of benchmarks for inductive theorem provers, where its main job is to encode features that are not natively supported by the respective provers. TIP makes it easier to write inductive provers, by supplying necessary tools such as lemma discovery which prover authors can simply import into their own prover.

Franck Cassez (Macquarie University)
Frowin Ziegler (Universität Augsburg)
Verification of Concurrent Programs Using Trace Abstraction Refinement

ABSTRACT. Verifying concurrent programs is notoriously hard due to the state explosion problem: 1) the data state space can be very large, as the variables can range over very large sets, and 2) the control state space is the Cartesian product of the control state spaces of the concurrent components and thus grows exponentially in the number of components. On the one hand, the most successful approaches to addressing the control state explosion problem are based on assume-guarantee reasoning or on model checking coupled with partial order reduction. On the other hand, the most successful techniques to address the data space explosion problem for sequential program verification are based on the abstraction/refinement paradigm, which consists in refining an abstract over-approximation of a program via predicate refinement. In this paper, we show how to combine partial order reduction techniques with trace abstraction refinement. We apply our approach to standard benchmarks and show that it matches current state-of-the-art analysis techniques.

Yuya Uezato (Graduate School of Systems and Information Engineering, University of Tsukuba)
Yasuhiko Minamide (Department of Mathematical and Computing Sciences, Tokyo Institute of Technology)
Synchronized Recursive Timed Automata

ABSTRACT. We present synchronized recursive timed automata (SRTA) that extend timed automata with a stack. Each frame of the stack consists of rational-valued clocks, and SRTA synchronously increase the values of all the clocks within the stack. Our main contribution is to show that the reachability problem of SRTA is ExpTime-complete. This decidability contrasts with the undecidability for recursive timed automata (RTA) introduced by Trivedi and Wojtczak, and Benerecetti et al. Unlike in SRTA, in RTA the frames below the top are frozen during the computation at the top frame.

Our decidability proof extends the region abstraction for dense timed pushdown automata (TPDA) of Abdulla et al. to accommodate both the diagonal constraints and the fractional constraints of SRTA. Since SRTA can be seen as an extension of TPDA with diagonal and fractional constraints, our result enlarges the decidable class of pushdown extensions of timed automata.

Dale Miller (INRIA and LIX/Ecole Polytechnique)
Marco Volpe (Inria and LIX/École Polytechnique, France)
Focused labeled proof systems for modal logic

ABSTRACT. Focused proofs are sequent calculus proofs that group inference rules into alternating negative and positive phases. These phases can then be used to define macro-level inference rules from Gentzen's original and tiny introduction and structural rules. We show here that the inference rules of labeled proof systems for modal logics can similarly be described as pairs of such negative and positive phases within the LKF focused proof system (which contains no modal connectives). In particular, we consider the system G3K of Negri for the modal logic K and define a translation from labeled modal formulas into first-order polarized formulas and show a strict correspondence between derivations in the two systems, i.e., each rule application in G3K corresponds to a bipole - a pair of a positive and a negative phase - in LKF. Since geometric axioms (when properly polarized) induce bipoles, this strong correspondence holds for all modal logics whose Kripke frames are characterized by geometric properties. We extend these results to present a focused labeled proof system for this same class of modal logics and show its soundness and completeness. This resulting proof system allows one to define a rich set of normal forms of modal logic proofs.
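
For readers unfamiliar with the setting, the translation from labeled modal formulas to first-order formulas that underlies such encodings is the standard one (shown schematically here, without the polarity annotations in which the paper's contribution lies):

    [[x : \Box A]]     = \forall y\,(R(x,y) \supset [[y : A]])
    [[x : \Diamond A]] = \exists y\,(R(x,y) \land [[y : A]])

Choosing polarities for R(x,y), the connectives, and the quantifiers is what yields the correspondence between G3K rule applications and bipoles described above.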

Sasha Rubin (UNINA)
Benjamin Aminof (School of Computer Science and Engineering - Hebrew University)
Aniello Murano (Universita' di Napoli "Federico II")
On CTL* with Graded Path Modalities

ABSTRACT. Graded path modalities count the number of paths satisfying a property, and generalize the existential (E) and universal (A) path modalities of CTL*. The resulting logic is denoted GCTL*, and is a very powerful logic since (as we show) it is equivalent, over trees, to monadic path logic. We settle the complexity of the satisfiability problem of GCTL* (TwoExpTime-complete) and the complexity of the model checking problem of GCTL* (PSpace-complete). The lower bounds already hold for CTL*, and so we supply the upper bounds. The significance of this work is two-fold: GCTL* is much more expressive than CTL* as it adds to it a form of quantitative reasoning, and this is done at no extra cost in computational complexity.
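
As an informal example of a graded path modality (notation illustrative), the GCTL* formula

    E^{\geq 2}\, F\, \mathit{error}

asserts the existence of at least two (suitably distinct) paths that eventually reach an error state, strengthening the plain CTL* formula E F error; the usual E is recovered as the grade-1 case.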

Chuck Liang (Hofstra University)
Dale Miller (INRIA and LIX/Ecole Polytechnique)
On Subexponentials, Synthetic Connectives, and Multi-Level Delimited Control

ABSTRACT. We construct a partially-ordered hierarchy of delimited control operators similar to those of the CPS hierarchy of Danvy and Filinski [1990]. However, instead of relying on nested CPS translations, these operators are directly interpreted in linear logic extended with subexponentials (i.e., multiple pairs of ! and ?). We construct an independent proof theory for a fragment of this logic based on the principle of focusing. It is then shown that the new constraints placed on the permutation of cuts correspond to multiple levels of delimited control.

Florian Zuleger (Technische Universität Wien)
Sasha Rubin (UNINA)
Benjamin Aminof (Technische Universität Wien)
On the expressive power of communication primitives in parameterised systems

ABSTRACT. We study foundational problems regarding the expressive power of parameterised systems. These (infinite-state) systems are composed of arbitrarily many finite-state processes that synchronise using a given communication primitive, i.e., broadcast, asynchronous rendezvous, broadcast with message loss, pairwise rendezvous, or disjunctive guards. With each communication primitive we associate the class of parameterised systems that use that primitive. We study the relative expressive power of these classes (can systems in one class be simulated by systems in another?) and provide a complete picture with only a single question left open. Motivated by the question of separating these classes, we also study the absolute expressive power (e.g., is the set of traces of every parameterised system of a given class omega-regular?). Our work gives insight into the verification and synthesis of parameterised systems, including new decidability and undecidability results of the model checking problem of parameterised systems using broadcast with message loss and asynchronous rendezvous.

Alexander Steen (Freie Universität Berlin)
Christoph Benzmüller (Freie Universität Berlin)
There Is No Best Beta-Normalization Strategy for Higher-Order Reasoners

ABSTRACT. The choice of data structures for the internal representation of terms in logical frameworks and higher-order theorem provers is a crucial low-level factor for their performance. We propose a representation of terms based on a polymorphically typed nameless spine data structure in conjunction with perfect term sharing and explicit substitutions.

In related systems, the choice of a beta-normalization method is usually statically fixed and cannot be adjusted to the input problem at runtime. The predominant strategies are implementation-specific adaptations of leftmost-outermost normalization. We introduce several different beta-normalization strategies and empirically evaluate their performance by measuring reduction steps on about 7000 heterogeneous problems from different (TPTP) domains.

Our study shows that there is no generally best beta-normalization strategy and that for different problem domains, different winning strategies can be identified. The evaluation results suggest a problem-dependent choice of a preferred beta-normalization strategy for higher-order reasoning systems. To that end, we present first preliminary approaches for strategy-selection heuristics.

Liana Hadarean (New York University)
Clark Barrett (New York University)
Morgan Deters (New York University)
Cesare Tinelli (The University of Iowa)
Andrew Reynolds (University of Iowa)
Fine grained SMT proofs for the theory of fixed-width bit-vectors

ABSTRACT. Many high-level verification tools rely on SMT solvers to efficiently discharge complex verification conditions. Some applications require more than just a yes/no answer from the solver. For satisfiable problems, a satisfying model is a natural artefact. In the unsatisfiable case, an externally checkable proof can serve as a certificate of correctness and can be mined to gain additional insight into the problem.

In this paper we present a method of encoding and checking SMT-generated proofs for the theory of fixed-width bit-vectors. Proof generation and checking for the bit-vector theory poses additional challenges compared to other theories: reducing the problem to propositional logic can result in large resolution proofs and also requires a proof that the reduction itself is correct. We describe a fine-grained proof system formalized in the LFSC framework that addresses some of these challenges by the use of computational side-conditions. We instrument the CVC4 SMT solver to automatically generate such proofs and evaluate the performance impact of proof generation.

Arlen Cox (University of Colorado at Boulder)
Bor-Yuh Evan Chang (University of Colorado at Boulder)
Huisong Li (INRIA / ENS Paris)
Xavier Rival (INRIA / ENS Paris)
Abstract Domains and Solvers for Sets Reasoning

ABSTRACT. When constructing complex program analyses, it is often useful to reason about not just individual values, but collections of values. Symbolic set abstractions provide such a building block that can be used to construct partitions of elements, relate partitions to other partitions, and determine the provenance of multiple values, all without knowing any concrete values. To address the simultaneous challenges of scalability and precision, we formalize and implement an interface for symbolic set abstractions and construct multiple abstractions relying on both specialized data structures and off-the-shelf theorem provers. Additionally, we develop techniques for lifting existing domains to improve performance and precision. We evaluate these new domains on real-world data structure analysis problems.

Thibault Gauthier (Institute of Computer Science University of Innsbruck)
Cezary Kaliszyk (University of Innsbruck)
Sharing HOL4 and HOL Light proof knowledge

ABSTRACT. New proof assistant developments often involve concepts similar to already formalized ones. When proving their properties, a human can often take inspiration from the existing formalized proofs available in other provers or libraries. In this paper we propose and evaluate a number of methods, which strengthen proof automation by learning from proof libraries of different provers. Certain conjectures can be proved directly from the dependencies induced by similar proofs in the other library. Even if exact correspondences are not found, learning-reasoning systems can make use of the association between proved theorems and their characteristics to predict the relevant premises. Such external help can be further combined with internal advice. We evaluate the proposed knowledge-sharing methods by reproving the HOL Light and HOL4 standard libraries. The learning-reasoning system HOL(y)Hammer, whose single best strategy could automatically find proofs for 30% of the HOL Light problems, can prove 40% with the knowledge from HOL4.

Gilles Barthe (IMDEA Software Institute)
Thomas Espitau (ENS Cachan)
Benjamin Gregoire (INRIA Sophia Antipolis)
Justin Hsu (University of Pennsylvania)
Léo Stefanesco (ENS Lyon)
Pierre-Yves Strub (IMDEA Software Institute)
Relational reasoning via probabilistic coupling

ABSTRACT. Probabilistic coupling is a powerful tool for analyzing probabilistic processes. Roughly, coupling two processes requires finding an appropriate witness process that characterizes both processes in the same probability space. Applications of coupling include reasoning about convergence of distributions, and stochastic dominance---a probabilistic version of a monotonicity property.

While the mathematical definition of coupling looks rather complex and difficult to manipulate, we show that the relational program logic pRHL---the logic underlying the EasyCrypt cryptographic proof assistant---internalizes a generalization of probabilistic coupling. We demonstrate how to express and verify classic examples of couplings in pRHL, and we mechanically verify several couplings in EasyCrypt.
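
For reference, the underlying mathematical notion (standard, not specific to this paper): a coupling of distributions μ1 over A and μ2 over B is a distribution μ over A × B whose marginals are μ1 and μ2,

    \pi_1(\mu) = \mu_1, \qquad \pi_2(\mu) = \mu_2.

For instance, the identity coupling of two fair coins, placing probability 1/2 on (heads, heads) and 1/2 on (tails, tails), is concentrated on the equality relation and thereby witnesses that the two distributions coincide.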

Peter Brottveit Bock (IT University of Copenhagen)
Carsten Schürmann (IT University of Copenhagen)
A Contextual Logical Framework

ABSTRACT. A new logical framework with explicit linear contexts and names is presented, with the purpose of enabling direct and flexible manipulation of contexts, both for representing systems and meta-properties. The framework is a conservative extension of the logical framework LF, and builds on linear logic and contextual modal type theory. We prove that the framework admits canonical forms, and that it possesses all desirable meta-theoretic properties, in particular hereditary substitutions.

As proof of concept, we give an encoding of the one-sided sequent calculus for classical linear logic and the corresponding cut-admissibility proof, as well as an encoding of parallel reduction of lambda terms with the corresponding value-soundness proof.

Florian Lonsing (Vienna University of Technology)
Fahiem Bacchus (University of Toronto)
Armin Biere (Johannes Kepler University)
Uwe Egly (Vienna University of Technology)
Martina Seidl (Johannes Kepler University Linz)
Enhancing Search-Based QBF Solving by Dynamic Blocked Clause Elimination

ABSTRACT. Preprocessing has been found vital for both search-based QBF solving with clause and cube learning (QCDCL) and solving by variable expansion. Among other QBF preprocessing techniques, blocked clause elimination (QBCE) simulates structural reasoning on formulas in prenex conjunctive normal form (PCNF) and reduces the bias introduced when translating non-PCNF formulas into PCNF. So far, the power of QBCE has not been exploited during solving.

We present the dynamic application of QBCE integrated in QCDCL solving. This dynamic application of QBCE is in sharp contrast to its typical use as a mere preprocessing technique. In our dynamic approach, QBCE is applied eagerly to the PCNF interpreted under the assignments that have currently been enumerated. The tight integration of QBCE in QCDCL results in a variant of cube learning which is exponentially stronger than traditional cube learning. We implemented our approach in the QBF solver DepQBF and ran experiments on instances from the QBF Gallery 2014. On application benchmarks, QCDCL with dynamic QBCE substantially outperforms traditional QCDCL. Moreover, our approach is compatible with incremental solving and can be combined with preprocessing techniques other than QBCE.
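
For background, the notion applied dynamically here is, in its propositional form, the standard one: a clause C is blocked on a literal l ∈ C with respect to a CNF F if every resolvent of C on l with a clause of F is a tautology. For example, in

    F = (a \lor b) \land (\lnot a \lor \lnot b),

the clause (a ∨ b) is blocked on a, because its only resolvent on a is the tautology (b ∨ ¬b). In the QBF setting, QBCE additionally requires, roughly, that the blocking literal be existential and that the tautology arise on a suitably quantified variable; we state this only informally here.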

Simon Robillard (Chalmers University of Technology)
Laura Kovacs (Chalmers University of Technology)
Wolfgang Ahrendt (Chalmers University of Technology)
Reasoning About Loops Using Vampire in KeY

ABSTRACT. We describe symbol elimination and consequence finding in the first-order theorem prover Vampire for automatic generation of quantified invariants, possibly with quantifier alternations, of loops with arrays. Unlike the previous implementation of symbol elimination in Vampire, our work is not limited to a specific programming language but provides a generic framework by relying on a simple guarded command representation of the input loop. We also improve the loop analysis part in Vampire by generating loop properties more easily handled by the saturation engine of Vampire. Our experiments show that, with our changes, the number of generated invariants is decreased, in some cases, by a factor of 20. We also provide a framework to use our approach to invariant generation in conjunction with pre- and post-conditions of program loops. We use the program specification to find relevant invariants as well as to verify the partial correctness of the loop. As a case study, we demonstrate how symbol elimination in Vampire can be used as an interface for realistic imperative languages, by integrating our tool in the KeY verification system, thus allowing reasoning about loops in Java programs in a fully automated way, without any user guidance.

Marijn Heule (The University of Texas at Austin)
Armin Biere (Johannes Kepler University)
Compositional Propositional Proofs

ABSTRACT. Many hard combinatorial problems can only be tackled by SAT solvers in a massively parallel setting. This reduces the trust one can have in the final result, as errors can occur in parallel SAT solvers as well as in methods to partition the original problem. We present a new framework to produce clausal proofs for cube-and-conquer, arguably the most effective parallel SAT solving paradigm for hard combinatorial problems. The framework also provides an elegant approach to parallelizing the validation of clausal proofs efficiently, both in terms of run time and memory usage. We demonstrate the usefulness of the framework on some hard combinatorial problems and validate the constructed clausal proofs in parallel.

Cvetan Dunchev (University of Bologna)
Ferruccio Guidi (Dept. of Computer Science, University of Bologna)
Claudio Sacerdoti Coen (University of Bologna)
Enrico Tassi (INRIA)
A Fast Interpreter for λProlog

ABSTRACT. We present a new interpreter for λProlog that runs consistently faster than the bytecode compiled by Teyjus, which is believed to be the best available implementation of λProlog. The key insight is the identification of a fragment of the language, which we call the Reduction-Free Fragment, that occurs quite naturally in λProlog programs and that admits constant-time reduction and unification rules. The implementation exploits De Bruijn levels and avoids explicit substitutions, whereas Teyjus is based on De Bruijn indexes and explicit substitutions (the suspension calculus).

Olivier Hermant (MINES ParisTech)
Gaetan Gilbert (Inria)
Normalization by Completeness with Heyting Algebras

ABSTRACT. Usual normalization by evaluation techniques have a strong relationship with completeness with respect to Kripke structures. But Kripke structures are not the only semantics that fits intuitionistic logic: Heyting algebras are a more algebraic alternative.

In this paper, we focus on this less investigated area: how completeness with respect to Heyting algebras generates a normalization algorithm for a natural deduction calculus, in the propositional fragment. Our main contributions are a direct proof of completeness of natural deduction with respect to Heyting algebras, an underlying algorithm that natively deals with disjunction, a formalization of those proofs in Coq, and an extracted algorithm.

Cristina David (University of Oxford)
Daniel Kroening (Computer Science Department, University of Oxford)
Matt Lewis (Oxford University)
Using Program Synthesis for Program Analysis

ABSTRACT. In this paper, we identify a fragment of second-order logic with restricted quantification that is expressive enough to capture numerous static analysis problems (e.g. safety proving, bug finding, termination and non-termination proving, superoptimisation). We call this fragment the synthesis fragment. Satisfiability of a formula in the synthesis fragment is decidable over finite domains; specifically, the decision problem is NEXPTIME-complete. If a formula in this fragment is satisfiable, a solution consists of a satisfying assignment from the second-order variables to functions over finite domains. To concretely find these solutions, we synthesise programs that compute the functions. Our program synthesis algorithm is complete for finite state programs, i.e. every function over finite domains is computed by some program that we can synthesise. We can therefore use our synthesiser as a decision procedure for the synthesis fragment of second-order logic, which in turn allows us to use it as a powerful backend for many program analysis tasks. To show the tractability of our approach, we evaluate the program synthesiser on several static analysis problems.
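
To illustrate how a static analysis task instantiates such a fragment (our rendering of the familiar inductive-invariant encoding, not necessarily the paper's exact formulation), safety proving for a transition system with initial states Init, transition relation Trans and error states Bad asks for a second-order witness Inv with

    \exists \mathit{Inv}.\;\forall s, s'.\;
      (\mathit{Init}(s) \rightarrow \mathit{Inv}(s)) \;\land\;
      (\mathit{Inv}(s) \land \mathit{Trans}(s,s') \rightarrow \mathit{Inv}(s')) \;\land\;
      (\mathit{Inv}(s) \rightarrow \lnot \mathit{Bad}(s)),

and over finite domains a synthesiser can then produce a program computing Inv.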

Temesghen Kahsai (NASA Ames / CMU)
Jorge A Navas (NASA Ames Research Center)
Dejan Jovanović (SRI International)
Martin Schaef (SRI International)
Finding Inconsistencies in Programs with Loops

ABSTRACT. Inconsistent code is an important class of program abnormalities that appears in real-world code bases and often reveals serious bugs. Existing approaches to inconsistent code detection scale to programs with millions of lines of code, and have led to patches in applications like Tomcat or the Linux kernel. However, the ability of existing tools to detect inconsistencies is limited by gross over-approximation of looping control-flow. We present a novel approach to inconsistent code detection that can reason about programs with loops without compromising precision. To that end, by leveraging recent advances in Horn clause solving, we demonstrate how to encode the problem as a sequence of Horn clause queries, enabling us to detect inconsistencies that were previously out of reach.

Iliano Cervesato (Carnegie Mellon University)
Edmund Soon Lee Lam (Carnegie Mellon University Qatar)
Modular Multiset Rewriting

ABSTRACT. Rule-based languages are being used for ever more ambitious applications. As program size grows, however, so does the overhead of team-based development, reusing components, and just keeping a large flat collection of rules from interfering. In this paper, we propose a module system for a small logically-motivated rule-based language. The resulting modules are nothing more than rewrite rules of a specific form, which are themselves just logic formulas. Yet, they provide some of the same features found in advanced module systems such as that of Standard ML, in particular name space separation, support for abstract data types, and parametrization (functors in ML). Our modules also offer essential features for concurrent programming such as facilities for sharing private names. This approach is directly applicable to other rule-based languages, including most forward-chaining logic programming languages and many process algebras.

Fiona Berreby (LIP6 - UPMC)
Gauvain Bourgne (LIP6 - UPMC)
Jean-Gabriel Ganascia (LIP6 - UPMC)
Modelling Moral Reasoning and Ethical Responsibility with Logic Programming

ABSTRACT. In this paper, we investigate the use of high-level action languages for representing and reasoning about ethical responsibility in goal specification domains. First, we present a simplified Event Calculus formulated as a logic program under the stable model semantics in order to represent situations within Answer Set Programming. Second, we introduce a model of causality that allows us to use an answer set solver to perform reasoning over the agent’s ethical responsibility. We then extend and test this framework against the Trolley Problem and the Doctrine of Double Effect. The overarching aim of the paper is to propose a general and adaptable formal language that may be employed over a variety of ethical scenarios in which the agent’s responsibility must be examined and their choices determined. Our fundamental motivation rests in the will to displace the burden of moral reasoning from the programmer to the program itself, moving away from current computational ethics that too easily embed moral reasoning within the computational engines, thereby feeding atomic answers that fail to truly represent the underlying dynamics at play.

Cynthia Kop (University of Innsbruck)
Naoki Nishida (Nagoya University)
ConsTrained Rewriting tooL

ABSTRACT. This paper discusses Ctrl, a tool to analyse – both automatically and manually – term rewriting with logical constraints. Ctrl can be used with TRSs on arbitrary underlying logics, and can automatically verify termination, confluence and (to a limited extent) term equivalence using inductive theorem proving. Ctrl also offers a manual mode for inductive theorem proving, allowing support for and verification of "hand-written" term equivalence proofs.

Proof Search in Nested Sequent Calculi

ABSTRACT. One of the emergent trends in Proof Theory nowadays is the quest for proof search in modal logics. In fact, modalities are usually introduced via semantics and/or axioms, hence not lending themselves to efficient implementations. The situation is slightly better for sequent calculi for modal logics. Unfortunately, the construction of suitable sequent calculi from semantic or axiomatic specifications is far from trivial. Moreover, even for some of the most basic modal logics, the currently known sequent calculi lack important properties such as cut elimination or a notion of normal proofs. A major obstacle to establishing these properties is the fact that such calculi usually do not exhibit distinct left and right introduction rules for the modalities. It turns out that this problem can be circumvented by using linear nested systems. The structures considered in this framework are linear sequences of ordinary sequents, and the rules for the modalities control the transfer of formulae from one sequent layer to another. In this work, we propose a general definition of labels for linear nested systems, giving rise to a focusing discipline for a class of modal systems. This opens the possibility of proposing automatic procedures for proof search in modal logics.

Thinh Dong (LIASD, University Paris 8, IUT of Montreuil)
Chan Le Duc (LIASD, University Paris 8, IUT de Montreuil)
Philippe Bonnot (University Paris 8, IUT de Montreuil)
Myriam Lamolle (LIASD-LINC, University Paris 8, IUT of Montreuil)
Tableau-based Revision over SHIQ TBoxes

ABSTRACT. Semantics-based applications commonly encapsulate a set of ontologies which represent knowledge formalized from different data sources. Some of these ontologies may change over time, since not only may the data be updated but our understanding of the application domain may also evolve. To ensure that ontologies remain usable, they need to be revised in a way that takes new knowledge into account and guarantees minimal change. In this paper, we propose an ontology revision approach which allows one to use finite structures to characterize a set of models of an ontology, and to define a total pre-order over these structures. Thus, our revision operation satisfies all revision postulates. Moreover, we propose a procedure for revising an ontology expressed in an expressive description logic, namely SHIQ, and show that the resulting ontology remains expressible in SHIQ.

Daniel Fava (UC Santa Cruz)
Matthieu Lemerre (CEA)
Julien Signoles (CEA)
Martin Schäf (SRI International)
Ashish Tiwari (SRI International)
Gamifying Program Analysis

ABSTRACT. Abstract interpretation is a powerful tool in program verification. Several commercial or industrial-scale implementations of abstract interpretation have demonstrated that this approach can verify safety properties of real-world code. However, using abstract interpretation tools is not always simple. If no user-provided hints are available, the abstract interpretation engine may lose precision during widening and produce an overwhelming number of false alarms. Yet manually providing these hints is time-consuming and often frustrating when re-running the analysis takes a lot of time.

We present an algorithm for program verification that combines abstract interpretation, symbolic execution and crowdsourcing. If verification fails, our procedure suggests likely invariants, or program patches, that provide helpful information to the verification engineer and make it easier to find the correct specification. By complementing machine learning with well-designed games, we enable program analysis to incorporate human insights that help improve its scalability and usability.

Grigory Fedyukovich (USI)
Arie Gurfinkel (Software Engineering Institute, Carnegie Mellon University)
Natasha Sharygina (University of Lugano)
Automated Discovery of Simulation Between Programs

ABSTRACT. The paper presents SimAbs, the first fully automated SMT-based approach to synthesize an abstraction of one program (called the target) that simulates another program (called the source). SimAbs iteratively traverses the search space of existential abstractions of the target and chooses the strongest abstraction among them that simulates the source. Deciding whether a given relation is a simulation relation is reduced to checking validity of Forall-Exists formulas. We present a novel algorithm for dealing with such formulas using an incremental SMT solver. In addition to deciding validity, our algorithm extracts witnessing Skolem relations which further drive simulation synthesis in SimAbs. Our evaluation confirms that SimAbs is able to efficiently discover both simulations and abstractions for C programs from the Software Verification Competition.
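
The Forall-Exists validity queries mentioned above have the familiar shape of a simulation condition (schematic; T_src and T_tgt denote the source's and the abstracted target's transition relations and ρ the candidate relation, names ours):

    \forall s, s', t.\; \bigl(\rho(s,t) \land T_{\mathit{src}}(s,s')\bigr) \rightarrow \exists t'.\; \bigl(T_{\mathit{tgt}}(t,t') \land \rho(s',t')\bigr)

A witnessing Skolem relation maps each (s, s', t) to suitable successors t', which is the information SimAbs extracts and feeds back into the synthesis.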

Koen Claessen (Chalmers University of Technology)
SAT modulo intuitionistic implications

ABSTRACT. We show how to implement a theorem prover for intuitionistic propositional logic by combining a standard SAT solver with a theory for intuitionistic implications. The result is a scalable theorem prover that also produces counterexamples in the form of Kripke models.
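
As an example of the kind of counterexample such a prover can return (standard textbook material, not taken from the paper): Peirce's law ((p → q) → p) → p is classically valid but intuitionistically refutable by a two-world Kripke model with worlds w0 ≤ w1, where p holds only at w1 and q holds nowhere. No world forces p → q, so w0 vacuously forces (p → q) → p, yet w0 does not force p; hence w0 does not force ((p → q) → p) → p.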