LPAR ON THURSDAY, DECEMBER 19TH, 2013
09:00  Where, What, and How? Lessons from the Evolution of E SPEAKER: Stephan Schulz ABSTRACT. E is a theorem prover for first-order logic with equality. It has evolved over the last 15 years from a basic implementation of the superposition calculus into a fully featured theorem prover supporting a rich language, an efficient inference engine, and several levels of manual and automatic control. Most of the changes to the prover have been conservative extensions. This allows us to take a virtual step back in time and evaluate how much influence each feature has had on the power of the system, which may in turn indicate where future research is particularly promising.

10:30  Polarizing Double Negation Translations SPEAKER: Olivier Hermant ABSTRACT. Double-negation translations make it possible to encode and decode classical proofs in intuitionistic logic. We show that, in the cut-free fragment, we can simplify the translations and introduce fewer negations. To achieve this, we consider the polarization of the formulae and adapt the translations to the different connectives and quantifiers. We show that the embedding results still hold, using a customized version of the focused classical sequent calculus. We also prove the latter equivalent to more usual versions of the sequent calculus. This polarization process allows lighter embeddings, and sheds some light on the relationship between intuitionistic and classical connectives. 
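For context, the classical Gödel-Gentzen negative translation, one standard double-negation translation that the polarized translations above refine, can be sketched as:

```latex
\begin{align*}
  P^N &= \neg\neg P \quad (P \text{ atomic}) &
  (A \land B)^N &= A^N \land B^N \\
  (A \to B)^N &= A^N \to B^N &
  (A \lor B)^N &= \neg(\neg A^N \land \neg B^N) \\
  (\forall x\, A)^N &= \forall x\, A^N &
  (\exists x\, A)^N &= \neg \forall x\, \neg A^N
\end{align*}
```

A formula $A$ is classically provable iff $A^N$ is intuitionistically provable; the polarized translations aim to insert fewer of these negations.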
10:50  A Proof of Strong Normalisation of the Typed Atomic Lambda-Calculus SPEAKER: Willem Heijltjes ABSTRACT. The atomic lambda-calculus is a typed lambda-calculus with explicit sharing, which originates in a Curry-Howard interpretation of a deep-inference system for intuitionistic logic. It has been shown that it allows fully lazy sharing to be reproduced in a typed setting. The talk will introduce the calculus in detail, and give an outline of the latest development, a proof of strong normalization of the typed atomic lambda-calculus using Tait's reducibility method. 
11:10  Complexity Analysis in Presence of Control Operators and Higher-Order Functions SPEAKER: Giulio Pellitta ABSTRACT. A polarized version of Girard, Scedrov and Scott's Bounded Linear Logic is introduced and its normalization properties studied. Following Laurent, the logic naturally gives rise to a type system for the λμ-calculus, whose derivations reveal bounds on the time complexity of the underlying term. This is the first example of a type system for the λμ-calculus guaranteeing time complexity bounds for typable programs. 
11:30  Semantic A-Translations and Super-Consistency Entail Classical Cut Elimination SPEAKER: Olivier Hermant ABSTRACT. We show that if a theory R defined by a rewrite system is super-consistent, the classical sequent calculus modulo R enjoys the cut-elimination property, which was an open question. For such theories it was already known that proofs strongly normalize in natural deduction modulo R, and that cut elimination holds in the intuitionistic sequent calculus modulo R. We first define a syntactic and a semantic version of Friedman's A-translation, showing that it preserves the structure of pre-Heyting algebras, our semantic framework. Then we relate the interpretation of a theory in the A-translated algebra to its A-translation in the original algebra. This allows us to show the stability of the super-consistency criterion and the cut-elimination theorem. 
11:50  Effectively Monadic Predicates SPEAKER: Margus Veanes ABSTRACT. Monadic predicates play a prominent role in many decidable cases, including decision procedures for symbolic automata. We are interested in discovering whether a formula can be rewritten into a Boolean combination of monadic predicates. Our setting is quantifier-free formulas over a decidable background theory, such as arithmetic, and we develop a semi-decision procedure for extracting a monadic decomposition of a formula when one exists. 
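What a monadic decomposition looks like can be illustrated with a hand-picked example (this is not the paper's procedure): the binary predicate x·y > 0 is not syntactically monadic, yet it is semantically equivalent to a Boolean combination of predicates over x and y separately.

```python
# The binary predicate phi(x, y) = x*y > 0 ...
def phi(x, y):
    return x * y > 0

# ... has a monadic decomposition: a Boolean combination of the
# one-variable predicates "v > 0" and "v < 0".
def phi_decomposed(x, y):
    pos = lambda v: v > 0   # monadic predicate
    neg = lambda v: v < 0   # monadic predicate
    return (pos(x) and pos(y)) or (neg(x) and neg(y))

# Confirm the equivalence on a grid of sample points.
assert all(phi(x, y) == phi_decomposed(x, y)
           for x in range(-20, 21) for y in range(-20, 21))
```

By contrast, a predicate such as x + y > 0 over the integers admits no such finite decomposition, which is why a (semi-)decision procedure is needed to tell the two situations apart.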
14:00  Incorporating Hypothetical Views and Extended Recursion into SQL Database Systems SPEAKER: Gabriel Aranda ABSTRACT. Current database systems supporting recursive SQL impose restrictions on queries, such as linearity, and do not implement mutual recursion. In previous work we presented the language and prototype RSQL to overcome those drawbacks. Now we introduce a formalization and an implementation of the database system HRSQL that, in addition to extended recursion, incorporates hypothetical reasoning in a novel way which cannot be found in any other SQL system, allowing both positive and negative assumptions. The formalization extends the fixpoint semantics of RSQL. The implementation improves the efficiency of the previous prototype and is integrated into a commercial DBMS. 
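The fixpoint semantics underlying such extended recursion can be sketched with two mutually recursive relations computed by naive iteration; this is a toy model of the semantics, not HRSQL's actual evaluation strategy:

```python
# Toy fixpoint computation of two mutually recursive relations over a
# graph: even(x, y) holds if there is an even-length path from x to y,
# odd(x, y) if there is an odd-length one. Standard recursive SQL cannot
# express this mutual recursion directly.
edges = {(1, 2), (2, 3), (3, 4)}

def step(even, odd):
    new_even = {(x, x) for e in edges for x in e} | \
               {(x, y) for (x, z) in edges for (z2, y) in odd if z == z2}
    new_odd = {(x, y) for (x, z) in edges for (z2, y) in even if z == z2}
    return new_even, new_odd

# Iterate from the empty relations until nothing changes (the operators
# are monotone over a finite domain, so a least fixpoint is reached).
even, odd = set(), set()
while True:
    ne, no = step(even, odd)
    if (ne, no) == (even, odd):
        break
    even, odd = ne, no

assert (1, 3) in even and (1, 2) in odd and (1, 4) in odd
```

A hypothetical assumption in the sense of the abstract would amount to re-running such a fixpoint computation with tuples added to (or removed from) the base relation `edges`.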
14:10  Zenon Modulo: When Achilles Outruns the Tortoise using Deduction Modulo SPEAKER: Pierre Halmagrand ABSTRACT. We propose an extension of the tableau-based first-order automated theorem prover Zenon to deduction modulo. The theory of deduction modulo is an extension of predicate calculus which allows us to rewrite terms as well as propositions, and which is well suited for proof search in axiomatic theories, as it turns axioms into rewrite rules. We also present a heuristic to perform this transformation automatically, and assess our approach with experimental results obtained on the benchmarks of the TPTP library, where the heuristic is able to prove difficult problems, in set theory in particular. Finally, we describe an additional backend for Zenon that outputs proof certificates for Dedukti, a proof checker based on the lambda-Pi-calculus modulo. 
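The axioms-as-rewrite-rules idea can be illustrated with a toy propositional rewriter; the `singleton` notation is illustrative only, not Zenon Modulo's syntax or machinery:

```python
# In deduction modulo, an axiom such as "x ∈ {y} ⟺ x = y" is not kept as
# a premise but turned into a rewrite rule applied during proof search.
# Toy string-level rewriter over a single such rule:
import re

rules = [
    # x in singleton(y)  ⇝  x = y
    (r"(\w+) in singleton\((\w+)\)", r"\1 = \2"),
]

def normalize(prop):
    """Rewrite with the rules until a normal form is reached."""
    changed = True
    while changed:
        changed = False
        for pat, repl in rules:
            new = re.sub(pat, repl, prop)
            if new != prop:
                prop, changed = new, True
    return prop

assert normalize("a in singleton(b)") == "a = b"
```

Proof search then only ever deals with normal forms, so the axiom never clutters the tableau; this is the efficiency gain the title alludes to.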
14:30  Polar: A Framework for Proof Refactoring SPEAKER: Iain Whiteside ABSTRACT. We present a prototype refactoring framework based on graph rewriting and bidirectional transformations that is designed to be generic, extensible, and declarative. Our approach uses a languageindependent graph metamodel to represent proof developments in a generic way. We use graph rewriting to enrich the metamodel with dependency information and to perform refactorings, which are written as declarative rewrite rules. Our framework, called POLAR, is implemented in the GRGEN graph rewriting engine. 
14:50  The Complexity of Clausal Fragments of LTL SPEAKER: Vladislav Ryzhikov ABSTRACT. We introduce and investigate a number of fragments of the propositional temporal logic LTL over the flow of time (Z, <). The fragments are defined in terms of the available temporal operators and the structure of the clausal normal form of the temporal formulas. We determine the computational complexity of the satisfiability problem for each of the fragments, which ranges over NLogSpace, PTime, NP, and PSpace. 
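As a rough illustration (the precise clause shapes are those defined in the paper), clausal normal forms for LTL consist of globally quantified clauses over literals and temporal operators, for example:

```latex
\Box\bigl(\neg p_1 \lor \dots \lor \neg p_k \lor q_1 \lor \dots \lor q_m\bigr),
\qquad
\Box\bigl(\neg p \lor \bigcirc q\bigr),
\qquad
\Box\bigl(\neg p \lor \Diamond q\bigr)
```

A fragment is then obtained by restricting which of the operators $\bigcirc$, $\Diamond$, $\Box$ may occur in clauses and how many literals a clause may contain.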
15:10  A Seligman-style Tableau System SPEAKER: Klaus Frovin Joergensen ABSTRACT. Proof systems for hybrid logic typically use $@$-operators to access information hidden behind modalities; this labeling approach lies at the heart of most resolution, natural deduction, and tableau systems for hybrid logic. But there is another, less well-known approach, which we have come to believe is conceptually clearer. We call this Seligman-style inference, as it was first introduced and explored by Jerry Seligman in the setting of natural deduction and sequent calculus in the late 1990s. The purpose of this paper is to introduce a Seligman-style tableau system. The most obvious feature of Seligman-style systems is that they work with arbitrary formulas, not just formulas prefixed by $@$-operators. To achieve this in a tableau system, we introduce a rule called $\mathsf{GoTo}$ which allows us to ``jump to a named world'' on a tableau branch, thereby creating a local proof context (which we call a \textit{block}) on that branch. To the surprise of at least some of the authors (who have worked extensively on developing the labeling approach), Seligman-style inference is often much clearer. Not only is the approach more modular, but individual proofs are often more direct. We conclude with a discussion of termination and first-order hybrid logic. 
16:00  Dynamic and static symmetry breaking in answer set programming SPEAKER: Belaïd Benhamou ABSTRACT. Much research has been devoted to defining semantics for logic programs. The best known is the stable model semantics, which selects for each program one of its canonical models. The stable models of a logic program are, in a certain sense, the minimal Herbrand models of its program "reducts". On the other hand, the notion of symmetry elimination has been widely used in constraint programming and has been shown to increase the efficiency of the associated solvers. However, symmetry in non-monotonic reasoning is still not well studied in general. For instance, Answer Set Programming (ASP) is a well-known framework, but only a few recent works on symmetry breaking in it are known. Ignoring symmetry breaking can make answer set systems do redundant work and lose efficiency. Here we study the notions of local and global symmetry in the framework of answer set programming. We show how local symmetries of a logic program can be detected dynamically. We also give some properties that allow these symmetries to be eliminated in SAT-based answer set solvers in order to enhance their efficiency. 
16:20  Tracking Data-Flow with Open Closure Types SPEAKER: Jan Hoffmann ABSTRACT. Type systems hide data that is captured by function closures in function types. In most cases this is a beneficial design that enables simplicity and compositionality. However, some applications require explicit information about the data that is captured in closures. This paper introduces open closure types, that is, function types that are decorated with type contexts. They are used to track data-flow from the environment into the function closure. A simply-typed lambda calculus is used to study the properties of the type theory of open closure types. A distinctive feature of this type theory is that an open closure type of a function can vary in different type contexts. To present an application of the type theory, it is shown that a type derivation establishes a simple non-interference property in the sense of information-flow theory. A publicly available prototype implementation of the system can be used to experiment with type derivations for example programs. 
16:40  Characterizing Subset Spaces as Bi-Topological Structures SPEAKER: Bernhard Heinemann ABSTRACT. Subset spaces constitute a relatively new semantics for bimodal logic. This semantics admits, in particular, a modern, computer-science-oriented view of the classic interpretation of the basic modalities in topological spaces à la McKinsey and Tarski. In this paper, we look at the relationship of the two semantics from the opposite perspective, as it were, by asking for a consideration of subset spaces in terms of topology and topological modal logic, respectively. Indeed, we shall finally obtain a corresponding characterization result. A third semantics of modal logic, namely the standard relational one, and the associated first-order structures, will play an important part in doing so as well. 
17:00  An Incremental Algorithm to Optimally Maintain Aggregate Views SPEAKER: Abhijeet Mohapatra ABSTRACT. We propose an algorithm called CReaM to incrementally maintain materialized aggregate views with userdefined aggregates in response to changes to the database tables from which the views are derived. We show that when the physical design of the underlying database is optimized, the time taken by CReaM to update an aggregate view is optimal. 
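The idea of incremental maintenance can be sketched for a built-in SUM aggregate; CReaM itself handles user-defined aggregates, and its optimality claims go well beyond this toy:

```python
# Toy materialized SUM view, maintained incrementally: each insertion or
# deletion on the base table updates the view in O(1) rather than
# recomputing the aggregate from scratch.
class SumView:
    def __init__(self):
        self.totals = {}   # group key -> running sum
        self.counts = {}   # group key -> number of base rows

    def insert(self, key, value):
        self.totals[key] = self.totals.get(key, 0) + value
        self.counts[key] = self.counts.get(key, 0) + 1

    def delete(self, key, value):
        self.totals[key] -= value
        self.counts[key] -= 1
        if self.counts[key] == 0:       # drop groups with no base rows
            del self.totals[key]
            del self.counts[key]

view = SumView()
view.insert("a", 5)
view.insert("a", 3)
view.insert("b", 2)
view.delete("a", 3)
assert view.totals == {"a": 5, "b": 2}
```

Note that SUM is self-maintainable under both insertions and deletions; aggregates such as MIN or MAX are not, which is part of what makes the general user-defined case nontrivial.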
17:10  Three SCC-based Emptiness Checks for Generalized Büchi Automata SPEAKER: Etienne Renault ABSTRACT. The automata-theoretic approach for the verification of linear time properties involves checking the emptiness of a Büchi automaton. However, generalized Büchi automata, with multiple acceptance sets, are preferred when verifying under weak fairness hypotheses. Existing emptiness checks whose complexity is independent of the number of acceptance sets are all based on the enumeration of Strongly Connected Components (SCCs). In this paper, we review state-of-the-art SCC enumeration algorithms to study how they can be turned into emptiness checks. This leads us to define two new emptiness-check algorithms, introduce new optimizations, and show that one of these can benefit a classic SCC enumeration algorithm. We have implemented all these variants to compare their relative performance and the overhead induced by the emptiness check compared to the corresponding SCC enumeration algorithm. 
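A minimal sketch of the core idea, assuming state-based acceptance marks for simplicity (the algorithms in the paper also handle transition-based acceptance): the automaton is nonempty iff some reachable SCC contains a cycle and visits every acceptance mark. Tarjan's classic algorithm enumerates the SCCs:

```python
# Tarjan's SCC enumeration, then an emptiness check for a generalized
# Büchi automaton: acc maps a state to its set of acceptance marks.
def sccs(succ, init):
    index, low, on, stack, out = {}, {}, set(), [], []

    def dfs(v):
        index[v] = low[v] = len(index)
        stack.append(v)
        on.add(v)
        for w in succ.get(v, []):
            if w not in index:
                dfs(w)
                low[v] = min(low[v], low[w])
            elif w in on:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on.discard(w)
                comp.append(w)
                if w == v:
                    break
            out.append(comp)

    dfs(init)
    return out

def nonempty(succ, acc, marks, init):
    for comp in sccs(succ, init):
        # An SCC supports an accepting run iff it contains a cycle and
        # its states together carry every acceptance mark.
        has_cycle = len(comp) > 1 or comp[0] in succ.get(comp[0], [])
        seen = set().union(*(acc.get(q, set()) for q in comp))
        if has_cycle and seen >= marks:
            return True
    return False

succ = {0: [1], 1: [2], 2: [1]}
acc = {1: {"F1"}, 2: {"F2"}}
assert nonempty(succ, acc, {"F1", "F2"}, 0)
assert not nonempty(succ, acc, {"F1", "F2", "F3"}, 0)
```

The point made in the abstract is that this check costs a single SCC enumeration regardless of the number of acceptance sets, whereas acceptance-set-sensitive checks degrade as that number grows.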