19TH INTERNATIONAL CONFERENCE ON LOGIC FOR PROGRAMMING, ARTIFICIAL INTELLIGENCE AND REASONING

PROGRAM FOR SATURDAY, DECEMBER 14TH, 2013

08:30-09:30 Session 1 (ALCS)
08:30
On the Algebraization of Non-finitary Logics
SPEAKER: James Raftery
ABSTRACT.

In this talk, we exhibit a non-finitary sentential logic that is algebraized by a quasivariety---in fact by a finitely based variety of finite type. The algebraization process requires infinitely many defining equations. The existence of such a logic settles questions posed by J. Czelakowski and by B. Herrmann.

09:00
On Deductive Systems Associated with Equationally Orderable Quasivarieties
SPEAKER: Ramon Jansana
ABSTRACT.

We discuss, in general terms, the logics of the order associated with quasivarieties of algebras that have an equationally definable partial order, and present some examples related to BCK-algebras and to Hilbert algebras.

09:30-10:00 Coffee Break
10:00-12:00 Session 2A (ALFA)
10:00
Some Algebraic Means for Characterizing Logics
SPEAKER: Zoltan Esik
ABSTRACT.

We provide an overview of certain algebraic approaches for the characterization of the expressive power of logics on words and trees.

10:30
Software Model Synthesis using Satisfiability Solvers
SPEAKER: Marijn Heule
ABSTRACT.

We introduce a novel approach to software model synthesis based on identifying deterministic finite state automata. Our approach consists of three important contributions. First, we argue that in order to model software, one should focus mainly on observed executions (positive data) and use the randomly generated failures (negative data) only for testing consistency. We present a new greedy heuristic for this purpose and show how to integrate it into the state-of-the-art evidence-driven state-merging (EDSM) algorithm. Second, we apply the enhanced EDSM algorithm to iteratively reduce the size of the problem. However, during each iteration the evidence is divided over states, and hence the effectiveness of this algorithm decreases. We propose, when EDSM becomes too weak, to tackle the reduced identification problem using satisfiability solvers. Third, when the amount of positive data is small, we solve the identification problem several times by randomizing the greedy heuristic and combine the solutions using a voting scheme. The interaction between these contributions proved crucial for solving hard software model synthesis benchmarks. Our implementation, called DFASAT, won the StaMinA competition.

Joint work with Sicco Verwer.
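To make the evidence-driven state-merging idea concrete, the following sketch (illustrative only; the function names and scoring details are simplified, and this is not the DFASAT implementation) builds a prefix-tree acceptor from labelled samples and computes an EDSM-style evidence score for a candidate merge, returning None when the merge would identify an accepting state with a rejecting one.

    def build_apta(positive, negative):
        """Prefix-tree acceptor: states are prefixes of the samples;
        a state is labelled '+' or '-' if some sample ends there."""
        trans, label = {}, {}
        for word, lab in [(w, '+') for w in positive] + [(w, '-') for w in negative]:
            state = ''
            for ch in word:
                trans[(state, ch)] = state + ch
                state += ch
            label[state] = lab
        return trans, label

    def edsm_score(trans, label, p, q):
        """Evidence for merging states p and q: count the pairs of labelled
        states identified by the (determinizing) merge whose labels agree;
        return None if some pair disagrees, i.e. the merge is inconsistent."""
        score, stack = 0, [(p, q)]
        while stack:
            a, b = stack.pop()
            la, lb = label.get(a), label.get(b)
            if la and lb:
                if la != lb:
                    return None
                score += 1
            for ch in {k[1] for k in trans if k[0] in (a, b)}:
                if (a, ch) in trans and (b, ch) in trans:
                    stack.append((trans[(a, ch)], trans[(b, ch)]))
        return score

In the approach described in the abstract, merges with high evidence scores are performed greedily; once the scores become uninformative, the remaining, much smaller identification problem is handed to a satisfiability solver.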

11:00
Strategic Reasoning in Formal Verification
ABSTRACT.

In open systems verification, to formally check for reliability one needs an appropriate formalism to model the interaction between agents and to express the correctness of the system no matter how the environment behaves. Important contributions in this context are given by module checking (also known as model checking of open systems) and by modal logics for strategic ability in the setting of multi-agent games, such as ATL, ATL*, and the like. In this talk, I will first recall and compare these two approaches. Then, I will introduce Strategy Logic (SL), a powerful logical framework for reasoning explicitly about strategies in multi-agent concurrent games, which subsumes both. In particular, I will mainly discuss the model-checking and satisfiability questions for SL. As a key aspect, SL uses first-order quantification over strategies, where strategies are not glued to a specific agent: an explicit binding operator allows one to bind an agent to a strategy variable. Notably, this allows agents to share strategies or to reuse one previously adopted.

The talk is based mainly on joint papers with Fabio Mogavero and Moshe Y. Vardi.

11:30
Automata Reasoning via First-order Superposition
ABSTRACT.

For various classes of automata, the reachability problem can be encoded as an unsatisfiability problem in a corresponding first-order Horn fragment. In this talk I will recapitulate results about this relationship and discuss potential mutual benefits of exploiting it.

10:00-12:00 Session 2B (ALCS)
10:00
The Quest for the Basic Fuzzy Logic
SPEAKER: Petr Cintula
ABSTRACT.

The quest for the basic fuzzy logic was initiated by Petr Hájek when he proposed his basic fuzzy logic BL, complete with respect to the semantics given by all continuous t-norms. Later, weaker systems such as MTL, UL, or psMTL^r, complete with respect to broader (but still meaningful for fuzzy logics) semantics, were introduced and disputed the throne of the basic fuzzy logic. We contribute to the quest with our own proposal of a basic fuzzy logic: we put forth a very weak logic called SLL, introduced and studied in earlier papers of the authors, and propose it as the base of a new framework which allows one to work in a uniform way with both propositional and first-order fuzzy logics.

10:30
Short Communication: Complete MV-Algebra Valued Pavelka Logic
SPEAKER: Esko Turunen
ABSTRACT.

We prove that a Pavelka-style fuzzy logic is semantically complete if, and only if, the set of truth values constitutes a complete MV-algebra. This is done by adding a new rule of inference and two axiom schemata to Pavelka's original approach.
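For orientation, Pavelka-style completeness is the statement that provability degrees and truth degrees coincide: in the usual notation (a sketch of the standard formulation, not necessarily the authors' exact one),

$$ |\varphi|_T \;=\; \sup\{\, r : T \vdash \bar{r} \to \varphi \,\} \qquad \text{equals} \qquad \|\varphi\|_T \;=\; \inf\{\, e(\varphi) : e \text{ a model of } T \,\} $$

for every formula $\varphi$ and theory $T$, where $\bar{r}$ denotes the truth constant for the value $r$. The claim of the abstract is that, with the added rule and axiom schemata, this holds exactly when the algebra of truth values is a complete MV-algebra.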

11:00
Co-Rotation, Co-Rotation-Annihilation, and Involutive Ordinal Sum Constructions of Residuated Semigroups
SPEAKER: Sándor Jenei
ABSTRACT.

Two co-rotation constructions and two co-rotation-annihilation constructions (in both cases a disjoint one and a connected one) will be introduced. In some sense these can be considered 'skew dual' to the (disjoint and connected) rotation constructions. Just as the rotation(-annihilation) of FLe-algebras results in positive-rank involutive FLe-algebras, the co-rotation(-annihilation) of FLe-algebras results in non-positive (zero or negative) rank involutive FLe-algebras, thus providing a wide spectrum of examples of the latter class of algebras. That the co-rotation and co-rotation-annihilation constructions are not simply duals of their rotation and rotation-annihilation counterparts is also reflected by the fact that the classes of algebras which can be used in them are quite different. Also, a construction called involutive ordinal sum will be introduced, and its use will be demonstrated in the structural description of a certain class of involutive FLe-algebras. This construction generalizes the generalized ordinal sums of [N. Galatos, Generalized Ordinal Sums and Translations, Logic Journal of the IGPL, 19 (3), 2011] in the group-like case.

11:30
Discrete Dualities for n-potent MTL and BL-Algebras
ABSTRACT.

Discrete dualities are known for MTL-algebras; the corresponding relational systems consist of partially ordered sets with an additional ternary relation satisfying certain frame conditions. We extend this discrete duality to n-potent MTL-algebras; that is, we give the additional frame conditions needed to characterise the frames of such algebras. We also show how this approach may be applied in the case of 2-potent BL-algebras.

12:00-14:00 Lunch
14:00-15:30 Session 3A (ALFA)
14:00
Growth in Hecke groups
ABSTRACT.

For the Hecke groups we determine and collect a number of growth functions, their generating series and growth rates: numbers of elements of a given length, numbers of subgroups of a certain index, numbers of normal subgroups, numbers of normal subgroups of genus one and numbers of free subgroups. Depending on the type of these growth functions, we calculate the corresponding generating series or the generating Dirichlet series. A particular emphasis is put on explicit formulas representing these series as rational functions or via Euler product decompositions.
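As background (standard facts, recalled only for orientation): the Hecke group $H_q$, $q \ge 3$, is the group of Möbius transformations generated by $S : z \mapsto -1/z$ and $T : z \mapsto z + \lambda_q$ with $\lambda_q = 2\cos(\pi/q)$, and it admits the presentation

$$ H_q \;\cong\; \langle S, T \mid S^2 = (ST)^q = 1 \rangle \;\cong\; \mathbb{Z}_2 * \mathbb{Z}_q, $$

so that $H_3$ is the modular group $\mathrm{PSL}(2,\mathbb{Z})$; the growth functions listed in the abstract count elements and subgroups of these free products.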

14:30
Newton's Method on Commutative Semirings, Tree Dimension, and Applications
ABSTRACT.

Recently we proved a new theorem on the convergence speed of Newton's method on commutative semirings (presented at LATA 2013). In this talk I want to discuss the theorem and the techniques used in its proof, in particular the notion of the dimension of a derivation tree (a.k.a. its Horton-Strahler number), together with some recent work: the tree dimension has nice applications both in graph theory (it is related to the path-width of a tree) and in computational linguistics, where it can be used for more accurate and faster statistical parsing.

This is joint work with Michael Luttenberger and Javier Esparza.
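Since the dimension of a derivation tree (the Horton-Strahler number) is central to the talk, here is a minimal sketch of how it is computed for binary trees; the encoding as nested pairs and the convention that leaves have dimension 0 are illustrative choices, not taken from the paper.

    def dimension(tree):
        """Horton-Strahler number of a binary tree given as nested 2-tuples:
        leaves have dimension 0; a node whose subtrees have dimensions
        d1 >= d2 has dimension d1 + 1 if d1 == d2 and d1 otherwise."""
        if not isinstance(tree, tuple):      # a leaf
            return 0
        d1, d2 = sorted((dimension(tree[0]), dimension(tree[1])), reverse=True)
        return d1 + 1 if d1 == d2 else d1

    print(dimension((('a', 'b'), ('c', 'd'))))     # balanced tree: dimension 2
    print(dimension(((('a', 'b'), 'c'), 'd')))     # caterpillar tree: dimension 1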

15:00
Ehrenfeucht-Fraisse Games on Omega-Terms
ABSTRACT.

Fragments of first-order logic over words can often be characterized in terms of finite monoids or finite semigroups. Usually these algebraic descriptions yield decidability of the question whether a given regular language is definable in a particular fragment. An effective algebraic characterization can be obtained from identities of so-called omega-terms. In order to show that a given fragment satisfies some identity of omega-terms, one can use Ehrenfeucht-Fraisse games on word instances of the omega-terms. The resulting proofs often require a significant amount of book-keeping with respect to the constants involved. In this paper we introduce Ehrenfeucht-Fraisse games on omega-terms. To this end we assign a labeled linear order to every omega-term. Our main theorem shows that a given fragment satisfies some identity of omega-terms if and only if Duplicator has a winning strategy for the game on the resulting linear orders. This allows one to avoid the book-keeping. As an application of our main result, we show that one can decide in exponential time whether all aperiodic monoids satisfy a given identity of omega-terms, thereby improving a result of McCammond (Int. J. Algebra Comput., 2001).

This is joint work with Martin Huschenbett.
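As a concrete instance of the kind of identity involved (standard background, not a result of the paper): a finite monoid is aperiodic if and only if it satisfies the omega-term identity

$$ x^{\omega} x \;=\; x^{\omega}, $$

where $x^{\omega}$ denotes the unique idempotent power of $x$; the exponential-time decision procedure mentioned above applies to identities of this shape over the class of all aperiodic monoids.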

14:00-15:30 Session 3B (IWIL)
14:00
The TPTP Process Instruction Language
ABSTRACT.

Most input languages for ATP systems, including the existing TPTP language, are designed primarily to write the logical formulae that are input to the ATP systems' reasoning processes. There is minimal or no support for controlling the processing of the logical formulae: tasks such as incrementally adding and deleting formulae, controlling reasoning processes, and interacting with non-logical data. The TPTP Process Instruction (TPI) language augments the existing and well-known logical TPTP language forms with command and control instructions for manipulating logical formulae in the context of ATP systems. The TPI language makes it easy(er) for ATP users to specify their problems, process the logical formulae, and report results, in a fully automatic way.

14:30
Gödel's God on the Computer
ABSTRACT.

Attempts to prove the existence (or non-existence) of God by means of abstract ontological arguments are an old tradition in philosophy and theology. 

We have analyzed Dana Scott's version of Gödel's proof for the first time with an unprecedented degree of detail and formality with the help of higher-order theorem provers; cf. https://github.com/FormalTheology/GoedelGod. Our computer-assisted formalizations rely on an embedding of quantified modal logic into classical higher-order logic with Henkin semantics.

The following has been done (and in this order): (i) a detailed natural deduction proof; (ii) a formalization in TPTP THF syntax; (iii) an automatic verification of the consistency of the axioms and definitions with Nitpick; (iv) an automatic demonstration of the theorems with the provers LEO-II and Satallax; (v) a step-by-step formalization using the Coq proof assistant; (vi) a formalization using the Isabelle proof assistant, where the theorems have been automated with the Isabelle tools Sledgehammer and Metis. Furthermore, the multi-step THF development of the proof has recently been encoded in Sutcliffe’s new TPTP TPI scripting language. This scripting language offers important features that were missing in the TPTP world so far. 

In our ongoing computer-assisted study of Gödel's proof, the automated reasoning tools have made some interesting observations, some of which were previously unknown. Our work attests to the maturity of contemporary interactive and automated deduction tools for classical higher-order logic and demonstrates the elegance and practical relevance of the embeddings-based approach. Most importantly, our work opens new perspectives for computer-assisted theoretical philosophy.
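For orientation, the central definition and the final theorem of the Scott version read, in the quantified modal logic (this is only a sketch of the statements, not of the higher-order embedding):

$$ G(x) \;\equiv\; \forall \phi\, [\, P(\phi) \to \phi(x) \,], \qquad \vdash\ \Box\, \exists x\, G(x), $$

where $P$ is the primitive predicate "is a positive property"; in the embedding, such formulas become terms of classical higher-order logic in which possible worlds appear as explicit arguments.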

14:00-15:30 Session 3C: Logics and Reasoning for Conceptual Modeling (LRCM)
14:00
Answering queries over inconsistent databases
ABSTRACT.

Depending on the data that is stored, it is common to impose certain constraints, or semantic rules, that the database instances must obey (e.g. two different students cannot have the same id number). In various settings, we may have to handle databases that are inconsistent, i.e. databases violating the constraints.

About fifteen years ago, the principled approach of consistent query answering was introduced in order to answer queries over such databases. Intuitively, a piece of data belongs to the consistent answer of a query if, no matter how we minimally restore the consistency of the database, the data is selected by the query.

In the last few years, the complexity of querying inconsistent databases in that framework has been thoroughly investigated. An important problem is to identify classes of constraints and queries for which the problem can be solved efficiently. Another related problem is to solve the dichotomy problem for consistent query answering w.r.t. a class C of constraints and queries. Such a result is usually hard to obtain and would provide an automatic classification of the complexity of all the consistent query answering problems over the class C.

The goal of the talk is to introduce the notion of consistent query answering and to present the related complexity issues and main results. We also give a brief overview of the dichotomy problem for consistent query answering.
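A minimal sketch of the definition in executable form (the relation, key, and query are invented for illustration; practical consistent query answering does not enumerate repairs, since there may be exponentially many): for a single key constraint, a repair keeps exactly one tuple per key value, and a value is a consistent answer if the query returns it in every repair.

    from itertools import product, groupby

    def repairs(table, key):
        """All repairs w.r.t. a key constraint: one tuple per key value."""
        rows = sorted(table, key=key)
        groups = [list(g) for _, g in groupby(rows, key=key)]
        return [list(choice) for choice in product(*groups)]

    def consistent_answers(table, key, query):
        """Answers returned by the query in every repair."""
        return set.intersection(*(set(query(r)) for r in repairs(table, key)))

    # students(id, name), with id as the key; this instance violates the key.
    students = [(1, 'Alice'), (1, 'Amy'), (2, 'Bob')]
    names = lambda rows: [name for _id, name in rows]
    print(consistent_answers(students, key=lambda t: t[0], query=names))
    # {'Bob'}: Alice and Amy are each dropped in one repair or the other.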

15:00
Scenario Testing on UML Class Diagrams using Description Logic
ABSTRACT.

As part of the requirements phase of building a software system, a conceptual model could be created which describes the information and processes of the business. This conceptual model is often expressed as a UML class diagram. In an attempt to validate the UML class diagram, stakeholders test various usage scenarios against it. The availability of formal reasoning procedures will greatly ease finding associated inconsistencies. To date, substantial research has been done on transforming UML class diagrams into different Description Logics. In this paper we explore how Description Logics can be used to facilitate scenario testing as a means to validate UML class diagrams. Since identity constraints on UML class diagrams have not yet been mapped to a Description Logic, we evaluate the suitability of using either DLRifd or SROIQ(D) for this purpose.
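As a small illustration of the kind of encoding referred to (the class and association names are invented for the example): a UML class Student with a mandatory association enrolledIn whose domain is Student and whose range is Course can be captured by Description Logic axioms such as

$$ \mathsf{Student} \sqsubseteq \exists \mathsf{enrolledIn}.\mathsf{Course}, \qquad \exists \mathsf{enrolledIn}.\top \sqsubseteq \mathsf{Student}, $$

and scenario testing then amounts to checking the (in)consistency of an ABox of sample individuals against such a TBox.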

14:00-15:30 Session 3D (ALCS)
14:00
Representations for Ramsey Relation Algebras
ABSTRACT.

A Ramsey Relation Algebra (RaRA), also called a Monk algebra or a Maddux algebra, is a certain finite relation algebra with n+1 atoms. Such an algebra is representable if there exists an edge-colouring of a complete graph satisfying the following two conditions: (1) there are no monochromatic triangles; (2) every non-monochromatic triangle occurs everywhere it can.

The first condition is clearly a Ramsey-type constraint; the second is a loose expression intended to mean the following: (a) each vertex has an outgoing edge of every colour, and (b) each edge is a base for every consistent (that is, non-monochromatic) triangle. The first condition gives an upper bound for the size of the graph; the second gives a lower bound. It is not known whether the lower bound is smaller than the upper bound in general, that is, for an arbitrary number of colours. For two colours, there is precisely one colouring satisfying the requirements. For three colours, there are precisely three such colourings. It has been known for some time that appropriate colourings exist for 4 and 5 colours.

Using a construction based on finite fields, and some easy computations in GAP, we can show that appropriate colourings exist for up to 120 colours. Curiously, the construction does not work for 8 colours, has to be slightly tweaked to produce a colouring with 9 colours, and for 13 colours it is not known whether it works.

Unfortunately, we have no general theorem stating that the construction works for every n > 13, but this seems to be a reasonable conjecture.
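The flavour of a finite-field construction can be conveyed by a small sketch (the parameters are illustrative and this is not the authors' GAP code): colour the edge {x, y} of the complete graph on GF(p) by the coset of x - y modulo the subgroup of index n in the multiplicative group, and check the Ramsey condition by brute force.

    from itertools import combinations

    def primitive_root(p):
        return next(g for g in range(2, p)
                    if len({pow(g, k, p) for k in range(1, p)}) == p - 1)

    def coset_colouring(p, n):
        """Edge colouring of K_p by cosets of the index-n subgroup of GF(p)*;
        requires n | p - 1, and symmetry needs -1 to lie in the subgroup."""
        assert (p - 1) % n == 0
        g = primitive_root(p)
        dlog = {pow(g, k, p): k % n for k in range(p - 1)}
        assert dlog[p - 1] == 0, "need -1 in the subgroup for a symmetric colouring"
        return lambda x, y: dlog[(x - y) % p]

    def has_monochromatic_triangle(p, colour):
        return any(colour(x, y) == colour(y, z) == colour(x, z)
                   for x, y, z in combinations(range(p), 3))

    colour = coset_colouring(13, 3)                  # 3 colours on 13 vertices
    print(has_monochromatic_triangle(13, colour))    # condition (1) of the abstract

Condition (2) can be checked by a similar brute-force loop over edges and pairs of colours.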

14:30
An Isomorphism Criterion for Colimits of Sequences of Finitely Presented Objects
SPEAKER: Luca Spada
ABSTRACT.

We prove a general criterion reducing isomorphisms between colimits of sequences of finitely presented objects in a category to the notion of a confluence between their diagrams.

15:00
A Completeness Theorem for Two-Layer Modal Logics
ABSTRACT.

In this talk we provide a new general framework for two-layer modal fuzzy logics that encompasses known examples by Petr Hájek and others and paves the way for future development. We go far beyond the landscape of fuzzy logics by showing how one can construct a modal logic (for an arbitrary modality, not necessarily read as a probability) over an arbitrary non-classical logic (under certain technical requirements). Therefore, we need not assume that the starting logic is fuzzy, and we can develop a general theory of two-layer modal logics, showing how the methods used in the fuzzy-logic literature can lead to completeness results using very few properties of the underlying logics. As a semantics, we propose particular kinds of measured Kripke frames and prove corresponding completeness theorems.
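A prototypical example of a logic covered by the framework (Hájek's fuzzy probability logic, recalled here only for orientation): the lower layer is classical propositional logic, the modality $P$ is read "probably", and the upper layer is Łukasiewicz logic; in the intended semantics the truth value of $P\varphi$ is

$$ e(P\varphi) \;=\; \mu\big(\{\, w : w \models \varphi \,\}\big) $$

for a (finitely additive) probability measure $\mu$ on the set of worlds, and the measured Kripke frames mentioned in the abstract generalize this pattern to other modalities and other underlying logics.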

15:30-16:00 Coffee Break
16:00-17:30 Session 4A (IWIL)
16:00
An Interactive Prover for Bi-Intuitionistic Logic
SPEAKER: Daniel Méry
ABSTRACT.

In this paper we present an interactive prover for deciding formulas in propositional bi-intuitionistic logic (BiInt). This tool is based on a recent connection-based characterization of bi-intuitionistic validity through bi-intuitionistic resource graphs (biRG). After giving the main concepts and principles, we illustrate how to use this interactive proof or counter-model building assistant and emphasize the interest of bi-intuitionistic resource graphs for proving or refuting BiInt formulas. We complete this work by studying how to turn our tool into a fully automated theorem prover.
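For readers less familiar with BiInt (standard background, not specific to the paper): bi-intuitionistic logic extends intuitionistic propositional logic with an exclusion (co-implication) connective dual to implication, whose Kripke clause is

$$ w \Vdash A \prec B \iff \exists v \le w\ (\, v \Vdash A \text{ and } v \nVdash B \,), $$

where $\le$ is the intuitionistic accessibility order; it is the interplay of the two arrows, one looking forward and one backward along the order, that proof-search methods for BiInt have to handle.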

16:30
Towards Explicit Rewrite Rules in the Lambda-Pi Calculus Modulo
ABSTRACT.

This paper provides a new presentation of the lambda-Pi calculus modulo in which the addition of rewrite rules is made explicit. The lambda-Pi calculus modulo is a variant of the lambda calculus with dependent types in which beta-reduction is extended with user-defined rewrite rules. Its expressiveness makes it suitable as an output language for theorem provers, certified development tools, and proof assistants. The addition of rewrite rules becomes an iterative process, and previously added rules can be used to type new ones. We also discuss the conditions rewrite rules must satisfy in order to preserve the Subject Reduction property, and we give a criterion weaker than the usual one. Finally, we describe the new version of Dedukti, a type-checker for the lambda-Pi calculus modulo, and assess its efficiency in comparison with Coq, Twelf, and Maude.

17:00
Proof Certification in Zenon Modulo: When Achilles Uses Deduction Modulo to Outrun the Tortoise with Shorter Steps
ABSTRACT.

We present the certifying part of the Zenon Modulo automated theorem prover, which is an extension of the Zenon tableau-based first-order automated theorem prover to deduction modulo. Deduction modulo is an extension of predicate calculus that allows us to rewrite terms as well as propositions, and it is well suited for proof search in axiomatic theories, as it turns axioms into rewrite rules. In addition, deduction modulo allows Zenon Modulo to compress proofs by making computations implicit in proofs. To certify these proofs, we use Dedukti, an external proof checker for the lambda-Pi calculus modulo, which can deal natively with proofs in deduction modulo. To assess our approach, we rely on experimental results obtained on benchmarks provided by the TPTP library.
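As a standard illustration of how deduction modulo turns axioms into rewrite rules (the textbook example, not one taken from the paper): the axiom $\forall x\, \forall y\, (x \subseteq y \leftrightarrow \forall z\, (z \in x \to z \in y))$ becomes the proposition rewrite rule

$$ x \subseteq y \;\longrightarrow\; \forall z\, (z \in x \to z \in y), $$

so that unfolding the definition of $\subseteq$ happens by computation rather than by explicit deduction steps; this is the sense in which proofs are compressed by making computations implicit.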

16:00-17:00 Session 4B (ALCS)
16:00
Algorithmic-Algebraic Canonicity for Mu-Calculi
ABSTRACT.

The modal mu-calculus, defined by Kozen, is obtained by adding the least and greatest fixed point operators to basic modal logic. Correspondence and completeness for logics with fixed point operators have been the subject of recent studies by Bezhanishvili and Hodkinson and by Conradie et al. Both of these works aim to develop a Sahlqvist-like theory for their respective fixed point settings.

Sahlqvist theory consists of two parts: canonicity and correspondence. The Sahlqvist formulas are a recursively defined class of modal formulas with a particular syntactic shape. Any modal logic axiomatized by Sahlqvist formulas is strongly complete (via canonicity) with respect to its class of Kripke frames, and the latter is moreover guaranteed to be an elementary class. The work of Bezhanishvili and Hodkinson looks at both of these aspects, obtaining a syntactic class which allows for limited use of fixed point operators and for which a modified version of canonicity is proved. By contrast, Conradie et al. look only at correspondence and, using an algorithmic approach, obtain results for a much broader class of formulas. This algorithmic approach builds on work by Conradie and Palmigiano on canonicity and correspondence for distributive modal logic.

In the current work we prove that the members of a certain class of intuitionistic mu-formulas are canonical, in the sense of Bezhanishvili and Hodkinson; that is, they are preserved under certain modified canonical extensions. Our methods use a variation of the ALBA algorithm (Ackermann Lemma Based Algorithm). We show that all mu-inequalities that can be successfully processed by our algorithm, $\mu^*$-ALBA, are canonical.

16:30
Cut-free Calculi for Challenge Logics in a Lazy Way
ABSTRACT.

The development of cut-free calculi for expressive logics, e.g. quantified non-classical logics, is usually a non-trivial task. However, for a wide range of challenge logics there exists an elegant and uniform solution: By modeling and studying these logics as fragments of classical higher-order logic (HOL) --- a research direction I have recently proposed --- existing results for HOL can often be reused. We illustrate the idea with quantified conditional logics.

16:00-17:00 Session 4C (LRCM)
16:00
Toward Representing Attributes in Temporal Conceptual Data Models
SPEAKER: Nasubo Ongoma
ABSTRACT.

Traditional conceptual data models, such as EER, ORM, and UML class diagrams, are time-independent, meaning that they do not represent time-changing data. Many applications across subject domains, such as medicine, engineering, manufacturing, and administration, however, need to store time-dependent data. For example, in manufacturing plants the successful completion of a product triggers a new shipping process, and employees evolve to play different roles in the organisation throughout their employment. As time flows, data evolves, which induces the need for temporal conceptual data models as a means to represent information in an implementation-independent way.

Research on temporal data models spans about 15 years and focuses typically on extending EER diagrams [2-7]. However, there is no accepted standard temporal ER that has met all requirements for a temporal model [3, 8], in particular when a solid, unambiguous, model-theoretic semantics is also required, except for ERVT [6] and MADS [9]. ERVT is a temporal extension of EER that uses the very expressive Description Logic DLRUS [10] to formalize the model's entity types, relationships, and constraints. Concerning the temporal aspects, object migration for ERVT has been formalised [6], and relationship migration was added afterward [11-13]. However, there is no formalization of a full temporalization of attributes. In conceptual data models, however, attributes play an important role, and so do temporal attributes. For instance, a regular Employee who is promoted to Manager may not have her Union membership set to inactive and her Salary must have a value of 1 million Rand, unlike regular employees, and an HIV-positive outpatient will migrate to AIDS patient when her CD4 count drops below 180.

The aim of our research is to give a model-theoretic semantics for attributes, and temporal attributes in particular, extending ERVT and DLRUS to ensure the complete formalization of a temporal conceptual data modelling language. That is, we focus on both the logic reconstruction of atemporal attributes and all temporal aspects, i.e., the scope is comprehensiveness for modelling temporal information. While we can avail ourselves of the notions of status classes, status relations, and the evolution constraints introduced in [6, 11, 12] to obtain "status attributes" in a similar way, there are some variations in the details and, notably, in the logical implications, which we will present. Since DLRUS is undecidable, it is not well suited to automated reasoning, but it can inform other work on extending computationally well-behaved languages with attributes [14] and it will provide insight into what is still feasible for temporal attributes in TDL-Lite for ontology-based data access [15], which so far does not consider temporal attributes.
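To give the flavour of the constraints involved (a hedged illustration in the style of temporal Description Logics such as DLRUS, not a formula from the paper): the migration of employees mentioned above could be captured by an axiom like

$$ \mathsf{Manager} \sqsubseteq \Diamond^{-}\, \mathsf{Employee}, $$

stating that every current manager was an employee at some earlier point in time; the research sketched here aims at supporting analogous constraints on (temporal) attributes, not only on entity types and relationships.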

16:30
Defeasible ORM2 Constraints to Preserve Coherence (Extended Abstract)
ABSTRACT.

The paper discusses the possibility of exploiting a defeasible semantics for a number of ORM2 constraints, in order to support schema design and modelling activity in this language. The paper relies on previous theoretical results by the same authors concerning the definition of defeasible ORM2 constraints and their interplay with their classical versions. The nonmonotonic semantics introduced by the authors allows for the design and implementation of several interesting reasoning tasks that can be offered to a potential ORM2 user, such as automatic consistency checking of entity types and predicates, constraint entailment, and constraint refinement.