VSL 2014: VIENNA SUMMER OF LOGIC 2014
PROGRAM FOR TUESDAY, JULY 22ND, 2014

10:15-10:45 Coffee Break
10:45-12:15 Session 149A: Description Logic 2 (KR)
Location: EI, EI 7
10:45
Exact Learning of Lightweight Description Logic Ontologies
SPEAKER: unknown

ABSTRACT. We study polynomial time learning of description logic TBoxes in Angluin et al.'s framework of exact learning via queries. We admit entailment queries ("is a given subsumption entailed by the target TBox?") and equivalence queries ("is a given TBox equivalent to the target TBox?"), assuming that the signature and logic of the target TBox are known. We present four main results: (1) TBoxes formulated in DL-Lite with role inclusions and ELI concepts on the right-hand side of concept inclusions can be learned in polynomial time. (2) Neither general nor acyclic EL TBoxes can be learned in polynomial time. (3) EL TBoxes with only concept names on the right-hand side of concept inclusions can be learned in polynomial time. It follows, in particular, that non-polynomial time learnability of EL TBoxes is caused by the interaction between existential restrictions on the right and left-hand side of concept inclusions. Finally, we show that neither entailment nor equivalence queries alone are sufficient for learning TBoxes in polynomial time for any of the description logics in (1)-(3).
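
For instance (our illustration of the query model, not an example from the paper): with target TBox $\mathcal{T} = \{A \sqsubseteq \exists r.B,\ B \sqsubseteq C\}$, the learner may pose the entailment query "$A \sqsubseteq \exists r.C$?", to which the oracle answers yes, since the subsumption follows from the two inclusions.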

11:15
Finite Model Reasoning in Horn Description Logics

ABSTRACT. We study finite model reasoning in the popular expressive Horn description logics (DLs) Horn-SHIF and Horn-SHIQ, starting with a reduction of finite ABox consistency to unrestricted ABox consistency. The reduction relies on reversing certain cycles in the TBox, an approach that originated in database theory, was later adapted to the inexpressive DL DL-Lite_F, and is shown here to extend all the way to Horn-SHIQ. The model construction used to establish correctness sheds much more light on the structure of finite models than existing approaches to finite model reasoning in (non-Horn) DLs which rely on solving systems of inequations over the integers. Since the reduction incurs an exponential blow-up, we then devise a dedicated consequence-based algorithm for finite ABox consistency in Horn-SHIQ that implements the reduction on-the-fly rather than executing it up-front. The algorithm has optimal worst-case complexity and provides a promising foundation for implementations. Finally, we show that our approach can be adapted to finite (positive existential) query answering relative to Horn-SHIF TBoxes, which allows us to prove that this problem is ExpTime-complete in combined complexity and PTime-complete in data complexity.

11:45
Lightweight Description Logics and Branching Time: a Troublesome Marriage

ABSTRACT. We study branching-time temporal description logics (BTDLs) based on the temporal logic CTL in the presence of rigid (time-invariant) roles and general TBoxes. There is evidence that, if full CTL is combined with the classical ALC in this way, then reasoning becomes undecidable. In this paper, we begin by substantiating this claim, establishing undecidability for a fragment of this combination. In view of this negative result, we then investigate BTDLs that emerge from combining fragments of CTL with lightweight DLs from the EL and DL-Lite families. We show that even rather inexpressive BTDLs based on EL exhibit very high complexity. Most notably, we identify two convex fragments which are undecidable and hard for non-elementary time, respectively. For BTDLs based on DL-Lite, we obtain tight complexity bounds that range from PSPACE to EXPTIME.

10:45-12:15 Session 149B: Belief Revision and Nonmonotonicity 2 (KR)
Location: EI, EI 9
10:45
Belief Change and Base Dependence

ABSTRACT. A strong intuition for AGM belief change operations, Gärdenfors suggests, is that formulas that are independent of a change should remain intact. Based on this intuition, Fariñas and Herzig axiomatize a dependence relation w.r.t. a belief set, and formalize the connection between dependence and belief change. In this paper, we introduce base dependence as a relation between formulas w.r.t. a belief base. After an axiomatization of base dependence, we formalize the connection between base dependence and a particular belief base change operation, saturated kernel contraction. Moreover, we prove that base dependence is a reversible generalization of Fariñas and Herzig’s dependence. That is, in the special case when the underlying belief base is deductively closed (i.e., it is a belief set), base dependence reduces to dependence. Finally, an intriguing feature of Fariñas and Herzig’s formalism is that it meets other criteria for dependence, namely, Keynes’ conjunction criterion for dependence (CCD) and Gärdenfors’ conjunction criterion for independence (CCI). We show that our base dependence formalism also meets these criteria. More interestingly, we offer a more specific criterion that implies both CCD and CCI, and show our base dependence formalism also meets this new criterion.

11:15
Justified Beliefs by Justified Arguments
SPEAKER: unknown

ABSTRACT. The paper addresses how the information state of an agent relates to the arguments that the agent endorses. Information states are modeled in doxastic logic and arguments by recasting abstract argumentation theory in a modal logic format. The two perspectives are combined by an application of the theory of product logics, delivering sound and complete systems in which the interaction of arguments and beliefs is investigated.

11:45
Belief Change Operations: A Short History of Nearly Everything, Told in Dynamic Logic of Propositional Assignments

ABSTRACT. We examine several belief change operations in the light of the Dynamic Logic of Propositional Assignments (DL-PA). We show that we can encode in a systematic way update operations (such as Winslett's `Possible Models Approach') and revision operations (such as Dalal's) as particular DL-PA programs. Since every DL-PA formula is equivalent to a Boolean formula, one obtains syntactic counterparts for all these belief change operations.

10:45-13:05 Session 149C: FLoC Inter-Conference Topic: SAT/SMT/QBF (CAV and IJCAR)
Location: FH, Hörsaal 1
10:45
Monadic Decomposition
SPEAKER: Margus Veanes

ABSTRACT. Monadic predicates play a prominent role in many decidable cases, including decision procedures for symbolic automata. We are interested in discovering whether a formula can be rewritten into a Boolean combination of monadic predicates. Our setting is quantifier-free formulas over a decidable background theory, such as linear arithmetic, and we develop a semi-decision procedure for extracting a monadic decomposition of a formula when one exists.
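
To illustrate the notion (our example, not necessarily one from the paper): over the reals, the binary predicate $x \cdot y \geq 0$ admits the monadic decomposition $(x \geq 0 \wedge y \geq 0) \vee (x \leq 0 \wedge y \leq 0)$, a Boolean combination of predicates each constraining a single variable.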

11:05
A DPLL(T) Theory Solver for a Theory of Strings and Regular Expressions

ABSTRACT. An increasing number of applications in verification and security rely on, or could benefit from, automatic solvers that can check the satisfiability of constraints over a rich set of data types that includes character strings. Unfortunately, most string solvers today are standalone tools that can reason only about (some fragment of) the theory of strings and regular expressions, sometimes with strong restrictions on the expressiveness of their input language, such as length bounds on all string variables. These solvers are based on reductions to satisfiability problems over other data types, such as bit vectors, or to decision problems over automata. In this paper, we present a set of algebraic techniques for solving constraints over the theory of (unbounded) strings natively, without reduction to other problems. Our techniques can be used to construct string theory solvers that can be integrated into general, multi-theory SMT solvers based on the DPLL(T) architecture. We have implemented these techniques in our SMT solver CVC4 to expand its set of built-in data types to strings with concatenation, length, and membership in regular languages. This implementation makes CVC4 the first solver able to accept a rich set of mixed constraints over strings, integers, reals, arrays and algebraic data types. Our initial experimental results show that, in addition, over pure string problems CVC4 is highly competitive in a number of ways with specialized string solvers.
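
To give a flavour of the mixed string/integer constraints at issue, here is a minimal sketch using the Python API of Z3 (a different solver with a comparable string theory, used here purely for illustration; CVC4 itself accepts such constraints in SMT-LIB syntax):

from z3 import String, Int, Length, Concat, InRe, Re, Star, Solver, sat

x, y = String('x'), String('y')
n = Int('n')

s = Solver()
s.add(x == Concat(y, y))           # word equation: x is y repeated twice
s.add(Length(y) == n, n >= 2)      # mixed string/integer constraint
s.add(InRe(x, Star(Re("ab"))))     # regular membership: x in (ab)*

if s.check() == sat:
    print(s.model())               # e.g. y = "ab", x = "abab", n = 2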

11:25
Bit-Vector Rewriting with Automatic Rule Generation

ABSTRACT. Rewriting is essential for efficient bit-vector SMT solving. The rewriting algorithm commonly used by modern SMT solvers iteratively applies a set of ad hoc rewriting rules hard-coded into the solver to simplify the given formula at the preprocessing stage. This paper proposes a new approach to rewriting. In our approach, the solver starts each invocation with an empty set of rewriting rules. The set is extended by applying at run-time an automatic SAT-based algorithm for new rewriting rule generation. The set of rules differs from instance to instance. We implemented our approach in the framework of an algorithm for equivalence and constant propagation, called 0-saturation, which we extended from purely propositional reasoning to bit-vector reasoning. Our approach results in a substantial performance improvement in a state-of-the-art SMT solver over various SMT-LIB families.

11:45
A Tale of Two Solvers: Eager and Lazy Approaches to Bit-vectors

ABSTRACT. The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver. We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can and that the portfolio solver outperforms other solvers both in terms of total number of problems solved and the time taken to solve them.

12:05
AVATAR: The New Architecture for First-Order Theorem Provers

ABSTRACT. This paper describes a new architecture for first-order resolution and superposition theorem provers called AVATAR (Advanced Vampire Architecture for Theories and Resolution). Its original motivation comes from a well-studied problem: dealing with problems having clauses that contain propositional variables, and other clauses that can be split into components with disjoint sets of variables. Such clauses are common for problems coming from applications, for example in verification and program analysis, where many ground literals occur in the problems and even more are generated during proof search.

This problem was previously addressed by adding various versions of splitting. The addition of splitting resulted in considerable improvements in the performance of theorem provers. However, even with various versions of splitting implemented, the performance of superposition theorem provers is nowhere near that of SMT solvers on purely ground problems or SAT solvers on propositional problems.

This paper describes a new, revolutionary architecture for superposition theorem provers, in which a superposition theorem prover is tightly integrated with a SAT or SMT solver. Its implementation in our theorem prover Vampire resulted in drastic improvements over all previous implementations of splitting. Over four hundred TPTP problems previously unsolvable by any modern prover, including Vampire itself, have been proved, most of them with short runtimes. Many problems solved with one of the 481 variants of splitting previously implemented in Vampire can also be solved with AVATAR.

We also believe that AVATAR is ideally suited for applications where reasoning with both quantifiers and theories is required.

12:25
Automating Separation Logic with Trees and Data
SPEAKER: unknown

ABSTRACT. Separation logic (SL) is a widely used formalism for verifying heap manipulating programs. Existing SL solvers focus on decidable fragments for list-like structures. More complex data structures such as trees are typically unsupported in implementations, or handled by incomplete heuristics. While complete decision procedures for reasoning about trees have been proposed, these procedures suffer from high complexity, or make global assumptions about the heap that contradict the separation logic philosophy of local reasoning. In this paper, we present a fragment of classical first-order logic for local reasoning about tree-like data structures. The logic is decidable in NP and the decision procedure allows for combinations with other decidable first-order theories for reasoning about data. Such extensions are essential for proving functional correctness properties. We have implemented our decision procedure and, building on earlier work on translating SL proof obligations into classical logic, integrated it into an SL-based verification tool. We successfully used the tool to verify functional correctness of tree-based data structure implementations.

12:45
A Nonlinear Real Arithmetic Fragment
SPEAKER: Ashish Tiwari

ABSTRACT. We present a new procedure for testing satisfiability (over the reals) of a conjunction of polynomial equations. There are three possible return values for our procedure: it either returns a model for the input formula, or it says that the input is unsatisfiable, or it fails because the applicability condition for the procedure, called the eigen-condition, is violated. For the class of constraints where the eigen-condition holds, our procedure is a decision procedure. The design of the procedure is motivated by the problems arising from exists-forall problems -- generated by template-based verification and synthesis methods -- that are translated into exists problems using the Positivstellensatz. Our procedure can be seen as a generalized SAT solver: in particular, the eigen-condition holds on nonlinear real arithmetic encodings of SAT problems. We experimentally evaluate the procedure and describe transformations that can potentially turn a problem into another one where the eigen-condition holds.
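
As a concrete illustration of such encodings (a standard one, sketched here on our own; the paper's exact translation may differ): a SAT instance becomes a conjunction of polynomial equations by constraining each Boolean variable with $x_i^2 = 1$ (so $x_i \in \{-1, +1\}$, with $+1$ read as true) and translating each clause into a product that must vanish. For example, the clause $x_1 \vee \neg x_2$ becomes $(1 - x_1)(1 + x_2) = 0$: the product is zero exactly when some literal is satisfied.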

12:55
Yices 2.2

ABSTRACT. Yices is an efficient SMT solver developed by SRI International. The first version of Yices was released in 2006 and has been continuously updated since then. In 2007, we started a complete re-implementation of the Yices solver intended to achieve greater performance, modularity, and flexibility. We describe the latest release of Yices, namely Yices 2.2. We present the tool's architecture and the algorithms it implements, as well as the latest developments, such as support for the SMT2 notation and various performance improvements. Like its predecessors, Yices 2.2 is distributed free of charge for non-commercial use.

10:45-13:00 Session 149D: Test of Time Award: 20 Years / Technical Session - Constraint Programming (ICLP)
Location: FH, Hörsaal 8
10:45
Test of Time Awards: 20 Years - CLP(Intervals) Revisited
11:30
SUNNY: a Lazy Portfolio Approach for Constraint Solving
SPEAKER: unknown

ABSTRACT. Within the context of constraint solving, a portfolio approach allows one to exploit the synergy between different solvers in order to create a globally better solver. In this paper we present SUNNY: a simple and flexible algorithm that takes advantage of a portfolio of constraint solvers in order to compute - without learning an explicit model - a schedule of solvers for solving a given Constraint Satisfaction Problem (CSP). Motivated by the performance of SUNNY in simulations against other state-of-the-art approaches, we developed sunny-csp, an effective portfolio solver that exploits the underlying SUNNY algorithm in order to solve a given CSP. Empirical tests conducted on exhaustive benchmarks of MiniZinc models show that the actual performance of sunny-csp conforms to the predictions. This is encouraging both for improving the power of CSP portfolio solvers and for trying to export them to fields such as Answer Set Programming and Constraint Logic Programming.
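
A minimal sketch of a SUNNY-style schedule in Python (our reconstruction under simplifying assumptions - Euclidean feature distance, a greedy sub-portfolio, a single backup solver; the authoritative description is in the paper):

from math import dist

def sunny_schedule(inst_feats, training, solvers, k=10, timeout=1800):
    # training: list of (feature_vector, {solver: runtime or None}) pairs
    # 1. find the k nearest neighbours of the new instance in feature space
    neighbours = sorted(training, key=lambda t: dist(inst_feats, t[0]))[:k]
    # 2. greedily select a small sub-portfolio covering the neighbourhood
    uncovered, portfolio = list(neighbours), []
    while uncovered:
        best = max(solvers,
                   key=lambda s: sum(1 for _, runs in uncovered
                                     if runs.get(s) is not None))
        solved = [t for t in uncovered if t[1].get(best) is not None]
        if not solved:
            break                      # remaining neighbours solved by no solver
        portfolio.append(best)
        uncovered = [t for t in uncovered if t not in solved]
    # 3. allocate time slots proportional to the number of neighbours solved,
    #    reserving the leftover time for a backup solver
    counts = {s: sum(1 for _, runs in neighbours if runs.get(s) is not None)
              for s in portfolio}
    total = sum(counts.values()) + len(uncovered)
    slots = {s: timeout * counts[s] / total for s in portfolio}
    backup = max(solvers,
                 key=lambda s: sum(1 for _, runs in training
                                   if runs.get(s) is not None))
    slots[backup] = slots.get(backup, 0) + timeout - sum(slots.values())
    return slots                       # solver -> time budget in seconds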

12:00
Using Tabled Logic Programming to Solve the Petrobras Planning Problem
SPEAKER: unknown

ABSTRACT. Tabling has been used for some time to improve the efficiency of Prolog programs by memoizing answered queries. The same idea can naturally be used to memoize visited states during search, for example in planning. In this paper we present a planner developed in the Picat language to solve the Petrobras planning problem. Picat is a novel Prolog-like language that provides pattern matching, deterministic and non-deterministic rules, and tabling as its core modeling and solving features, which are very appropriate for solving planning problems. We demonstrate these capabilities using the Petrobras domain, where the goal is to plan the transport of cargo items from ports to platforms using ships with limited capacity. This domain was introduced in the Fourth International Competition on Knowledge Engineering for Planning and Scheduling (ICKEPS 2012) as a challenging, real-life motivated problem. Monte Carlo Tree Search has so far been the best technique to tackle this problem, and we will show that by using tabling we can achieve comparable efficiency.
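
As a rough analogy in Python (ours, not Picat code): tabling amounts to keeping a table of visited states so that search never re-expands a state; `successors` is an assumed function returning the (action, state) pairs reachable in one step, with hashable states.

from collections import deque

def shortest_plan(start, is_goal, successors):
    table = {start: []}                  # the "table": visited state -> plan
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if is_goal(state):
            return table[state]
        for action, nxt in successors(state):
            if nxt not in table:         # tabled states are never re-expanded
                table[nxt] = table[state] + [action]
                queue.append(nxt)
    return None                          # no plan exists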

12:30
A Proof Theoretic Study of Soft Concurrent Constraint Programming
SPEAKER: Carlos Olarte

ABSTRACT. Concurrent Constraint Programming (CCP) is a simple and powerful model for concurrency where agents interact by telling and asking constraints. Since their inception, CCP languages have been designed to have a strong connection to logic. In fact, the underlying constraint system can be built from a suitable fragment of intuitionistic (linear) logic (ILL), and processes can be interpreted as formulas in ILL. Constraints as ILL formulas fail to represent accurately situations where "preferences" (called soft constraints) such as probabilities, uncertainty or fuzziness are present. In order to circumvent this problem, c-semirings have been proposed as algebraic structures for defining constraint systems where agents are allowed to tell and ask soft constraints. Nevertheless, in this case, the tight connection to logic and proof theory is lost. In this work, we give a proof-theoretical interpretation to soft constraints: they can be defined as formulas in a suitable fragment of ILL with subexponentials (SELL), where subexponentials, ordered in a c-semiring structure, are interpreted as preferences. We hence achieve two goals: (1) obtain a CCP language where agents can tell and ask soft constraints, and (2) prove that the language in (1) has a strong connection with logic. Hence we keep a declarative reading of processes as formulas while providing a logical framework for soft-CCP-based systems. An interesting side effect of (1) is that one is also able to handle probabilities (and other modalities) in SELL, by restricting the use of the promotion rule for non-idempotent c-semirings. This finer way of controlling subexponentials allows for considering more interesting spaces and restrictions, and it opens the possibility of specifying more challenging computational systems.

10:45-13:00 Session 149E: Invited talk & Complexity (IJCAR)
Location: FH, Hörsaal 5
10:45
Structured Search and Learning
11:45
The Complexity of Theorem Proving in Circumscription and Minimal Entailment
SPEAKER: unknown

ABSTRACT. We provide the first comprehensive proof-complexity analysis of different proof systems for propositional circumscription. In particular, we investigate two sequent-style calculi: MLK defined by Olivetti (J. Autom. Reasoning, 1992) and CIRC introduced by Bonatti and Olivetti (ACM ToCL, 2002), and the tableaux calculus NTAB suggested by Niemelä (TABLEAUX, 1996).

In our analysis we obtain exponential lower bounds for the proof size in NTAB and CIRC and show a polynomial simulation of CIRC by MLK. This yields a chain NTAB < CIRC < MLK of proof systems for circumscription of strictly increasing strength with respect to lengths of proofs.

12:15
Visibly Linear Temporal Logic
SPEAKER: unknown

ABSTRACT. We introduce a robust and tractable temporal logic, which we call Visibly Linear Temporal Logic (VLTL), that captures the full class of Visibly Pushdown Languages. The novel logic avoids fixpoints and instead provides natural temporal operators with simple and intuitive semantics. We prove that the complexities of the satisfiability and pushdown model checking problems are the same as for other well-known logics, like CARET and the nested word temporal logic NWTL, which in contrast are strictly more limited in expressive power than VLTL. Moreover, formulas of CARET and NWTL can be easily and inductively translated in linear time into VLTL.

10:45-12:45 Session 149F: Network Security (CSF)
Location: FH, Hörsaal 6
10:45
The Complexity of Estimating Systematic Risk in Networks
SPEAKER: unknown

ABSTRACT. The risk of catastrophe from an attack is a consequence of a network's structure, formed by the connected individuals, businesses and computer systems. Understanding the likelihood of extreme events, or, more generally, the probability distribution of the number of compromised nodes, is an essential requirement for providing risk mitigation or cyber-insurance. However, previous network security research has not considered the higher moments of this distribution, while previous cyber-insurance research has not considered the effect of topologies on the supply side.

We provide a mathematical basis for bridging this gap: we study the complexity of computing these loss-number distributions, both generally and for special cases of common real-world networks. In the case of scale-free networks, we demonstrate that expected loss alone cannot determine the riskiness of a network, and that this riskiness cannot be naively estimated from smaller samples, which highlights the importance of topological data in security incident reporting, where it is currently lacking.

11:15
Automated Generation of Attack Trees
SPEAKER: Roberto Vigo

ABSTRACT. Attack trees are widely used to represent threat scenarios in a succinct and intuitive manner, suitable for conveying security information to non-experts. The manual construction of such objects relies on the creativity and experience of specialists, and is therefore error-prone and impracticable for large systems. Nonetheless, the automated generation of attack trees has only been explored in connection with computer networks, leveraging rich models whose analysis typically leads to an exponential blow-up of the state space. We propose a static analysis approach where attack trees are automatically inferred from a process algebraic specification in a syntax-directed fashion, encompassing a great many application domains and avoiding a systematic exponential explosion. Moreover, we show how the standard propositional denotation of an attack tree can be used to phrase interesting quantitative problems that can be solved through an encoding into Satisfiability Modulo Theories. The flexibility and effectiveness of the approach are demonstrated on the study of a national-scale authentication system, whose attack tree is computed thanks to a Java implementation of the framework.

11:45
Mignis: A semantic based tool for firewall configuration
SPEAKER: Pedro Adão

ABSTRACT. The management and specification of access control rules that enforce a given policy is a non-trivial, complex, and time-consuming task. In this paper we aim at simplifying this task both at the specification and verification levels. To that end, we propose a formal model of Netfilter, a firewall system integrated into the Linux kernel. We define an abstraction of the concepts of chains, rules, and packets present in Netfilter configurations, and give a semantics that mimics packet filtering and address translation. We then introduce a simple but powerful language that permits specifying firewall configurations which are unaffected by the relative ordering of rules and which do not depend on the underlying Netfilter chains. We give a semantics for this language and show that it can be translated into our Netfilter abstraction. We present Mignis, a publicly available tool that translates abstract firewall specifications into real Netfilter configurations. Mignis is currently used to configure the whole firewall of the DAIS Department of Ca’ Foscari University.

12:15
Provably Sound Browser-Based Enforcement of Web Session Integrity

ABSTRACT. Enforcing protection at the browser side has recently become a popular approach for securing web authentication. Though interesting, existing attempts in the literature only address specific classes of attacks, and thus fall short of providing robust foundations to reason about web authentication security. In this paper we provide such foundations, by introducing a novel notion of web session integrity, which allows us to capture many existing attacks and spot some new ones. We then propose FF+, a security-enhanced model of a web browser that provides a full-fledged and provably sound enforcement of web session integrity. We leverage our theory to develop SessInt, a prototype extension for Google Chrome implementing the security mechanisms formalized in FF+. SessInt provides a level of security very close to FF+, while keeping an eye on usability and user experience.

10:45-13:00 Session 149G (GeTFun)
Location: FH, Hörsaal 4
10:45
Semi-implication and matrices
SPEAKER: Arnon Avron
11:30
Yoneda’s embedding and Post-completeness
SPEAKER: Hugo Macedo
12:00
On the characterization of broadly truth-functional logics

ABSTRACT. While an abstract characterization of truth-functionality in terms of a property called `cancellation' has long been known, extending this result to cover broader classes of logics, characterized by semantics that generalize the notion of truth-functionality, resists a simple-minded approach. In our contribution we discuss some difficulties encountered along this task and propose a solution to the characterization quest for the case of nondeterministic semantics. We also show how the study of such properties is of paramount importance for the effort of understanding the semantical mechanism underlying the combination of logics through the fibring method.

12:30
On non-deterministic fuzzy negations

ABSTRACT. Fuzzy negations are an important mathematical tool in approximate reasoning as well as in decision making. This fundamental role of fuzzy negation has led to the introduction of an analogous notion for several extensions of fuzzy logics, and in particular for typical hesitant fuzzy elements (HFEs). We assume, nevertheless, that a fundamental prerequisite for defining fuzzy negations is a partial order on the set of HFEs. For that purpose we adopt the Xu-Xia partial order, which satisfies the antitonicity property of typical hesitant fuzzy negations (THFNs). We then propose that each THFN based on the Xu-Xia order determines a non-deterministic negation, study several classes of non-deterministic negations, and relate them to fuzzy negations.

12:15-13:05 Session 150A: 10 short presentations of 5 minutes each. (KR)
Location: EI, EI 7
12:15
On Redundant Topological Constraints (Extended Abstract)
SPEAKER: unknown

ABSTRACT. The Region Connection Calculus (RCC) is a well-known calculus for representing part-whole and topological relations. It plays an important role in qualitative spatial reasoning, geographical information science, and ontology. The computational complexity of reasoning with RCC has been investigated in depth in the literature. Most of these works focus on the consistency of RCC constraint networks. In this paper, we consider the important problem of redundant RCC constraints. For a set $\Gamma$ of RCC constraints, we say a constraint (x R y) in $\Gamma$ is redundant if it is entailed by the rest of $\Gamma$. A prime network of $\Gamma$ is a subset of $\Gamma$ which contains no redundant constraints but has the same solution set as $\Gamma$. It is natural to ask how to compute a prime network, and when it is unique. In this paper, we show that this problem is in general co-NP hard, but becomes tractable if $\Gamma$ is over a tractable subclass of RCC. If S is a tractable subclass in which weak composition distributes over non-empty intersections, then we can show that $\Gamma$ has a unique prime network, which is obtained by removing all redundant constraints from $\Gamma$. As a byproduct, we identify a sufficient condition for a path-consistent network to be minimal.
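
A small example of redundancy (ours, in RCC8 notation): in $\Gamma = \{x\ NTPP\ y,\ y\ NTPP\ z,\ x\ NTPP\ z\}$, the constraint $x\ NTPP\ z$ is redundant, since the composition of $NTPP$ with $NTPP$ is exactly $NTPP$; the prime network is $\{x\ NTPP\ y,\ y\ NTPP\ z\}$.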

12:20
Knowledge Maps of Web Graphs
SPEAKER: unknown

ABSTRACT. In this paper we investigate the problem of building knowledge maps of graph-like information, motivated by representing knowledge on the Web. We introduce a mathematical formalism that captures the notion of map of a graph and enables its development and manipulation in a semi-automated way. We present an implementation of our formalism over the Web of Linked Data graph and discuss algorithms to efficiently generate and combine (via an algebra) regions and maps. We present the MaGE tool and discuss some examples of knowledge maps.

12:25
On the Progression of Knowledge in Multiagent Systems
SPEAKER: Vaishak Belle

ABSTRACT. In a seminal paper, Lin and Reiter introduced the progression of basic action theories in the situation calculus. In this paper, we study the progression of knowledge in multiagent settings, where after actions, an agent updates his beliefs but also updates what he believes other agents know given what has occurred. By appealing to the notion of only knowing, we are able to avoid limitations of earlier work on multiagent progression, and obtain a new general account: we show that after an action, knowledge bases are updated in a Lin and Reiter fashion at every nesting of modalities. Consequently, recent results on the first-order definability of progression carry over to a multiagent setting without too much effort.

12:30
Heuristic Guided Optimization for Propositional Planning
SPEAKER: unknown

ABSTRACT. Planning as Satisfiability is an important approach to Propositional Planning. A serious drawback of the method is its limited scalability, as the instances that arise from large planning problems are often too hard for modern SAT solvers.

This work tackles this problem by combining two powerful techniques that aim at decomposing a planning problem into smaller sub-problems, so that the satisfiability instances that need to be solved do not grow prohibitively large. The first technique, incremental goal achievement, turns planning into a series of Boolean optimization problems, each seeking to maximize the number of goals that are achieved within a limited planning horizon. This is coupled with a second technique, called heuristic guidance, that directs search towards a state that satisfies all goals. Heuristic guidance is based on the combination of a number of constraint relaxation techniques of varying strength.

Initial experiments with a system that implements these ideas demonstrate that enriching propositional-satisfiability-based planning with these methods delivers a competitive planning algorithm that is capable of generating plans of good quality for challenging problems in different domains.

12:35
Action Theories over Generalized Databases with Equality Constraints (Extended Abstract)
SPEAKER: unknown

ABSTRACT. Situation calculus action theories allow full first-order expressiveness but, typically, restricted cases are studied such that projection or progression become decidable or first-order, respectively, and computationally feasible. In this work we focus on KBs that are specified as generalized databases with equality constraints, and are thus able to finitely represent complete information over a possibly infinite number of objects. First, we show that this form characterizes the class of definitional KBs and provide a transformation for putting KBs in this form, which we call \emph{generalized fluent DB}. Then we show that for action theories over such KBs, the KBs are closed under progression, and discuss how this view exposes some differences between existing progression methods and DB update. We also look into the temporal projection problem and show how queries over these theories can be decided based on an induced transition system and the evaluation of local conditions over states. In particular, we consider a wide class of generalized projection queries that include quantification over situations and prove decidability under a practical restriction. The proposed action theories are to date the most expressive ones for which there are known decidable methods for computing both progression and generalized projection.

12:40
Representing and Reasoning About Time Travel Narratives: Foundational Concepts

ABSTRACT. The paper develops a branching-time ontology that maintains the classical restriction of forward movement through a temporal tree structure, but permits the representation of paths in which one can perform inferences about time-travel scenarios. Central to the ontology is the notion of an agent embodiment whose beliefs are equivalent to those of an agent who has time-traveled from the future. We discuss the formalization of several example scenarios, define what it means for such scenarios to be motivated with respect to an agent embodiment, and relate the underlying structure to a more general theory of (non-time-travel) narrative.

12:45
Canonical Logic Programs are Succinctly Incomparable with Propositional Formulas
SPEAKER: unknown

ABSTRACT. Canonical (logic) programs (CP) refer to \emph{normal} programs (LP) augmented with the connective $not\ not$. In this paper we address the question of whether CP are \emph{succinctly incomparable} with \emph{propositional formulas} (PF). Our main result shows that the PARITY problem, which can be polynomially represented in PF, \emph{only} has exponential representations in CP. In other words, PARITY \emph{separates} PF from CP. Simply speaking, this means that an exponential size blowup is generally inevitable when translating a set of formulas in PF into an equivalent program in CP (without introducing new variables). Furthermore, since it has been shown by Lifschitz and Razborov that there is also a problem which separates CP from PF (assuming $\mathsf{P}\nsubseteq \mathsf{NC^1/poly}$), it follows that the two formalisms are indeed succinctly incomparable. In addition, we show that PARITY separates logic programs with \emph{cardinality constraints} and \emph{choice rules} (CC) from CP. Moreover, assuming $\mathsf{P}\nsubseteq \mathsf{NC^1/poly}$, CP and \emph{definite causal theories} (DT) are succinctly incomparable, and \emph{two-valued} programs (TV) are strictly more succinct than CP and DT.
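
To see why PARITY has small PF representations (a standard observation, added here for context): $\mathrm{PARITY}(x_1,\ldots,x_n) = x_1 \oplus \cdots \oplus x_n$, and expanding $x \oplus y$ as $(x \wedge \neg y) \vee (\neg x \wedge y)$ along a balanced binary tree yields a formula whose size satisfies $S(n) = 4S(n/2) + O(1)$, i.e., $S(n) = O(n^2)$.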

12:50
Using Answer Set Programming for Solving Boolean Games

ABSTRACT. Boolean games are a framework for reasoning about the rational behaviour of agents whose goals are formalised using propositional formulas. They offer an attractive alternative to normal-form games, because they allow for a more intuitive and more compact encoding. Unfortunately, however, there is currently no general, tailor-made method available to compute the equilibria of Boolean games. In this paper, we introduce a method for finding the pure Nash equilibria based on disjunctive answer set programming. Our method is furthermore capable of finding the core elements and the Pareto optimal equilibria, and can easily be modified to support other forms of optimality, thanks to the declarative nature of disjunctive answer set programming. Experimental results clearly demonstrate the effectiveness of the proposed method.

12:55
ASP Encodings of Acyclicity Properties
SPEAKER: unknown

ABSTRACT. Many knowledge representation tasks involve trees or similar structures as abstract datatypes. However, devising compact and efficient declarative representations of such properties is non-obvious and can be challenging indeed. In this paper, we take acyclicity properties into consideration and investigate logic-based approaches to encode them. We use answer set programming as the primary representation language but also consider mappings to related formalisms, such as propositional logic, difference logic, and linear programming.

13:00
Stable Models of Multi-Valued Formulas: Partial vs. Total Functions
SPEAKER: Joohyung Lee

ABSTRACT. Recent extensions of the stable model semantics that allow "intensional" functions--functions that can be specified by logic programs using other functions and predicates--can be divided into two groups. One group defines a stable model in terms of minimality on the values of partial functions, and the other defines it in terms of uniqueness on the values of total functions. We show that, in the context of multi-valued formulas, these two different approaches can be reduced to each other, and further, each of them can be viewed in terms of propositional formulas under the stable model semantics. Based on these results, we present a prototype implementation of different versions of functional stable model semantics by using existing answer set solvers. 

12:15-13:00 Session 150B: 9 short presentations of 5 minutes each. (KR)
Location: EI, EI 9
12:15
First-Order Default Logic Revisited
SPEAKER: Yi Zhou

ABSTRACT. It is well known that Reiter's original proposal for default logic in the first-order case is problematic because of Skolemization. This paper reconsiders this long-standing open problem, and proposes a new world view semantics for first-order default logic. Roughly speaking, a world view of a first-order default theory is a maximal collection of structures satisfying the default theory where the default part is fixed by the world view itself. We show how this semantics generalizes propositional/closed default logic, classical first-order logic and first-order answer set programming, and we argue that first-order default logic under the world view semantics provides a rich framework for integrating classical logic and rule-based formalisms in the first-order case.

12:20
Strong Equivalence of Non-monotonic Temporal Theories

ABSTRACT. Temporal Equilibrium Logic (TEL) is a non-monotonic temporal logic that extends Answer Set Programming (ASP) by introducing modal operators like those considered in Linear-time Temporal Logic (LTL). TEL allows proving temporal properties of ASP-like scenarios under the hypothesis of infinite time while keeping decidability. Formally, it extends Equilibrium Logic (the best-known logical formalisation of ASP) and, like the latter, it selects some models of a (temporal) monotonic basis: the logic of Temporal Here-and-There (THT). In this paper we solve a problem that remained unanswered for the last six years: we prove that equivalence in the logic of THT is not only a sufficient, but also a necessary condition for strong equivalence of two TEL theories. This result has both theoretical and practical consequences. First, it reinforces the need for THT as a suitable monotonic basis for TEL. Second, it has allowed constructing a tool, ABSTEM, that can be used to check different types of equivalence between two arbitrary temporal theories. More importantly, when the theories are not THT-equivalent, the system provides a context theory that makes them behave differently, together with a Büchi automaton showing the temporal stable models that arise from that difference.

12:25
Belief Revision in the Propositional Closure of a Qualitative Algebra
SPEAKER: unknown

ABSTRACT. Belief revision is an operation that aims at modifying old beliefs so that they become consistent with new ones. The issue of belief revision has been studied in various formalisms, in particular, in qualitative algebras (QAs) in which the result is a disjunction of belief bases that is not necessarily representable in a QA. This motivates the study of belief revision in formalisms extending QAs, namely, their propositional closures: in such a closure, the result of belief revision belongs to the formalism. Moreover, this makes it possible to define a contraction operator thanks to the Harper identity. Belief revision in the propositional closure of QAs is studied, an algorithm for a family of revision operators is designed, and an open-source implementation is made freely available on the web.

12:30
Minimal Change in AGM Revision for Non-classical Logics
SPEAKER: unknown

ABSTRACT. In this paper, we address the problem of applying AGM-style belief revision to non-classical logics. We discuss the idea of minimal change in revision and show that for non-classical logics, some sort of minimality postulate has to be explicitly introduced. We also present two constructions for revision which satisfy the AGM postulates and prove the representation theorems including minimality postulates. For each result, we point out the class of logics for which the theorem holds, giving some examples at the end.

12:35
Toward a Knowledge Level Analysis of Forgetting

ABSTRACT. Forgetting has been addressed in different areas of Knowledge Representation, including classical logic, logic programming, modal logic, and description logics. In this paper, we present an account of forgetting as an abstract belief change operator, independent of an underlying logic. We argue that forgetting amounts to a reduction in the language, specifically the signature, of a logic. The central definition is simple: the result of forgetting signature $\sigma$ in a theory is simply the set of logical consequences over the reduced language. This definition offers various advantages. It provides a uniform approach to forgetting, with a single definition applicable to any logic with a well-defined consequence relation. Results obtained are obviously applicable to all subsumed formal systems, and in fact many results are obtained much more straightforwardly. We argue that the perspective provided by the approach leads to simpler approaches to computing forgetting in specific logics. This view also leads to insights with respect to specific logics: forgetting in first-order logic is somewhat different from the accepted approach; the definition relativised to logic programs yields a new syntax-independent notion of forgetting; in modal logic the specification is simpler. Moreover, the obtained perspective clarifies the relation between forgetting and the belief change operation of contraction.
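
Rendered as a formula (our notation, directly transcribing the definition above): for a theory $T$ over a language with signature $\Sigma$, forgetting $\sigma \subseteq \Sigma$ yields $\mathrm{forget}(T, \sigma) = \{ \varphi \in \mathcal{L}(\Sigma \setminus \sigma) \mid T \models \varphi \}$, the set of logical consequences of $T$ expressible without the symbols in $\sigma$.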

12:40
An Abductive Reasoning Approach to the Belief-Bias Effect

ABSTRACT. The tendency to accept or reject arguments based on one's own beliefs or prior knowledge rather than on the reasoning process is called the belief bias. A psychological syllogistic reasoning task exhibits this phenomenon: participants are asked whether they accept or reject a given syllogism. By introducing abnormalities, abduction and background knowledge, we model this task under the weak completion semantics. Our formalization raises new questions about observations and their explanations, which might include relevant prior abductive contextual information concerning some side-effect. Inspection points, introduced by Pereira and Pinto, allow us to express these definitions syntactically and intertwine them into an operational semantics.

12:45
Tracking Beliefs and Intentions in the Werewolf Game

ABSTRACT. We propose a formalization framework for modeling how beliefs and intentions change over the course of a dialogue, in the case where the decisions taken during the dialogue affect the possibly conflicting goals of the agents involved. We illustrate our formalization with the game of Werewolf, as an example of such a domain.

We use the Situation Calculus to model the evolution of the world and an observation model to analyze the evolution of intentions and beliefs. In our formalization, utterances, which only change beliefs and intentions, are observations. Beliefs and intentions are modeled with Kripke-style accessibility relation predicates. We illustrate our model on instances of the game.

12:50
Axioms .2 and .4 as Interaction Axioms

ABSTRACT. In epistemic logic, some axioms dealing with the notion of knowledge are rather convoluted and difficult to give an intuitive interpretation, even though some of them, like .2 and .4, are considered by some epistemic logicians to be key axioms. We show that they can be characterized in terms of understandable interaction axioms relating knowledge and belief. To do so, we first present a theory dealing with the characterization of axioms in terms of interaction axioms in modal logic. We then apply the main results and methods of this theory to obtain our results related to epistemic logic.

12:55
Aggregative Deontic Detachment for Normative Reasoning
SPEAKER: unknown

ABSTRACT. In this paper, we provide foundations for deontic logic by defining an undemanding semantics based on the sole notion of detachment. We introduce System O, which handles the iteration of successive detachments in a more principled manner than the traditional systems do. This is achieved by injecting a new form of deontic detachment, called aggregative deontic detachment, which keeps track of previously detached obligations. Soundness and completeness of the proof system are established, and properties of the logic are discussed.

13:00-14:30 Lunch Break
14:30-16:00 Session 151A: Reasoning about Actions and Processes 1 (KR)
Location: EI, EI 7
14:30
Decidable Reasoning in a Fragment of the Epistemic Situation Calculus
SPEAKER: unknown

ABSTRACT. The situation calculus is a popular formalism for reasoning about actions and change. Since the language is first-order, reasoning in the situation calculus is undecidable in general. An important question then is how to weaken reasoning in a principled way to guarantee decidability. Existing approaches either drastically limit the representation of the action theory or neglect important aspects such as sensing. In this paper we propose a model of limited belief for the epistemic situation calculus, which allows very expressive knowledge bases and handles both physical and sensing actions. The model builds on an existing approach to limited belief in the static case. We show that the resulting form of limited reasoning is sound with respect to the original epistemic situation calculus and eventually complete for a large class of formulas. Moreover, reasoning is decidable.

15:00
Model Checking Unbounded Artifact-Centric Systems
SPEAKER: unknown

ABSTRACT. Artifact-centric systems are a recent paradigm used to represent and implement business processes. We present further results on the verification problem for artifact-centric systems specified by means of FO-CTL specifications. While the general problem is known to be undecidable, results in the literature prove decidability for artifact systems with infinite domains under boundedness and conditions such as uniformity. We here follow a different approach and investigate the general case under the infinite-domain assumption. We show decidability for the class of artifact-centric systems whose database schemas consist of a single unary relation, and we show that the problem is undecidable if artifact systems are defined using one binary relation or two unary relations.

15:30
State-Boundedness for Decidability of Verification in Data-Aware Dynamic Systems
SPEAKER: unknown

ABSTRACT. Verification of dynamic systems that manipulate data, stored in a database or ontology, has lately received increasing attention. A plethora of recent works has shown that verification of systems working over unboundedly many data is decidable even for very rich temporal properties, provided that the system is state-bounded. This condition requires the existence of an overall bound on the amount of data stored in each single state along the system evolution. In general, checking state-boundedness is undecidable. An open question is whether it is possible to isolate significant classes of dynamic systems for which state-boundedness is decidable. In this paper we provide a surprisingly strong negative answer, by resorting to a novel connection with variants of Petri nets. In particular, we show undecidability for systems whose data component contains unary relations only, and whose action component queries and updates such relations in a very limited way. To contrast this result, we propose interesting relaxations of the sufficient conditions proposed in the concrete setting of Data-Centric Dynamic Systems, building on recent results on chase termination for tuple-generating dependencies.

14:30-16:00 Session 151B: Automated Reasoning and Computation 2 (KR)
Location: EI, EI 9
14:30
Skolemization for Weighted First-Order Model Counting
SPEAKER: unknown

ABSTRACT. First-order model counting recently emerged as a novel reasoning task, at the core of efficient algorithms for probabilistic logics. We present a Skolemization algorithm for model counting problems that eliminates existential quantifiers from a first-order logic theory without changing its weighted model count. For subsets of first-order logic, lifted model counters were shown to run in time polynomial in the number of objects in the domain of discourse, where propositional model counters require exponential time. However, these guarantees only apply to Skolem normal form theories, and the presence of existential quantifiers reduces lifted model counters to propositional ones. Since textbook Skolemization is not sound for model counting, these restrictions precluded efficient model counting for directed models such as probabilistic logic programs. Our Skolemization procedure extends the applicability of first-order model counters to these representations. Moreover, it simplifies the design of lifted model counting algorithms.

15:00
Analyzing the Computational Complexity of Abstract Dialectical Frameworks via Approximation Fixpoint Theory

ABSTRACT. Abstract dialectical frameworks (ADFs) have recently been proposed as a versatile generalization of Dung's abstract argumentation frameworks (AFs). In this paper, we present a comprehensive analysis of the computational complexity of ADFs. Our results show that while ADFs are one level up in the polynomial hierarchy compared to AFs, there is a useful subclass of ADFs which is as complex as AFs while arguably offering more modeling capacities. As a technical vehicle, we employ the approximation fixpoint theory of Denecker, Marek and Truszczyński, thus showing that it is also a useful tool for complexity analysis of operator-based semantics.

15:30
The Parameterized Complexity of Reasoning Problems Beyond NP
SPEAKER: unknown

ABSTRACT. Today's propositional satisfiability (SAT) solvers are extremely powerful and can be used as an efficient back-end for NP-complete problems. However, many fundamental problems in knowledge representation and reasoning are located at the second level of the Polynomial Hierarchy or even higher, and hence polynomial-time transformations to SAT are not possible, unless the hierarchy collapses. Recent research shows that in certain cases one can break through these complexity barriers by fixed-parameter tractable (fpt) reductions which exploit structural aspects of problem instances in terms of problem parameters.

In this paper we develop a general theoretical framework that supports the classification of parameterized problems on whether they admit such an fpt-reduction to SAT or not.

We instantiate our theory by classifying the complexities of several case study problems, with respect to various natural parameters. These case studies include the consistency problem for disjunctive answer set programming and a robust version of constraint satisfaction.

14:30-16:10 Session 151C: Bounds and Termination (CAV)
Location: FH, Hörsaal 1
14:30
A Simple and Scalable Static Analysis for Bound Analysis and Amortized Complexity Analysis
SPEAKER: Moritz Sinn

ABSTRACT. We present the first scalable bound analysis that achieves amortized complexity analysis. In contrast to earlier work, our bound analysis is not based on general-purpose reasoners such as abstract interpreters, software model checkers or computer algebra tools. Rather, we derive bounds directly from abstract program models, which we obtain from programs by comparatively simple invariant generation and symbolic execution techniques. As a result, we obtain an analysis that is more predictable and more scalable than earlier approaches. Our experiments demonstrate that our analysis is fast and at the same time able to compute bounds for challenging loops in a large real-world benchmark.

Technically, our approach is based on lossy vector addition systems (VASS). Our bound analysis first computes a lexicographic ranking function that proves the termination of a VASS, and then derives a bound from this ranking function. Our methodology achieves amortized analysis based on a new insight into how lexicographic ranking functions can be used for bound analysis.
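
The following toy example (ours, not one of the paper's benchmarks) shows the kind of amortized reasoning at stake: the inner loop runs at most x0 + y0 times in total, because each outer iteration adds only one unit to y, whereas multiplying a per-visit bound on the inner loop by the number of outer iterations would give a quadratic bound.

import random

def amortized(x, y):                 # assume x, y >= 0 on entry
    inner_total = 0
    while x > 0:
        x -= 1
        y += 1                       # each outer step funds one inner step
        while y > 0 and random.random() < 0.5:   # nondeterministic inner loop
            y -= 1
            inner_total += 1
    return inner_total               # always <= initial x + initial y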

14:50
Symbolic Resource Bound Inference for Functional Programs

ABSTRACT. We present an approach for inferring symbolic resource bounds for purely functional programs consisting of recursive functions, algebraic data types and nonlinear arithmetic operations. In our approach, the developer specifies the desired shape of the bound as a program expression containing numerical holes, which we refer to as templates. For example, time ≤ a ∗ size(tree) + b, where a, b are unknowns, is a template that specifies a bound on the execution time. We present a scalable algorithm for computing strongest bounds for sequential and parallel execution times by solving for the unknowns in the template. We empirically evaluate our approach on several benchmarks that manipulate complex data structures such as binomial heap, leftist heap, red-black tree and AVL tree. Our implementation is able to infer hard, nonlinear symbolic time bounds for our benchmarks that are beyond the capability of existing approaches.

15:10
Proving Non-termination Using Max-SMT
SPEAKER: unknown

ABSTRACT. We show how Max-SMT-based invariant generation can be exploited for proving non-termination of programs. The construction of the proof of non-termination is guided by the generation of quasi-invariants -- properties such that if they hold at a location during execution once, then they will continue to hold at that location from then onwards. The check that quasi-invariants can indeed be reached is then performed separately. Our technique considers strongly connected subgraphs of a program's control flow graph for analysis and thus produces more generic witnesses of non-termination than existing methods. Moreover, it can handle programs with unbounded non-determinism and is more likely to converge than previous approaches.
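
A minimal illustration of a quasi-invariant (our example): in the Python loop below, Q = (x >= 0) is reachable at the loop head, and once it holds there it holds forever, since the body only increases x; because Q implies the loop guard, the program cannot terminate.

x = 0            # Q: x >= 0 holds when the loop head is first reached
while x >= 0:    # Q implies the guard, so the loop can never be exited
    x = x + 1    # the body preserves Q; hence non-termination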

15:30
Termination Analysis by Learning Terminating Programs

ABSTRACT. We present a novel approach to termination analysis. In a first step, the analysis uses a program as a black-box which exhibits only a finite set of sample traces. Each sample trace is infinite but can be represented by a finite lasso. The analysis can "learn" a program from a termination proof for the lasso, a program that is terminating by construction. In a second step, the analysis checks that the set of sample traces is representative in a sense that we can make formal. An experimental evaluation indicates that the approach is a potentially useful addition to the portfolio of existing approaches to termination analysis.

15:50
Causal Termination of Multi-threaded Programs

ABSTRACT. We present a new model checking procedure for the termination analysis of multi-threaded programs. Current termination provers scale badly in the number of threads; our new approach easily handles 100 threads on multi-threaded benchmarks like Producer-Consumer. In our procedure, we characterize the existence of non-terminating executions as Mazurkiewicz-style concurrent traces and apply causality-based transformation rules to refine them until a contradiction can be shown. The termination proof is organized into a tableau, where the case splits represent a novel type of modular reasoning according to different causal explanations of a hypothetical error. We report on experimental results obtained with a tool implementation of the new procedure, called Arctor, on previously intractable multi-threaded benchmarks.

14:30-16:00 Session 151D: Invited Talk / Technical Session - Program Analysis (ICLP)
Location: FH, Hörsaal 8
14:30
(Quantified) Horn Constraint Solving for Program Verification and Synthesis
15:30
Resource Usage Analysis of Logic Programs via Abstract Interpretation Using Sized Types
SPEAKER: unknown

ABSTRACT. We present a novel general resource analysis for logic programs based on sized types. Sized types are representations that incorporate structural (shape) information and allow expressing both lower and upper bounds on the size of a set of terms and their subterms at any position and depth. They also allow relating the sizes of terms and subterms occurring at different argument positions in logic predicates. Using these sized types, the resource analysis can infer both lower and upper bounds on the resources used by all the procedures in a program as functions on input term (and subterm) sizes, overcoming limitations of existing resource analyses and enhancing their precision. Our new resource analysis has been developed within the abstract interpretation framework, as an extension of the sized types abstract domain, and has been integrated into the Ciao preprocessor, CiaoPP. The abstract domain operations are integrated with the setting up and solving of recurrence equations for inferring both size and resource usage functions. We show that the analysis is an improvement over the previous resource analysis present in CiaoPP and compares well in power to state-of-the-art systems.
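
A toy rendering of the size/cost recurrences that sized types make possible (the actual analysis runs inside CiaoPP over logic programs; Python stands in for Prolog here):

    # append/3 in Prolog:
    #   append([], Ys, Ys).
    #   append([X|Xs], Ys, [X|Zs]) :- append(Xs, Ys, Zs).
    # sized-type style bounds: size(out) = size(xs) + size(ys)
    # cost recurrence: C(n) = C(n-1) + 1, C(0) = 1  =>  C(n) = n + 1
    def append(xs, ys, cost):
        cost[0] += 1                        # one resolution step
        if not xs:
            return ys
        return [xs[0]] + append(xs[1:], ys, cost)

    for n in range(6):
        cost = [0]
        out = append(list(range(n)), [9, 9], cost)
        assert len(out) == n + 2            # size bound (here exact)
        assert cost[0] == n + 1             # closed form of the recurrence
    print("size and cost functions verified on samples")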

14:30-16:00 Session 151E: Description Logic (IJCAR)
Location: FH, Hörsaal 5
14:30
Count and Forget: Uniform Interpolation of SHQ-Ontologies
SPEAKER: unknown

ABSTRACT. We propose a method for forgetting concept symbols and non-transitive role symbols of SHQ-ontologies, or for computing uniform interpolants in SHQ. A uniform interpolant of an ontology over a specified set of symbols is a new ontology that only uses the specified symbols, and preserves all logical entailments that can be expressed in SHQ using these symbols. Uniform interpolation has applications in ontology reuse, information hiding and ontology analysis, but so far no method for computing uniform interpolants for expressive description logics with number restrictions has been developed. Our results are interesting not only because they make it possible to compute uniform interpolants of ontologies in a more expressive language. Using number restrictions also makes it possible to preserve more information in uniform interpolants of ontologies in less complex logics, such as ALC or EL. The presented method computes uniform interpolants on the basis of a new resolution calculus for SHQ-ontologies that derives consequences in a goal-oriented manner. The output of our method is expressed in SHQμ, which is SHQ extended with fixpoint operators, to always enable a finite representation of the uniform interpolant.
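
For intuition, forgetting in the atomic-inclusion fragment reduces to composing inclusions through the forgotten symbol; a hedged toy follows (full SHQ forgetting, with number restrictions and possible fixpoints, is far more involved):

    # Toy forgetting for atomic concept inclusions only.
    def forget(tbox, symbol):
        """Eliminate `symbol` from inclusions A <= B by composing through it."""
        into = [a for (a, b) in tbox if b == symbol and a != symbol]
        outof = [b for (a, b) in tbox if a == symbol and b != symbol]
        kept = [(a, b) for (a, b) in tbox if symbol not in (a, b)]
        # every entailment A <= symbol <= B is preserved as A <= B
        return sorted(set(kept + [(a, b) for a in into for b in outof]))

    tbox = [("Dog", "Mammal"), ("Mammal", "Animal"), ("Cat", "Mammal")]
    print(forget(tbox, "Mammal"))
    # -> [('Cat', 'Animal'), ('Dog', 'Animal')]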

15:00
Coupling Tableau Algorithms for Expressive Description Logics with Completion-Based Saturation Procedures

ABSTRACT. Nowadays, saturation-based reasoners for the OWL EL profile are able to handle large ontologies such as SNOMED very efficiently. However, saturation-based reasoning procedures become incomplete if the ontology is extended with axioms that use features of more expressive Description Logics, e.g., disjunctions. Tableau-based procedures, on the other hand, are not limited to a specific OWL profile, but even highly optimised reasoners might not be efficient enough to handle large ontologies such as SNOMED. In this paper, we present an approach for tightly coupling tableau- and saturation-based procedures that we implement in the OWL DL reasoner Konclude. Our detailed evaluation shows that this combination significantly improves the reasoning performance on a wide range of ontologies.

15:30
EL-ifying Ontologies
SPEAKER: unknown

ABSTRACT. The OWL 2 EL profile is a fragment of the ontology language OWL 2 for which standard reasoning tasks are feasible in polynomial time. Many OWL ontologies, however, fall outside the EL profile, and hence profile-specific reasoners are not applicable. Yet such ontologies typically contain only a small number of axioms outside EL, which may have little or no influence on reasoning outcomes. In this paper, we investigate techniques for rewriting non-EL axioms into EL that preserve the outcomes of classification and instance queries and that are applicable to the DL SHOIQ. Furthermore, for Horn ontologies, we devise a technique inspired by EL-reasoning that can be used to reduce the size of pre-models in (hyper)tableau reasoning. We have implemented and extensively tested our techniques, with very encouraging results.

14:30-15:30 Session 151F: Invited Talk (CSF)
Location: FH, Hörsaal 6
14:30
Privacy in the Age of Augmented Reality

ABSTRACT. I will present the results of a series of studies connecting research on privacy, behavioural economics, and online social networks. First, I will discuss behavioural experiments aimed at investigating consumers' privacy valuations and the adequacy of "notice and consent" mechanisms for privacy protection. Then, I will discuss security and privacy challenges in the context of social media. In particular, I will focus on the feasibility of combining publicly available Web 2.0 data with off-the-shelf face recognition software for the purpose of large-scale, automated individual re-identification. I will discuss the implications of the inevitable convergence of face recognition technology and increasing online self-disclosures, the emergence of "personally predictable" information, and the future of privacy in an "augmented" reality world in which online and offline data will seamlessly blend.

14:30-16:00 Session 151G (GeTFun)
Location: FH, Hörsaal 4
14:30
Non truth-functional bivalent logics extending classical bivalent logics
15:00
A survey on Suszko’s Thesis and its formal developments

ABSTRACT. Many-valued logics have faced criticism since their birth. Among their main opponents was the Polish logician Roman Suszko. In the mid-70s, at a conference held in Kraków, he called into question the very nature of many-valuedness by claiming the existence of "but two truth-values", a statement nowadays recognized as Suszko's Thesis. Suszko's motivation lies in the double semantic role played by truth-values, a duality he reveals by drawing a distinction between algebraic and logical values. According to him, even though we can have more than two algebraic values, there are only two genuine logical ones: truth and falsehood. The philosophical content of his ideas finds support in a technical result called Suszko's Reduction, a theorem showing that every Tarskian logic may be characterized by a bivalent semantics. Several interesting lines of investigation have arisen from Suszko's ideas. It is the purpose of the present talk to survey the main developments in the area. Our exposition will show not only the main results that corroborate Suszko's ideas, but also those that pose limitations to them and have led to discussions about their meaning.
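
Suszko's Reduction in miniature (our illustration): collapsing 3-valued Łukasiewicz valuations into bivaluations b(φ) = "v(φ) is designated" yields a two-valued semantics for the same consequence relation, at the price of truth-functionality.

    # Three Lukasiewicz values 0, 1/2, 1 with negation neg(x) = 1 - x and
    # designated set D = {1}. Each valuation v induces a bivaluation
    # b(phi) = [v(phi) in D]; the output shows b is not truth-functional:
    # b(p) = False arises from two valuations that disagree on b(~p).
    from fractions import Fraction

    D = {Fraction(1)}
    neg = lambda x: 1 - x

    for v_p in (Fraction(0), Fraction(1, 2), Fraction(1)):
        print(f"v(p) = {v_p}:  b(p) = {v_p in D},  b(~p) = {neg(v_p) in D}")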

15:30
Effective first-order reasoning about incomplete and inconsistent information
SPEAKER: Anna Zamansky
15:30-16:00 Session 152: Privacy 1 (CSF)
Location: FH, Hörsaal 6
15:30
Balancing Societal Security and Individual Privacy: Accountable Escrow System
SPEAKER: unknown

ABSTRACT. Privacy is a core human need, but society sometimes needs to conduct targeted, proportionate investigations in order to provide security. To reconcile individual privacy and societal security, we explore whether we can have surveillance in a form that is verifiably accountable to citizens. This means that citizens get verifiable proofs of the quantity and nature of the surveillance that actually takes place. In our scheme, governments are held accountable for the extent to which they exercise their surveillance power, and political parties can pledge in election campaigns their intentions to reduce (or increase) this figure.

We propose a general idea of "accountable escrow" and motivate this idea as an approach to reconciling and balancing the requirements of individual privacy and societal security. We design a balanced cryptosystem for asynchronous communication (e.g., email). We propose a novel method for escrowing the decryption capability in public-key cryptography. A government can decrypt it in order to conduct targeted surveillance, but doing so necessarily puts records in a public log against which the government is held accountable.

16:00-16:30Coffee Break
16:30-18:00 Session 153A: Technical Session - Answer Set Programming (ICLP)
Location: FH, Hörsaal 8
16:30
Dynamic Consistency Checking in Goal-Directed Answer Set Programming
SPEAKER: unknown

ABSTRACT. One obstacle to goal-directed execution of answer set programs is that a valid answer set must satisfy the constraints imposed by every rule containing an odd loop over negation, even when those rules have no relation to a particular query. In this paper, we introduce a technique for dynamic consistency checking, under which only those constraints that are relevant to the partial answer set are tested. Nevertheless, the algorithm guarantees that, if a program has at least one consistent answer set, any partial answer set returned will be a subset of some consistent answer set.
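
The relevance idea in a hedged, stripped-down form (the paper's algorithm is integrated into goal-directed ASP execution): constraints are only tested once they share an atom with the partial answer set.

    constraints = [
        {"p", "q"},        # :- p, q.   (violated if both atoms are derived)
        {"r", "s"},        # :- r, s.
    ]

    def relevant(constraint, partial):
        return bool(constraint & partial)

    def consistent(partial):
        # only relevant constraints are checked; a fully contained
        # constraint body signals a violation
        return all(not c <= partial
                   for c in constraints if relevant(c, partial))

    partial = set()
    for atom in ["p", "r", "q"]:          # goal-directed derivation order
        partial.add(atom)
        if not consistent(partial):
            print("backtrack: adding", atom, "violates a relevant constraint")
            partial.remove(atom)
    print("partial answer set:", sorted(partial))   # -> ['p', 'r']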

17:00
Anytime Computation of Cautious Consequences in Answer Set Programming
SPEAKER: unknown

ABSTRACT. Query answering in Answer Set Programming is usually solved by computing (a subset of) the cautious consequences of a logic program. This task is computationally very hard, and there are programs for which computing cautious consequences is not viable in reasonable time. However, current ASP solvers produce the (whole) set of cautious consequences only at the end of their computation. This paper reports on strategies for computing cautious consequences, also introducing anytime algorithms able to produce sound answers during the computation.
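
A brute-force picture of the anytime behaviour (the paper's algorithms work inside the solver and also emit sound answers early): the running intersection of enumerated answer sets is a shrinking over-approximation of the cautious consequences.

    def answer_sets():                    # stand-in for an ASP solver
        yield {"a", "b", "c"}
        yield {"a", "c", "d"}
        yield {"a", "c"}

    upper = None
    for s in answer_sets():
        # intersecting answer sets can only remove candidate atoms,
        # so `upper` is exact once the enumeration finishes
        upper = set(s) if upper is None else upper & s
        print("over-approximation:", sorted(upper))
    print("cautious consequences:", sorted(upper))  # -> ['a', 'c']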

17:30
Efficient Computation of the Well-Founded Semantics over Big Data

ABSTRACT. Data originating from the Web, sensor readings and social media result in increasingly huge datasets. This so-called Big Data comes with new scientific and technological challenges while creating new opportunities, hence the increasing interest in academia and industry. Traditionally, logic programming has focused on complex knowledge structures/programs, so the question arises whether and how it can work in the face of Big Data. In this paper, we examine how the well-founded semantics can process huge amounts of data through mass parallelization. More specifically, we propose and evaluate a parallel approach using the MapReduce framework. Our experimental results indicate that our approach is scalable and that well-founded semantics can be applied to billions of facts. To the best of our knowledge, this is the first work that addresses large-scale nonmonotonic reasoning without the restriction of stratification for predicates of arbitrary arity.
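
A sequential sketch of the alternating-fixpoint computation of the well-founded semantics for a ground program (the paper's contribution is performing this kind of computation in parallel with MapReduce; this toy is ours):

    # rule: (head, positive_body, negative_body)
    program = [
        ("p", [], ["q"]),        # p :- not q.
        ("q", [], ["p"]),        # q :- not p.
        ("r", [], []),           # r.
        ("s", ["r"], ["t"]),     # s :- r, not t.
    ]
    atoms = {a for (h, pb, nb) in program for a in [h] + pb + nb}

    def gamma(s):
        """Least model of the reduct of `program` w.r.t. the set `s`."""
        model, changed = set(), True
        while changed:
            changed = False
            for h, pb, nb in program:
                if h not in model and set(pb) <= model and not (set(nb) & s):
                    model.add(h)
                    changed = True
        return model

    true, prev = set(), None
    while true != prev:                  # iterate Gamma^2 from the empty set
        prev = true
        true = gamma(gamma(true))
    possible = gamma(true)               # atoms that are not false
    print("true:     ", sorted(true))                 # ['r', 's']
    print("undefined:", sorted(possible - true))      # ['p', 'q']
    print("false:    ", sorted(atoms - possible))     # ['t']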

16:30-18:10 Session 153B: Knowledge Representation & Reasoning (IJCAR)
Location: FH, Hörsaal 5
16:30
The Bayesian Description Logic BEL

ABSTRACT. We introduce the probabilistic Description Logic BEL. In BEL, axioms are required to hold only in an associated context. The probabilistic component of the logic is given by a Bayesian network (BN) that describes the joint probability distribution of the contexts. We study the main reasoning problems in this logic; in particular, we (i) prove that deciding positive and almost-sure entailments is not harder for BEL than for the BN, and (ii) show how to compute the probability of a consequence and its most likely context.
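
A brute-force illustration of BEL-style probabilistic entailment (ours; the paper's algorithms avoid this exponential enumeration): sum the probabilities of the BN worlds in which the active axioms entail the query.

    from itertools import product

    # BN over binary variables x, y with y depending on x.
    p_x = 0.7                                  # P(x = 1)
    p_y_given_x = {1: 0.9, 0: 0.2}             # P(y = 1 | x)

    def world_prob(x, y):
        px = p_x if x else 1 - p_x
        py = p_y_given_x[x] if y else 1 - p_y_given_x[x]
        return px * py

    # Contextual axioms: (subclass, superclass, required BN assignment).
    axioms = [("A", "B", {"x": 1}), ("B", "C", {"y": 1})]

    def entails(active, query):                # chase A <= B chains
        frontier, goal = {query[0]}, query[1]
        changed = True
        while changed:
            changed = False
            for a, b, _ in active:
                if a in frontier and b not in frontier:
                    frontier.add(b)
                    changed = True
        return goal in frontier

    query = ("A", "C")
    prob = sum(world_prob(x, y)
               for x, y in product([0, 1], repeat=2)
               if entails([ax for ax in axioms
                           if all(dict(zip("xy", (x, y)))[v] == val
                                  for v, val in ax[2].items())], query))
    print("P(A <= C) =", prob)                 # = P(x=1, y=1) = 0.63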

17:00
Otter proofs of theorems in Tarskian geometry
SPEAKER: unknown

ABSTRACT. We report on a project to use Otter to find proofs of the theorems in Tarskian geometry proved in Szmielew's part (Part I) of the book by Schwabhäuser, Szmielew, and Tarski. These theorems start with fundamental properties of betweenness, and end with the development of geometric definitions of addition and multiplication that permit the representation of models of geometry as planes over Euclidean fields, or over real-closed fields in the case of full continuity. They include the four challenge problems left unsolved by Quaife, who two decades ago found some Otter proofs in Tarskian geometry. Our methodology was to present each theorem to Otter as an isolated problem, giving it the (negated) theorem as a goal, and the previous theorems (and the axioms) to use. In practice, we had to coax Otter a bit more.

Quaife's four challenge problems were: every line segment has a midpoint; every segment is the base of some isosceles triangle; the outer Pasch axiom (assuming inner Pasch as an axiom); and the first outer connectivity property of betweenness. These are to be proved without any parallel axiom and without even line-circle continuity. These are difficult theorems, the first proofs of which were the heart of Gupta's Ph.D. thesis under Tarski. Otter proved them all in 2012. Our success, we argue, is due to improvements in techniques of automated deduction, rather than to increases in computer speed and memory.
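
In Tarski's two-primitive language (betweenness B and segment congruence ≡), the first challenge problem, the existence of midpoints, reads roughly as follows (our transcription):

    \[ \forall a \, \forall b \; \exists m \; \bigl( B(a, m, b) \wedge am \equiv mb \bigr) \]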

The theory of Hilbert (1899) can be translated into Tarski's language, interpreting lines as pairs of distinct points, and angles as ordered triples of non-collinear points. Under this interpretation, the axioms of Hilbert either occur among, or are easily deduced from, theorems in the first 11 (of 16) chapters of Szmielew. At the time of writing this abstract, we have proved all but the two axioms about angles, III.3 and III.4, and we will finish that soon. Narboux and Braun have recently checked these same proofs in Coq.

17:30
NESCOND: an Implementation of Nested Sequent Calculi for Conditional Logics
SPEAKER: unknown

ABSTRACT. In this paper we present NESCOND, a theorem prover for normal conditional logics. NESCOND implements some recently introduced NESted sequent calculi for propositional CONDitional logics CK and some of its significant extensions with axioms ID, MP and CEM. NESCOND also deals with the flat fragment (i.e., without nested conditionals) of the conditional logic CK+CSO+ID, which corresponds to the cumulative logic C introduced by Kraus, Lehmann and Magidor. NESCOND is inspired by the methodology of leanTAP and is implemented in Prolog. The paper also shows some experimental results, witnessing that the performance of NESCOND is promising. NESCOND is available at http://www.di.unito.it/~pozzato/nescond/

17:50
Knowledge Engineering for Large Ontologies with Sigma KEE 3.0
SPEAKER: unknown

ABSTRACT. The Suggested Upper Merged Ontology (SUMO) is a large, comprehensive ontology stated in higher-order logic. It has co-evolved with a development environment called the Sigma Knowledge Engineering Environment (SigmaKEE). SUMO has been applied to commercial applications in concept extraction and metadata verification. A large and important subset of SUMO can be expressed in first-order logic with equality. To support the knowledge engineer, SigmaKEE has, over time, integrated different reasoning systems, either via special modifications to the reasoner or in an ad-hoc manner that requires re-processing of the entire knowledge base for even trivial queries.

To overcome this problem, to create a simpler system configuration that is easier for users to install and manage, and to incorporate a state-of-the-art theorem prover, we have now integrated Sigma with the E theorem prover. The E distribution includes a simple server version that loads and indexes the full knowledge base, and supports interactive queries via a simple interface based on text streams. No special modifications to E were necessary for the integration, so SigmaKEE has an easy upgrade path to future versions.

16:30-18:30 Session 153C: Privacy 2 (CSF)
Location: FH, Hörsaal 6
16:30
TUC: Time-sensitive and Modular Analysis of Anonymous Communication

ABSTRACT. The anonymous communication protocol Tor constitutes the most widely deployed technology for providing anonymity for user communication over the Internet. Several frameworks have been proposed that show strong anonymity guarantees; none of these, however, are capable of modeling the class of traffic-related timing attacks against Tor, such as traffic correlation and website fingerprinting.

In this work, we present TUC: the first framework that allows for establishing strong anonymity guarantees in the presence of time-sensitive adversaries that mount traffic-related timing attacks. TUC incorporates a comprehensive notion of time in an asynchronous communication model with sequential activation, while offering strong compositionality properties for security proofs. We apply TUC to evaluate a novel countermeasure for Tor against website fingerprinting attacks. Our analysis relies on a formalization of the onion routing protocol that underlies Tor and proves rigorous anonymity guarantees in the presence of traffic-related timing attacks.

17:00
Differential Privacy: An Economic Method for Choosing Epsilon
SPEAKER: Justin Hsu

ABSTRACT. Differential privacy is becoming a gold standard for privacy research; it offers a guaranteed bound on loss of privacy due to release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of interesting problems.

However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parameter ϵ, which controls the crucial tradeoff between the strength of the privacy guarantee and the accuracy of the published results.

In this paper, we examine the role that these parameters play in concrete applications of differential privacy, identifying the key challenges that must be addressed when choosing specific values. This choice requires balancing the interests of two different parties: the data analyst and the prospective participant, who must decide whether to allow their data to be included in the analysis. We propose a simple model that expresses this balance as formulas over a handful of parameters. We illustrate the model with several scenarios, including a medical study and a study of movie ratings data. We also explore a surprising insight: in some circumstances, a differentially private study can be more accurate than a non-private study for the same cost, under our model. Finally, we discuss a number of simplifying assumptions in our model and outline a research agenda for possible refinements.
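
A numeric sketch of the kind of tradeoff such a model must capture (constants and formulas are ours, not the paper's): for a sensitivity-1 counting query, the Laplace mechanism's expected error is 1/ϵ, while a participant's expected harm can grow by at most a factor e^ϵ.

    import math

    baseline_harm = 1.0                    # participant's harm if not in study

    for eps in [0.01, 0.1, 0.5, 1.0, 2.0]:
        expected_error = 1.0 / eps         # Laplace(1/eps) has E|noise| = 1/eps
        worst_extra_harm = (math.exp(eps) - 1) * baseline_harm
        print(f"eps={eps:4}: error ~ {expected_error:7.1f} counts, "
              f"extra harm factor <= {worst_extra_harm:6.3f}")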

17:30
Proving differential privacy in Hoare logic
SPEAKER: unknown

ABSTRACT. Differential privacy is a rigorous, worst-case notion of privacy-preserving computation. Informally, a probabilistic program is differentially private if the participation of a single individual in the input database has a limited effect on the program's distribution on outputs. More technically, differential privacy is a quantitative 2-safety property that bounds the distance between the output distributions of a probabilistic program on adjacent inputs. Like many 2-safety properties, differential privacy lies outside the scope of traditional verification techniques. Existing approaches to enforce privacy are based on intricate, non-conventional type systems, or customized relational logics. These approaches are difficult to implement and often cumbersome to use.

We present an alternative approach that verifies differential privacy by standard, non-relational reasoning on non-probabilistic programs. Our approach is based on transforming a probabilistic program into a non-probabilistic program which simulates two executions of the original program. We prove that if the target program is correct with respect to a Hoare specification, then the original probabilistic program is differentially private. We provide a variety of examples from the differential privacy literature to demonstrate the utility of our approach. Finally, we compare our approach with existing verification techniques for privacy.
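
The flavour of the transformation, in a hedged toy (not the paper's exact construction): one non-probabilistic program simulates two executions on adjacent inputs, coupling each Laplace sample to the same outcome on both runs and charging the privacy budget for the coupling.

    class Pair:
        """Values of the two simulated executions."""
        def __init__(self, v1, v2):
            self.v1, self.v2 = v1, v2

    budget = [0.0]

    def lap_coupled(e, eps):
        """Couple Lap(1/eps) noise so both runs output the same value;
        this costs eps * |e.v1 - e.v2| of the privacy budget."""
        budget[0] += eps * abs(e.v1 - e.v2)
        out = e.v1                         # both runs agree on the output
        return Pair(out, out)

    def noisy_count(db_pair, eps):
        count = Pair(sum(db_pair.v1), sum(db_pair.v2))
        return lap_coupled(count, eps)

    adjacent = Pair([1, 0, 1], [1, 0, 0])  # databases differing in one record
    noisy_count(adjacent, eps=0.5)
    # Hoare-style postcondition to verify: budget <= eps * sensitivity.
    assert budget[0] <= 0.5 * 1
    print("privacy budget spent:", budget[0])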

18:00
Surveillance and Privacy
16:30-18:00 Session 153E (GeTFun)
Location: FH, Hörsaal 4
16:30
Separating truth and proof in the Logic of Proofs
SPEAKER: Roman Kuznets
16:40-17:30 Session 154: Abstraction (CAV)
Location: FH, Hörsaal 1
16:40
Counterexample to Induction-Guided Abstraction-Refinement (CTIGAR)
SPEAKER: unknown

ABSTRACT. Typical CEGAR-based verification methods refine the abstraction domain based on full traces. The finite-state model checking algorithm IC3 introduced the concept of discovering, generalizing from, and thereby eliminating individual state (or cube) counterexamples to induction (CTIs). This focus on individual states (or cubes) suggests a simpler abstraction-refinement scheme in which refinements are performed relative to single steps of the transition relation, thus reducing the expense of a refinement and eliminating the need for full traces. Interestingly, this change in refinement focus leads to a natural spectrum of refinement options, including when to refine and which type of concrete single-step query to refine relative to. Experiments validate that CTI-focused abstraction refinement, or CTIGAR, is competitive with existing CEGAR-based tools.

17:00
Unbounded Scalable Verification Based on Approximate Property-Directed Reachability and Datapath Abstraction
SPEAKER: Suho Lee

ABSTRACT. This paper introduces the Averroes formal verification system, which exploits the power of two complementary approaches: counterexample-guided abstraction refinement (CEGAR) of the design's datapath, and the recently introduced IC3 and PDR approximate reachability algorithms. Averroes is particularly suited to the class of hardware designs consisting of wide datapaths and complex control logic, a class that covers a wide spectrum of design styles ranging from general-purpose microprocessors to special-purpose embedded controllers and accelerators. In most of these designs, the number of datapath state variables is orders of magnitude larger than the number of control state variables. Thus, for purposes of verifying the correctness of the control logic (where most design errors typically reside), datapath abstraction is particularly effective at pruning away most of a design's state space, leaving a much reduced "control space" that can be efficiently explored by the IC3 and PDR methods. Preliminary experimental results on a suite of industrial benchmarks show that Averroes significantly outperforms verification at the bit level. To our knowledge, this is the first empirical demonstration of the possibility of automatic scalable unbounded sequential verification.

17:20
QUICr: A Reusable Library for Parametric Abstraction of Sets and Numbers
SPEAKER: Arlen Cox

ABSTRACT. This paper introduces QUICr, a library that implements abstract domains for simultaneous reasoning about numbers and sets of numbers. QUICr is an abstract domain combinator that lifts any domain for numbers into an efficient domain for numbers and sets of numbers. As a library, it is useful for inferring relational data invariants in programs that manipulate data structures. As a testbed, it allows easy extension of widening and join heuristics, enabling adaptations to new and varied applications. In this paper we present the architecture of the library, guidelines on how to select heuristics, and an example instantiation of the library using the NewPolka polyhedra library to verify set-manipulating programs.
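
The combinator shape in a toy form (QUICr's real domain tracks far richer relational constraints): lift a numeric domain, here intervals, to sets of numbers by abstracting a set's elements and lifting join pointwise.

    class Interval:
        def __init__(self, lo, hi): self.lo, self.hi = lo, hi
        @staticmethod
        def alpha(values):                       # abstract concrete numbers
            return Interval(min(values), max(values))
        def join(self, other):
            return Interval(min(self.lo, other.lo), max(self.hi, other.hi))
        def contains(self, v):
            return self.lo <= v <= self.hi
        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    class SetOfNumbers:
        """Lift a numeric domain to sets: abstract a set by the numeric
        domain applied to its elements (a real domain would also track
        cardinality and relational information)."""
        def __init__(self, content): self.content = content
        @staticmethod
        def alpha(s):
            return SetOfNumbers(Interval.alpha(s))
        def join(self, other):                   # pointwise lift of join
            return SetOfNumbers(self.content.join(other.content))
        def may_contain(self, v):
            return self.content.contains(v)

    a = SetOfNumbers.alpha({1, 3, 5})
    b = SetOfNumbers.alpha({4, 9})
    ab = a.join(b)
    print(ab.content)                            # [1, 9]
    print(ab.may_contain(7), ab.may_contain(12)) # True False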

08:45-10:15 Session 156: VSL Keynote Talk (VSL)
Location: EI, EI 7 + EI 9, EI 10 + FH, Hörsaal 1
08:45
VSL Keynote Talk: Verification of Computer Systems with Model Checking
SPEAKER: Edmund Clarke

ABSTRACT. Model Checking is an automatic verification technique for large state transition systems. The technique has been used successfully to debug complex computer hardware and communication protocols. Now, it is beginning to be used for complex hybrid (continuous/discrete) systems as well. The major disadvantage of the technique is a phenomenon called the State Explosion Problem. This problem is impossible to avoid in the worst case. However, by using sophisticated data structures and clever search algorithms, it is now possible to verify hybrid systems with astronomical numbers of states.

19:30-21:30 Session 158: KR Banquet (KR)
Location: Naturhistorisches Museum