
**Invited Presentation**

**Speaker:** Mark Wallace

**09:00 | Logic and Search**

**Speaker:** Mark Wallace

ABSTRACT. Until recently, every real-world optimisation problem was a major project. Indeed, even this year, in 2015, I have heard of two large Australian companies paying millions of dollars for multi-year projects on supply-chain solutions that are still not working properly. These are projects that could now be completed by a small team in a few months. Technology has moved on, but large organisations prefer to rely on COTS (Commercial Off The Shelf) software that has been proven in large commercial applications. Such software is mature, which means out of date. Adapting this “mature” software to meet the specific needs of a new customer is surprisingly challenging. Every change has potential ramifications throughout the whole system and takes months of testing. The impact on performance can be so severe that the actual customer requirement remains unsatisfied, but the costs still mount up. This paper reports on the changes in modelling and algorithms that now make it possible to solve industrial optimisation problems quickly and cheaply.

**10:30 | Gamifying Program Analysis**

**Speaker:** Martin Schäf

ABSTRACT. Abstract interpretation is a powerful tool in program verification. Several commercial and industrial-scale implementations of abstract interpretation have demonstrated that this approach can verify safety properties of real-world code. However, using abstract interpretation tools is not always simple. If no user-provided hints are available, the abstract interpretation engine may lose precision during widening and produce an overwhelming number of false alarms. Manually providing these hints, however, is time-consuming and often frustrating when re-running the analysis takes a long time. We present an algorithm for program verification that combines abstract interpretation, symbolic execution, and crowdsourcing. If verification fails, our procedure suggests likely invariants, or program patches, that provide helpful information to the verification engineer and make it easier to find the correct specification. By complementing machine learning with well-designed games, we enable program analysis to incorporate human insights that help improve its scalability and usability.

**10:55 | Using Program Synthesis for Program Analysis**

**Speaker:** unknown

ABSTRACT. In this paper, we identify a fragment of second-order logic with restricted quantification that is expressive enough to capture numerous static analysis problems (e.g. safety proving, bug finding, termination and non-termination proving, and superoptimisation). We call this fragment the synthesis fragment. Satisfiability of a formula in the synthesis fragment is decidable over finite domains; specifically, the decision problem is NEXPTIME-complete. If a formula in this fragment is satisfiable, a solution consists of a satisfying assignment from the second-order variables to functions over finite domains. To find these solutions concretely, we synthesise programs that compute the functions. Our program synthesis algorithm is complete for finite-state programs, i.e. every function over finite domains is computed by some *program* that we can synthesise. We can therefore use our synthesiser as a decision procedure for the synthesis fragment of second-order logic, which in turn allows us to use it as a powerful backend for many program analysis tasks. To show the tractability of our approach, we evaluate the program synthesiser on several static analysis problems.
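To make the underlying decision problem concrete: a minimal sketch, not the paper's synthesis algorithm, of deciding an exists-forall second-order formula over a finite domain by enumerating every candidate function. The domain and the property (a hypothetical toy example: f is a fixed-point-free involution) are assumptions for illustration; the point is that a function over a finite domain is just a finite table, i.e. a finite-state "program".

```python
from itertools import product

# Finite domain over which the second-order variable f ranges.
D = [0, 1, 2, 3]

def satisfies(f):
    # The first-order part: for all x, f(f(x)) == x and f(x) != x.
    # (A toy property chosen for illustration.)
    return all(f[f[x]] == x and f[x] != x for x in D)

def decide():
    # A function D -> D is a tuple of |D| values: a finite table.
    # Enumerating all tables decides "exists f. forall x. phi(f, x)".
    for table in product(D, repeat=len(D)):
        if satisfies(table):
            return table  # a witness function, returned as its table
    return None

witness = decide()
```

The returned table is itself the synthesised "program" computing the witness function, which is why a complete synthesiser for finite-state programs yields a decision procedure.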

**11:20 | Verification of Concurrent Programs Using Trace Abstraction Refinement**

**Speaker:** Franck Cassez

ABSTRACT. Verifying concurrent programs is notoriously hard due to the state explosion problem: 1) the data state space can be very large, as the variables can range over very large sets, and 2) the control state space is the Cartesian product of the control state spaces of the concurrent components and thus grows exponentially in the number of components. On the one hand, the most successful approaches to addressing the control state explosion problem are based on assume-guarantee reasoning or model checking coupled with partial order reduction. On the other hand, the most successful techniques for addressing the data state explosion problem in sequential program verification are based on the abstraction/refinement paradigm, which consists in refining an abstract over-approximation of a program via predicate refinement. In this paper, we show how to combine partial order reduction techniques with trace abstraction refinement. We apply our approach to standard benchmarks and show that it matches current state-of-the-art analysis techniques.

**14:00 | Synchronized Recursive Timed Automata**

**Speaker:** Yuya Uezato

ABSTRACT. We present synchronized recursive timed automata (SRTA), which extend timed automata with a stack. Each frame of the stack holds rational-valued clocks, and SRTA synchronously increase the values of all the clocks within the stack. Our main contribution is to show that the reachability problem for SRTA is ExpTime-complete. This decidability contrasts with the undecidability for recursive timed automata (RTA), introduced by Trivedi and Wojtczak, and by Benerecetti et al. Unlike in SRTA, in RTA the frames below the top are frozen during the computation at the top frame. Our decidability proof extends the region abstraction for dense timed pushdown automata (TPDA) of Abdulla et al. to accommodate both the diagonal constraints and the fractional constraints of SRTA. Since SRTA can be seen as an extension of TPDA with diagonal and fractional constraints, our result enlarges the decidable class of pushdown extensions of timed automata.

**14:25 | Decidability, Introduction Rules and Automata**

**Speaker:** unknown

ABSTRACT. We present a method to prove uniformly the decidability of provability in several well-known inference systems. This method generalizes both cut elimination and the construction of an automaton recognizing the provable propositions.

**14:50 | SAT-based Minimization of Deterministic ω-Automata**

**Speaker:** unknown

ABSTRACT. We describe a tool that inputs a deterministic ω-automaton with an arbitrary acceptance condition and synthesizes an equivalent ω-automaton with another arbitrary acceptance condition and a given number of states, if such an automaton exists. This tool, which relies on a SAT-based encoding of the problem, can be used to provide minimal ω-automata equivalent to given properties, for different acceptance conditions.
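An illustrative sketch of the search problem the tool solves, under simplifying assumptions: the abstract's tool uses a SAT encoding for deterministic ω-automata, whereas this toy version works on ordinary DFAs and replaces the SAT solver with brute-force enumeration. The example automaton and the "ends in a" language are hypothetical.

```python
from collections import deque
from itertools import product

ALPHABET = "ab"

def equivalent(A, B):
    # Exact DFA equivalence: BFS over reachable state pairs of the product.
    # A DFA is (num_states, transitions, accepting_set), start state 0,
    # with transitions[(state, letter)] -> state.
    seen = {(0, 0)}
    queue = deque([(0, 0)])
    while queue:
        p, q = queue.popleft()
        if (p in A[2]) != (q in B[2]):
            return False
        for c in ALPHABET:
            pair = (A[1][(p, c)], B[1][(q, c)])
            if pair not in seen:
                seen.add(pair)
                queue.append(pair)
    return True

def find_equivalent(A, k):
    # Search all k-state DFAs for one equivalent to A -- the role the
    # SAT solver plays in the tool, for a given number of states.
    keys = [(s, c) for s in range(k) for c in ALPHABET]
    for targets in product(range(k), repeat=len(keys)):
        trans = dict(zip(keys, targets))
        for bits in product([False, True], repeat=k):
            B = (k, trans, {s for s in range(k) if bits[s]})
            if equivalent(A, B):
                return B
    return None

# A 3-state DFA for "strings over {a,b} ending in a"; states 1 and 2 are
# interchangeable, so a 2-state equivalent exists but a 1-state one does not.
A = (3,
     {(0, "a"): 1, (0, "b"): 0,
      (1, "a"): 2, (1, "b"): 0,
      (2, "a"): 1, (2, "b"): 0},
     {1, 2})

small = find_equivalent(A, 2)
```

The SAT encoding replaces this exponential enumeration with propositional variables for the transition table and acceptance marks, letting a solver find the witness automaton or prove none exists for the given state count.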

**15:15 | Symbolic WS1S**

**Speaker:** Margus Veanes

ABSTRACT. We extend weak monadic second-order logic of one successor (WS1S) to symbolic alphabets by allowing character predicates to range over decidable first-order theories and not just finite alphabets. We call this extension symbolic WS1S (s-WS1S). We then propose two decision procedures for this logic: 1) we use symbolic automata to extend the classic reduction from WS1S to finite automata to our symbolic setting; 2) we show that every s-WS1S formula can be reduced to a WS1S formula that preserves satisfiability, at the price of an exponential blow-up.
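A toy sketch of the symbolic-alphabet idea (not the paper's decision procedure): in a symbolic automaton, transitions are labelled by predicates over an effectively infinite alphabet, here the integers, rather than by concrete letters. The example automaton below, which accepts exactly the sequences containing a negative number, is a hypothetical illustration.

```python
# state -> list of (predicate, next_state). The predicates leaving each
# state are disjoint and exhaustive, so the automaton is deterministic
# even though the alphabet (all integers) is infinite.
TRANSITIONS = {
    0: [(lambda x: x < 0, 1), (lambda x: x >= 0, 0)],
    1: [(lambda x: True, 1)],
}
ACCEPTING = {1}

def accepts(word):
    # Run the symbolic automaton: at each letter, follow the unique
    # transition whose predicate holds.
    state = 0
    for letter in word:
        for pred, target in TRANSITIONS[state]:
            if pred(letter):
                state = target
                break
    return state in ACCEPTING
```

The classic WS1S-to-automata reduction manipulates such automata with alphabet letters replaced by predicate labels, which is what lets the character predicates range over a decidable first-order theory.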

**16:00 | Normalization by Completeness with Heyting Algebras**

**Speaker:** Olivier Hermant

ABSTRACT. The usual normalization by evaluation techniques have a strong relationship with completeness with respect to Kripke structures. But Kripke structures are not the only semantics that fits intuitionistic logic: Heyting algebras are a more algebraic alternative. In this paper, we focus on this less investigated area: how completeness with respect to Heyting algebras generates a normalization algorithm for a natural deduction calculus, in the propositional fragment. Our main contributions are that we prove in a direct way the completeness of natural deduction with respect to Heyting algebras, that the underlying algorithm natively deals with disjunction, that we have formalized those proofs in Coq, and that we give an extracted algorithm.

**16:25 | On Subexponentials, Synthetic Connectives, and Multi-Level Delimited Control**

**Speaker:** unknown

ABSTRACT. We construct a partially ordered hierarchy of delimited control operators similar to those of the CPS hierarchy of Danvy and Filinski [1990]. However, instead of relying on nested CPS translations, these operators are directly interpreted in linear logic extended with subexponentials (i.e., multiple pairs of ! and ?). We construct an independent proof theory for a fragment of this logic based on the principle of focusing. It is then shown that the new constraints placed on the permutation of cuts correspond to multiple levels of delimited control.

**16:50 | An adequate compositional encoding of bigraph structure in linear logic with subexponentials**

**Speaker:** unknown

ABSTRACT. In linear logic, formulas can be split into two sets: classical (those that can be used as many times as necessary) and linear (those that are consumed and no longer available after being used). Subexponentials generalize this notion by allowing the formulas to be split into many sets, each of which can then be specified to be classical or linear. This flexibility increases the logic's expressiveness: we already have adequate encodings of a number of other proof systems, and of computational models such as concurrent constraint programming, in linear logic with subexponentials (SEL). Bigraphs were proposed by Milner in 2001 as a model for ubiquitous computing, subsuming models of computation such as CCS and the π-calculus and capable of modeling connectivity and locality at the same time. In this work we present an encoding of the bigraph structure in SEL, thus giving an indication of the expressive power of this logic and at the same time providing a framework for reasoning about and operating on bigraphs. Our encoding is adequate, and therefore the operations of composition and juxtaposition can be performed on the logical level. Moreover, all the proof-theoretical tools of SEL become available for querying and proving properties of bigraph structures.

**17:15 | A Lightweight Double-negation Translation**

**Speaker:** Frederic Gilbert

ABSTRACT. Deciding whether a classical theorem can be proved constructively is a well-known undecidable problem. As a consequence, any computable double-negation translation inserts some unnecessary double negations. This paper shows that most of these unnecessary insertions can be avoided without any use of constructive proof search techniques. For this purpose, we restrict the analysis to syntax-directed double-negation translations, which translate a proposition through a single traversal and include most of the usual translations, such as Kolmogorov's, Gödel-Gentzen's, and Kuroda's. A partial order among translations is presented to select translations avoiding as many double negations as possible. This order admits a unique minimal syntax-directed translation with noticeable properties.
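As a concrete example of a syntax-directed translation of the kind the abstract considers, a minimal sketch of the Gödel-Gentzen negative translation, restricted to the propositional fragment and assuming a hypothetical tuple-based formula representation: it doubly negates atoms and replaces disjunctions by their De Morgan duals in a single traversal.

```python
# Formulas as nested tuples: ('atom', name), ('not', a),
# and ('and'|'or'|'imp', a, b).

def neg(a):
    return ('not', a)

def godel_gentzen(f):
    # One traversal of the formula: syntax-directed, no proof search.
    tag = f[0]
    if tag == 'atom':
        return neg(neg(f))          # atoms receive a double negation
    if tag == 'not':
        return neg(godel_gentzen(f[1]))
    if tag == 'or':                 # A or B  ~>  not(not A* and not B*)
        return neg(('and',
                    neg(godel_gentzen(f[1])),
                    neg(godel_gentzen(f[2]))))
    if tag in ('and', 'imp'):       # conjunction and implication commute
        return (tag, godel_gentzen(f[1]), godel_gentzen(f[2]))
    raise ValueError(f)

# Example: translating the excluded middle, p or not p.
excluded_middle = ('or', ('atom', 'p'), ('not', ('atom', 'p')))
translated = godel_gentzen(excluded_middle)
```

Already on this example one can see the "unnecessary" insertions the paper aims to avoid: the translation blindly doubly negates every atom and disjunction, whether or not the subformula needed it.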

**Opening Reception**