Experiences with thread-modular static analysis of concurrent embedded systems
09:00  Title TBA  Invited talk ABSTRACT. Abstract TBA 
10:00  SPEAKER: Tim Lyon ABSTRACT. We begin with display and labelled calculi for a certain class of tense logics. These tense logics are extensions of Kt with any finite number of general path axioms, which are axioms of the form Xp > Yp, where X and Y represent sequences of black and white diamonds. For the minimal tense logic Kt, I provide a bidirectional embedding between display calculus proofs and labelled calculus proofs. It is then shown how to translate display proofs for Kt extended with general path axioms into labelled proofs. Providing a reverse translation (translating labelled proofs into display proofs for Kt extended with general path axioms) has proven to be much more difficult, because certain labelled structural rules do not correspond well to the display variants of such rules. Nevertheless, for a certain subclass of general path axioms a translation from labelled calculus proofs to display calculus proofs can be provided. Through the addition of special rules, referred to as "propagation rules," we can prove that the problematic structural rules are admissible in the resulting calculus. Every proof of a formula in this calculus contains sequents whose structure corresponds to a tree with two types of edges. This specific structure allows us to translate these proofs into display proofs, which answers the open question of translating labelled proofs to display proofs for extensions of the minimal tense logic Kt. 
09:00  The CakeML Verified Compiler and Toolchain 
10:00  SPEAKER: Michael Rawson ABSTRACT. Automated theorem provers are often used in other tools as black boxes for discharging proof obligations. One example where this has enjoyed a lot of success is within interactive theorem proving. A key usability factor in such settings is the time taken for the provers to complete. Automated theorem provers run lists of proof strategies sequentially, which can cause proofs to be found more slowly than necessary. We show that it is possible to predict which strategies are likely to succeed while they are running, using an artificial neural network. We also implement a runtime strategy scheduler in the Vampire theorem prover which improves average proof search time, and hence increases usability as a black-box solver. 
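The scheduling idea in the abstract above can be sketched as follows. This is only an illustration, not Vampire's implementation: the feature set, the strategy names, and the scoring function `predict_success` (a toy heuristic standing in for the paper's neural network) are all invented for the example.

```python
# Illustrative sketch: re-rank the remaining proof strategies by a predicted
# probability of success, so promising strategies get run (or kept running)
# first. All names and the heuristic below are hypothetical.

def predict_success(features):
    """Stand-in for a trained model: returns a score in [0, 1]."""
    running_time, clauses_generated = features
    # Toy heuristic: strategies generating many clauses quickly look promising.
    return min(1.0, clauses_generated / (100.0 * (1.0 + running_time)))

def reschedule(strategies, features_by_strategy):
    """Order the remaining strategies by predicted success, best first."""
    return sorted(strategies,
                  key=lambda s: predict_success(features_by_strategy[s]),
                  reverse=True)

# (running_time_seconds, clauses_generated) observed so far, per strategy
features = {"casc_mode": (2.0, 900), "portfolio_a": (2.0, 150), "portfolio_b": (2.0, 240)}
order = reschedule(list(features), features)
print(order)  # ['casc_mode', 'portfolio_b', 'portfolio_a']
```

The point of the design is that the schedule is recomputed at runtime from features of the live proof attempts, rather than fixed in advance.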
Certification & Formalisation III
Invited Speaker
Contributed Talks: Robotics I
09:00  Reproducibility in Robotics - The Big 5 Issues ABSTRACT. The range of difficulties that can occur when running an experimental study in robotics is broad. 
10:00  SPEAKER: Alvaro Miyazawa ABSTRACT. We propose to demonstrate the application of RoboChart and its associated tool, RoboTool, for the verification and validation of robotic applications. In our demonstration, we will consider a few small examples for illustration, and the larger example of a transporter. It is part of a swarm and cooperates with other identical robots to push an object to a target location. 
10:00  Computing properties of stable configurations of thermodynamic binding networks SPEAKER: Keenan Breik 
11:00  SPEAKER: Eric Madelaine ABSTRACT. Open pNets are used to model the behaviour of open systems, both synchronous and asynchronous, expressed in various calculi or languages. They are endowed with a symbolic operational semantics in terms of so-called “Open Automata”. This allows properties of such systems to be checked in a compositional manner. We implement an algorithm computing this semantics, building predicates expressing the synchronization conditions between the events of the pNet subsystems. Checking such predicates requires symbolic reasoning over first-order logic, but also over application-specific data. We use the Z3 SMT engine to check satisfiability of the predicates, and prune the open automaton of its unsatisfiable transitions. As an industry-oriented use case, we use so-called "architectures" for BIP systems, which have been used, in the framework of an ESA project, to specify the control software of a nanosatellite at the EPFL Space Engineering Center. We use pNets to encode a BIP architecture extended with explicit data, and compute its open automaton semantics. This automaton may be used to prove behavioural properties; we give two examples, a safety property and a liveness property. 
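The pruning step described above can be sketched in a few lines. The abstract uses the Z3 SMT engine; in this illustration a brute-force satisfiability check over a small finite integer domain stands in for the solver, and the transition names, variables, and guards are invented for the example.

```python
# Illustrative sketch: each transition of the open automaton carries a guard
# predicate over a few variables; transitions whose guard is unsatisfiable
# are pruned. A brute-force check over a small domain stands in for Z3 here.
from itertools import product

def satisfiable(guard, variables, domain=range(-5, 6)):
    """Return True if some assignment over `domain` satisfies the guard."""
    return any(guard(dict(zip(variables, values)))
               for values in product(domain, repeat=len(variables)))

# Hypothetical transitions: (name, variables, guard predicate)
transitions = [
    ("sync_ab", ["x", "y"], lambda env: env["x"] == env["y"] + 1),
    ("dead",    ["x"],      lambda env: env["x"] > 0 and env["x"] < 0),
]

pruned = [name for name, variables, guard in transitions
          if satisfiable(guard, variables)]
print(pruned)  # ['sync_ab']: the contradictory transition is removed
```

With a real solver the same loop would discharge each guard as an SMT query instead of enumerating assignments.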
11:30  SPEAKER: Yannick Moy ABSTRACT. There is a strong link between software quality and software reliability. By decreasing the probability of imperfection in the software, we can strengthen its reliability guarantees. At one extreme, software with one unknown bug is not reliable. At the other extreme, perfect software is fully reliable. Formal verification with SPARK has been used for years to get as close as possible to zero-defect software. We present the well-established processes surrounding the use of SPARK at Altran UK, as well as the deployment experiments performed at Thales to fine-tune the gradual insertion of formal verification techniques into existing processes. The experience of both long-term and new users helped us define adoption and usage guidelines for SPARK based on five levels of increasing assurance that map well to industrial needs in practice. 
12:00  A Framework for the Formal Verification of Networks of Railway Interlockings - Application to the Belgian Railways SPEAKER: Christophe Limbrée ABSTRACT. Modern railway stations are controlled by computerized systems called interlockings. In fact, medium-size and large stations usually require several interlockings, which together form a network. Much research proposes to verify the safety properties of such systems by means of model checking. Our approach goes a step further and proposes a method to extend the verification process to interconnected interlockings. This process is known as compositional verification. Each interlocking is seen as a component of a larger system (i.e., a station) and interacts with its neighbours by means of interfaces. Our first contribution comes in the form of a catalogue of formal definitions covering all the interfaces used by the Belgian Railways. Each interface can then be bound to a formal contract allowing its verification by the ocra tool. Our second contribution is an algorithm able to split the topology controlled by a single interlocking into components. This allows us to reduce the size of the model to be verified, thereby mitigating state space explosion while providing guarantees on the whole interlocking. 
11:00  SPEAKER: Tiziano Dalmonte ABSTRACT. We propose a minimal non-normal intuitionistic modal logic. We present it by a Hilbert axiomatisation and an equivalent cut-free calculus. We then propose a semantic characterisation of this logic in terms of bi-neighbourhood models, with respect to which the logic is sound. We finally present a cut-free labelled calculus matching the bi-neighbourhood semantics. 
11:30  Proof theory for quantified monotone modal logics SPEAKER: Eugenio Orlandelli ABSTRACT. This paper provides the first proof-theoretic study of quantified non-normal modal logics. It introduces labelled sequent calculi for the first-order extension, both with free and with classical quantification, of all the monotone non-normal modal logics, as well as of some interesting extensions thereof, and it studies the role of the Barcan formulas in these calculi. It will be shown that the calculi introduced have good structural properties: they have invertibility of the rules, height-preserving admissibility of weakening and contraction, and syntactic cut elimination. It will also be shown that each of the calculi introduced is sound and complete with respect to the appropriate class of neighbourhood frames with either constant or varying domains. In particular, the completeness proof constructs a formal proof for derivable sequents and a countermodel for underivable ones, and it gives a semantic proof of the admissibility of cut. Finally, some preliminary results on the extension of our approach to the non-monotonic case are discussed. 
12:00  Sequent Calculi for Logic that Includes an Infinite Number of Modalities ABSTRACT. Multimodal logic has been studied for various purposes. 
Deep Learning for Code
11:00  SPEAKER: Anders Schlichtkrull ABSTRACT. We present a simple prover for first-order logic with certified soundness and completeness in Isabelle/HOL, taking formalizations by Tom Ridge and others as the starting point, but with the aim of using the approach for teaching logic and verification to computer science students at the bachelor level. The prover is simple in the following sense: it is purely functional and can be executed with rewriting rules or via code generation to a number of functional programming languages. The prover uses no higher-order functions, that is, no function takes a function as an argument or returns a function as its result. This is advantageous when students perform rewriting steps by hand. The prover uses first-order logic in negation normal form with a term language consisting of only variables. This subset of the full syntax of first-order logic allows for a simple proof system without resorting to the much weaker propositional logic. 
11:30  Proof Search Optimizations for Non-clausal Connection Calculi ABSTRACT. The paper presents several proof search optimization techniques for non-clausal connection calculi. These techniques are implemented and integrated into the non-clausal connection prover nanoCoP. Their effectiveness is evaluated on the problems in the TPTP library. Together with a fixed strategy scheduling, these techniques are the basis of the new version of nanoCoP. 
12:00  TFX: The TPTP Extended Typed First-order Form SPEAKER: Geoff Sutcliffe ABSTRACT. The TPTP world is a well-established infrastructure that supports research, development, and deployment of Automated Theorem Proving systems for classical logics. The TPTP language is one of the keys to the success of the TPTP world. Originally the TPTP world supported only first-order clause normal form (CNF). Over the years, support for full first-order form (FOF), monomorphic typed first-order form (TF0), rank-1 polymorphic typed first-order form (TF1), monomorphic typed higher-order form (TH0), and rank-1 polymorphic typed higher-order form (TH1) has been added. TF0 and TF1 together form the TFF language family; TH0 and TH1 together form the THF language family. This paper introduces the eXtended Typed First-order form (TFX), which extends TFF to include Boolean terms, tuples, conditional expressions, and let expressions. 
PRUV regular papers
11:00  ABSTRACT. When combining beliefs from different sources, often not only new knowledge but also conflicts arise. In this paper, we investigate how we can measure the disagreement among sources. We start our investigation with disagreement measures that can be induced from inconsistency measures in an automated way. After discussing some problems with this approach, we propose a new measure that is inspired by the eta-inconsistency measure. Roughly speaking, it measures how well we can satisfy all sources simultaneously. We show that the new measure satisfies desirable properties and scales well with respect to the number of sources, and we illustrate its applicability in inconsistency-tolerant reasoning. 
11:30  Evidential Group Decision Making Model with Belief-Based Preferences SPEAKER: Aouatef Rouahi ABSTRACT. In a large number of problems, such as product configuration, automated recommendation and combinatorial auctions, decisions stem from agents' subjective preferences and requirements in the presence of uncertainty. In this paper, we introduce a framework based on belief function theory to deal with problems in group decision settings where the preferences of the agents may be uncertain, imprecise, incomplete, conflicting and possibly distorted. A case study is conducted on product recommendation to illustrate the applicability and validity of the proposed framework. 
12:00  SPEAKER: Jan Maly ABSTRACT. Soft constraints play a major role in AI, since they allow restricting the set of possible worlds (obtained from hard constraints) to a small fraction of preferred or most plausible states. Only a few formalisms fully integrate soft and hard constraints. A prominent example is Qualitative Choice Logic (QCL), where propositional logic is augmented by a dedicated connective and preferred models are discriminated via acceptance degrees determined by this connective. In this work, we follow an analogous approach in terms of syntax but propose an alternative semantics. The key idea is to assign to formulas a set of models plus a partial relation on these models. Preferred models are then obtained from this partial relation. We investigate properties of our logic which demonstrate that our semantics shows some favorable behavior compared to QCL. Moreover, we provide a partial complexity analysis of our logic. 
11:00  How to Free Yourself from the Curse of Bad Equilibria ABSTRACT. A standard problem in game theory is the existence of equilibria with undesirable properties. To pick a famous example, in the Prisoner's Dilemma, the unique dominant strategy equilibrium leads to the only Pareto-inefficient outcome in the game, which is strictly worse for all players than an alternative outcome. If we are interested in applying game-theoretic solution concepts to the analysis of distributed and multi-agent systems, then these undesirable outcomes correspond to undesirable system properties. So, what can we do about this? In this talk, I describe the work we have initiated on this problem. I discuss three approaches to dealing with this problem in multi-agent systems: the design of norms or social laws; the use of communication to manipulate the beliefs of players; and in particular, the use of taxation schemes to incentivise desirable behaviours. 
11:30  Specification and Verification for Robots that Learn 
12:00  Moral Permissibility of Actions in Smart Home Systems SPEAKER: Michael Fisher ABSTRACT. With this poster, we present ongoing work on how to operate a smart home via a Hybrid Ethical Reasoning Agent (HERA). This work is part of the broader scientific effort to implement ethics on computer systems, known as machine ethics. We showcase an everyday example involving a mother and a child living in the smart home. Our formal theory and implementation allow us to evaluate actions proposed by the smart home from different ethical points of view, i.e., utilitarianism, Kantian ethics and the principle of double effect. We discuss how formal verification, in the form of model checking, can be used to check that the modeling of a problem for reasoning by HERA conforms to our intuitions about ethical action. 
11:00  Rule-Based Gillespie Simulation of Chemical Systems SPEAKER: Jakob Lykke Andersen 
11:20  Rule-based design of computational DNA devices SPEAKER: Carlo Spaccasassi 
11:40  Noise control for molecular computing SPEAKER: Tomislav Plesa ABSTRACT. With the advancement in nucleic-acid-based technology in general, and strand-displacement DNA computing in particular, To bridge the gap, the noise-control algorithm for designing biochemical networks will be presented in 
12:00  SPEAKER: Luca Laurenti 
Security
11:00  Constructing Independently Verifiable Privacy-Compliant Type Systems for Message Passing between Black-Box Components SPEAKER: Robin Adams ABSTRACT. Privacy by design (PbD) is the principle that privacy should be considered at every stage of the software engineering process. It is increasingly both viewed as best practice and required by law. It is therefore desirable to have formal methods that provide guarantees that certain privacy-relevant properties hold. We propose an approach that can be used to design a privacy-compliant architecture without needing to know the source code or internal structure of any individual component. We model an architecture as a set of agents or components that pass messages to each other. We present in this paper algorithms that take as input an architecture and a set of privacy constraints, and output an extension of the original architecture that satisfies the privacy constraints. 
11:30  SideTrail: Verifying Time-Balancing of Cryptosystems SPEAKER: Daniel Schwartz-Narbonne ABSTRACT. Timing-based side-channel attacks are a serious security risk for modern cryptosystems. The time-balancing countermeasure used by several TLS implementations (e.g. s2n, GnuTLS) ensures that execution timing is negligibly influenced by secrets, and hence no attacker-observable timing behavior depends on secrets. These implementations can be difficult to validate, since time-balancing countermeasures depend on global properties across multiple executions. In this work we introduce the tool SideTrail, which we use to prove the correctness of time-balancing countermeasures in s2n, the open-source TLS implementation used across a range of products from AWS, including S3. SideTrail is used in s2n's continuous integration process, and has detected three side-channel issues that the s2n team confirmed and repaired before the affected code was deployed to production systems. 
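The time-balancing property verified above can be illustrated with a classic example. This is only a conceptual sketch of the property (not code from s2n, and Python integers are not actually constant-time): a naive comparison exits at the first mismatching byte, so its running time depends on the secret, while the balanced version always does the same amount of work.

```python
# Conceptual sketch of time-balancing: make the amount of work independent
# of the secret so execution time leaks nothing. Illustration only; real
# constant-time code must be written and verified at a much lower level.

def leaky_equal(secret, guess):
    """Early exit: running time depends on the first mismatching byte."""
    for a, b in zip(secret, guess):
        if a != b:
            return False
    return len(secret) == len(guess)

def balanced_equal(secret, guess):
    """Balanced: accumulate differences over the whole buffer, never exit early."""
    if len(secret) != len(guess):
        return False
    diff = 0
    for a, b in zip(secret, guess):
        diff |= a ^ b  # same work on every iteration, match or not
    return diff == 0

print(balanced_equal(b"s3cret", b"s3cret"))  # True
print(balanced_equal(b"s3cret", b"guess!"))  # False
```

A tool like SideTrail checks the global version of this property: that timing differences across executions are bounded, which a per-execution inspection of the code cannot establish.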
12:00  Towards verification of Ethereum smart contracts: a formalization of core of Solidity ABSTRACT. Solidity is the most popular programming language for writing smart contracts on the Ethereum platform. Given that smart contracts often manage large amounts of valuable digital assets, considerable interest has arisen in formal verification of Solidity code. Designing verification tools requires good understanding of language semantics. Acquiring such an understanding in case of Solidity is difficult as the language lacks even an informal specification. In this work, we evaluate the feasibility of formalization of Solidity and propose a formalization of a small subset of Solidity that contains its core data model and some unique features, such as function modifiers. 
Contributed Talks: Robotics II
11:00  Robot Demonstration SPEAKER: Nick Hawes 
11:00  Complexity Analysis for Bitvector Programs SPEAKER: Jürgen Giesl ABSTRACT. In earlier work, we developed approaches for automated termination analysis of several different programming languages, based on back-end techniques for termination proofs of term rewrite systems and integer transition systems. In recent years, we started adapting these approaches in order to analyze the complexity of programs as well. However, up to now a severe drawback was that we assumed the program variables to range over mathematical integers instead of bitvectors. This eases mathematical reasoning but is unsound in general. While we recently showed how to handle fixed-width bitvector integers in termination analysis, we now present the first technique to analyze the runtime complexity of programs with bitvector arithmetic. We implemented our contributions in the tool AProVE and evaluate its power by extensive experiments. 
11:30  A Perron-Frobenius Theorem for Jordan Blocks for Complexity Proving SPEAKER: Akihisa Yamada ABSTRACT. We consider complexity proofs for rewrite systems that involve matrix interpretations. In order to certify these proofs, we have to validate polynomial bounds on the matrix growth of A^n for some non-negative real-valued square matrix A. Whereas our earlier certification criterion used algebraic number arithmetic in order to compute all maximal Jordan blocks, in this paper we present a Perron-Frobenius-like theorem. Based on this theorem, it suffices to compute just the Jordan blocks of eigenvalue 1, which can easily be done. This not only helps to improve the efficiency of our certifier CeTA, but might also be used to actually synthesize suitable matrix interpretations. 
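The growth bound discussed above can be illustrated numerically. For the 2x2 Jordan block J with eigenvalue 1, J^n = [[1, n], [0, 1]], so the entries of J^n grow only polynomially (here linearly) in n; this is the kind of bound the certifier must validate. Pure-Python matrices keep the sketch dependency-free.

```python
# Numeric illustration: entries of powers of a Jordan block of eigenvalue 1
# grow polynomially in n (linearly for a block of size 2).

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(a, n):
    # Start from the identity matrix and multiply n times.
    result = [[1 if i == j else 0 for j in range(len(a))] for i in range(len(a))]
    for _ in range(n):
        result = mat_mul(result, a)
    return result

jordan = [[1, 1], [0, 1]]   # Jordan block of eigenvalue 1, size 2
print(mat_pow(jordan, 7))   # [[1, 7], [0, 1]]: the off-diagonal entry is n
```

By contrast, a Jordan block with eigenvalue greater than 1 would exhibit exponential entry growth, which is why only the eigenvalue-1 blocks matter for the polynomial bound.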
12:00  Inference of Linear Upper Bounds on the Expected Cost by Solving Cost Relations SPEAKER: Alicia Merayo ABSTRACT. In this extended abstract, we describe preliminary work on inferring linear upper bounds on the expected cost of control-flow graphs via cost relations, with the goal of integrating this process into the SACO tool, whose cost analyzer is based on the use of cost relations as well. 
Panel session focusing on key issues relating to the verification and validation of autonomous systems. Panelists include Calin Belta (Boston University), Jérémie Guiochet (University of Toulouse III), Florian Lier (Bielefeld University) and Alice Miller (University of Glasgow).
Mostly harmless - luring programmers into proof with SPARK
14:00  Cyclic proofs, hypersequents and Kleene algebra - Invited Talk ABSTRACT. Logics arising from automaton theory, relational algebra and formal language theory are still not well understood in proof-theoretic terms. However, there is increasing evidence to suggest that emerging trends in structural proof theory, namely richer proof calculi and non-wellfounded proofs, possess the capacity to capture these fundamental frameworks, and even deliver important metalogical results of independent interest. In this talk I will speak about a recent line of work that uses a combination of internal calculi and cyclic proofs (arguably an external feature) to recover proof-theoretic treatments for Kleene algebras (KA) and action lattices (AL). Starting with KA, I show how a natural sound and complete non-wellfounded system can be made regular by enriching the syntax of proof lines to hypersequents. This takes advantage of several basic techniques on the cyclic proof side as well as known normal forms from automaton theory, morally corresponding to the use of hypersequents. This calculus exhibits optimal proof-search complexity and, as an application, I will briefly present work in progress on how to recover an alternative proof of the completeness of KA with respect to inclusions of rational languages. I will also speak about a proof-theoretic account of AL, which is of natural interest to structural proof theorists since it extends the full Lambek calculus by a single fixed point: the Kleene star. The main result here is cut-elimination, which requires a sophisticated computational interpretation into a higher-order language. As an application we are able to recover an (optimal) complexity bound for *-continuous action lattices. The aim of this talk is to give a concrete showcase of modern trends in structural proof theory, and moreover to leave the audience with some motivational problems for the area at large. 
At a high level, I also hope to explore a moral connection, in certain settings, between the richness of a proof calculus and succinct representations arising from automaton theory, which could direct a line of future research for the area.

15:00  Proof Translations in BI logic SPEAKER: Daniel Mery ABSTRACT. In order to study proof translations in BI logic, we first consider the bunched sequent calculus LBI and then define a new labelled sequent calculus, called GBI. We then propose a procedure that translates LBI proofs into GBI proofs and prove its soundness. Moreover, we discuss some steps towards the reverse translation of GBI proofs into LBI proofs and propose an algorithm that is illustrated by some examples. Further work will be devoted to its proof of correctness and also to the design of translations of LBI proofs to TBI (labelled-tableaux) proofs. 
14:00  SPEAKER: Matthias Schlaipfer ABSTRACT. We present a simple and efficient translation of the classical multi-succedent sequent calculus LK to natural deduction. This translation aims to produce few excluded-middle inferences and to keep the quantifier complexity of the instances of excluded middle low. We evaluate the translation on proofs imported from resolution-based first-order theorem provers, and show that it achieves these goals in an efficient manner. 
14:30  Set of Support for Higher-Order Reasoning SPEAKER: Giles Reger ABSTRACT. Higher-order logic (HOL) is utilised in numerous domains, from program verification to the formalisation of mathematics. However, automated reasoning in the higher-order domain lags behind first-order automation. Many higher-order automated provers translate portions of HOL problems to first-order logic (FOL) and pass them to FOL provers. However, FOL provers are not optimised for dealing with these translations. One of the reasons for this is that the axioms introduced during the translation (e.g. those defining combinators) may combine with each other during proof search, deriving consequences of the axioms irrelevant to the goal. In this work we evaluate the extent to which this issue affects proof search and introduce heuristics based on the set-of-support strategy for minimising the effects. Our experiments are carried out within the Vampire theorem prover and show that restricting how the axioms introduced during translation are used can improve proof search with higher-order problems. 
15:00  Evaluating Pre-Processing Techniques for the Separated Normal Form for Temporal Logics SPEAKER: Cláudia Nalon ABSTRACT. We consider the transformation of propositional linear-time temporal logic formulae into a clause normal form, called Separated Normal Form, suitable for resolution calculi. In particular, we investigate the effect of applying various pre-processing techniques on characteristics of the normal form and determine the best combination of techniques on a large collection of benchmark formulae. 
14:00  An abstraction-refinement framework for automated reasoning - a pragmatic approach (invited talk) ABSTRACT. Abstraction-refinement is widely used in verification but, with some exceptions, is largely overlooked in state-of-the-art automated theorem proving. One of our main motivations comes from the problem of reasoning with large theories, where usually only a small part of the theory is needed for proving the conjecture. Efficient reasoning with large theories is one of the main challenges in automated theorem proving, arising in many applications including reasoning with ontologies, large mathematical theories and verification. Current methods for reasoning with large theories are mainly based on axiom selection, which we believe is limited in scope. We propose a general approach to automated reasoning based on abstraction-refinement which aims at over- and under-approximation of theories in order to speed up reasoning. In particular, in this approach axiom selection can be seen as a special case of under-approximation. In this talk we present a general abstraction-refinement framework for automated reasoning, draw connections with current approaches and discuss some practical abstractions. This is joint work with Julio Cesar Lopez Hernandez. 
15:10  Reasoning about exceptions in ontologies: from the lexicographic closure to the skeptical closure SPEAKER: Laura Giordano ABSTRACT. Reasoning about exceptions in ontologies is nowadays one of the challenges the description logics community is facing. The paper describes a preferential approach for dealing with exceptions in Description Logics, based on the rational closure. The rational closure has the merit of providing a simple and efficient approach for reasoning with exceptions, but it does not allow independent handling of the inheritance of different defeasible properties of concepts. In this work we outline a possible solution to this problem by introducing a variant of the lexicographic closure, which we call the "skeptical closure", which requires constructing a single base. This work is based on the extended abstract presented at CILC/ICTCS 2017. 
14:00  TBA 
14:30  The computational neuroscience of human-robot relationships 
15:00  Trust, Failure, and Self-Assessment During Human-Robot Interaction 
New Applications
14:00  Relational Equivalence Proofs Between Imperative and MapReduce Algorithms SPEAKER: Bernhard Beckert ABSTRACT. Distributed programming frameworks like MapReduce, Spark and Thrill are widely used for the implementation of algorithms operating on large datasets. However, implementing in these frameworks is more demanding than coming up with sequential implementations. One way to achieve correctness of an optimized implementation is to derive it from an existing imperative sequential algorithm description through a sequence of behavior-preserving transformations. We present a novel approach for proving equivalence between imperative and MapReduce algorithms based on partitioning the equivalence proof into a sequence of equivalence proofs between intermediate programs with smaller differences. Our approach is based on the insight that proofs are best conducted using a combination of two kinds of steps: (1) uniform context-independent rewriting transformations; and (2) context-dependent flexible transformations that can be proved using relational reasoning with coupling invariants. We demonstrate the feasibility of our approach by evaluating it on two prototypical algorithms commonly used as examples in MapReduce frameworks: k-means and PageRank. To carry out the proofs, we use a higher-order theorem prover with partial proof automation. The results show that our approach and its prototypical implementation enable equivalence proofs of non-trivial algorithms and could be automated to a large degree. 
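The kind of equivalence targeted above can be illustrated with a toy k-means step. This is purely a sketch, not the paper's setup: one cluster-assignment step written as an imperative accumulating loop, and the same step in map-then-reduce style; the equivalence claim is that both produce the same per-cluster sums.

```python
# Toy illustration: imperative vs. MapReduce-style k-means assignment step.
from collections import defaultdict
from itertools import groupby

points = [(0.0, 0.0), (1.0, 1.0), (9.0, 8.0), (10.0, 9.0)]
centres = [(0.0, 0.0), (10.0, 10.0)]

def nearest(p):
    """Index of the centre closest to point p (squared Euclidean distance)."""
    return min(range(len(centres)),
               key=lambda c: (p[0] - centres[c][0])**2 + (p[1] - centres[c][1])**2)

# Imperative version: a single loop mutating an accumulator.
sums_imperative = defaultdict(lambda: [0.0, 0.0])
for p in points:
    c = nearest(p)
    sums_imperative[c][0] += p[0]
    sums_imperative[c][1] += p[1]

# MapReduce-style version: map each point to (cluster, point), group by key,
# then reduce each group with a sum.
mapped = sorted((nearest(p), p) for p in points)
sums_mapreduce = {}
for c, group in groupby(mapped, key=lambda kp: kp[0]):
    pts = [p for _, p in group]
    sums_mapreduce[c] = [sum(p[0] for p in pts), sum(p[1] for p in pts)]

print(dict(sums_imperative) == sums_mapreduce)  # True: the two versions agree
```

A mechanized proof would relate the loop accumulator to the grouped partial sums via a coupling invariant, which is exactly the "context-dependent" step the abstract describes.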
14:30  SPEAKER: David Cok ABSTRACT. We describe new capabilities added to the Java Modeling Language and the OpenJML deductive program verification tool to support functional programming features introduced in Java 8. We also report on the application of the extensions to a secure streaming protocol library developed by Amazon Web Services and used as a foundation by services it provides. We found that the application under study used a small set of functional programming idioms; methods using these idioms could be verified by techniques that used only first-order logic and did not need all the features that might be required for full generality of functional programming. 
15:00  Verification of Binarized Neural Networks via Inter-Neuron Factoring SPEAKER: Chung-Hao Huang ABSTRACT. Binarized Neural Networks (BNNs) have recently been proposed as an energy-efficient alternative to more traditional learning networks. Here we study the problem of formally verifying BNNs by reducing it to a corresponding hardware verification problem. The main step in this reduction is based on factoring computations among neurons within a hidden layer of the BNN in order to make the BNN verification problem more scalable in practice. The main contributions of this paper include results on the NP-hardness and hardness of PTAS approximability of this essential optimization and factoring step, and we design polynomial-time search heuristics for generating approximate factoring solutions. With these techniques we are able to scale the verification problem to moderately sized BNNs for embedded devices with thousands of neurons and inputs. 
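The factoring idea above can be sketched in miniature. This is not the paper's algorithm, and the weights and inputs below are invented; it only shows the underlying observation that two binarized neurons sharing part of their weight vector can share the partial sum over the common inputs, so the work is not duplicated in the reduced circuit.

```python
# Minimal illustration of inter-neuron factoring: weights and inputs are +-1,
# and the partial sum over a shared weight prefix is computed only once.

def bnn_dot(weights, inputs):
    """Pre-activation of a binarized neuron: sum of +-1 products."""
    return sum(w * x for w, x in zip(weights, inputs))

x  = [1, -1, 1, 1, -1, 1]        # one input vector
w1 = [1, 1, -1, 1, 1, 1]         # neuron 1 weights
w2 = [1, 1, -1, -1, -1, -1]      # neuron 2 shares the first three weights

shared = bnn_dot(w1[:3], x[:3])  # common partial sum, computed once
n1 = shared + bnn_dot(w1[3:], x[3:])
n2 = shared + bnn_dot(w2[3:], x[3:])

# Factored results match the unfactored dot products.
print(n1 == bnn_dot(w1, x), n2 == bnn_dot(w2, x))  # True True
```

Finding a factoring that maximizes such sharing across a whole layer is the optimization step whose hardness the paper analyzes.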
14:00  Control-Flow Refinement via Partial Evaluation SPEAKER: Jesús Doménech ABSTRACT. In this extended abstract we explore the use of partial evaluation as a control-flow refinement technique in the context of termination and cost analysis. Our preliminary experiments show that partial evaluation can improve both analyses. 
14:30  Verification of Rewriting-based Query Optimizers SPEAKER: Balaji Krishnamurthy ABSTRACT. We report on our ongoing work on automated verification of rewriting-based query optimizers. Rewriting-based query optimizers are widely adopted in relational database architectures; however, designing these rewrite systems remains a challenge. In this paper, we discuss automated termination analysis of optimizers whose rewrite rules are expressed in HoTTSQL. We discuss how it is not sufficient to reason about rule-specific (local) properties such as semantic equivalence, and how it is necessary to work with set-of-rules-specific (global) properties such as termination and loop-freeness to prove correctness of the optimizer. We put forward a way to translate the rules in HoTTSQL to term rewriting systems, opening avenues for the use of termination tools in the analysis of rewriting-based transformations of HoTTSQL database queries. 
15:00  TTT2 with Termination Templates for Teaching SPEAKER: Jonas Schöpf ABSTRACT. On the one hand, checking specific termination proofs by hand, say using a particular collection of matrix interpretations, can be an arduous and error-prone task. On the other hand, automation of such checks would save time and help to establish correctness of exam solutions, examples in lecture notes, etc. To this end, we introduce a template mechanism for the termination tool TTT2 that allows us to restrict parameters of certain termination methods. In the extreme, when all parameters are fixed, such restrictions result in checks for specific proofs. 
15:00  Bayesian Verification of Chemical Reaction Networks SPEAKER: Gareth Molyneux 
15:10  Computational Approaches for the Programmed Assembly of Nanocellulose Meshes SPEAKER: Corina Itcus 
16:00  Rule-Based Synthesis of Chains of Security Functions for Software-Defined Networks SPEAKER: Nicolas Schnepf ABSTRACT. Software-defined networks (SDN) offer a high degree of programmability for handling and forwarding packets. In particular, they allow network administrators to combine different security functions, such as firewalls, intrusion detection systems, and external services, into security chains designed to prevent or mitigate attacks against end-user applications. Because of their complexity, the configuration of these chains can benefit from formal techniques for their automated construction and verification. We propose in this paper a rule-based system for automating the composition and configuration of chains of security functions for Android applications. Given a characterization of the network traffic generated by an application and the set of permissions it requires, our rules construct an abstract representation of a custom security chain. This representation is then translated into a concrete implementation of the chain in Pyretic, a domain-specific language for programming SDN controllers. We prove that the chains produced by our rules satisfy a number of correctness properties, such as the absence of black holes or loops and shadowing freedom, and that they are coherent with the underlying security policy. 
16:30  SPEAKER: Narjes Jomaa ABSTRACT. The development of provably secure OS kernels represents a fundamental step in the creation of safe and secure systems. To this aim, we propose the notion of proto-kernel and an implementation, the Pip proto-kernel, as a separation kernel whose trusted computing base is reduced to its bare bones, essentially providing separation of tasks in memory, on top of which non-influence can be proved. This proof-oriented design allows us to formally prove separation properties on a concrete executable model very close to its automatically extracted C implementation. Our design is shown to be realistic as it can execute isolated instances of a real-time embedded system that has moreover been modified to isolate its own processes through the Pip services. 
16:00  Intuitionistic multi-agent subatomic natural deduction for belief and knowledge ABSTRACT. We outline a proof-theoretic approach to the semantics of intensional operators for intuitionistic belief and knowledge, which intends neither to capture a given target modal logic, nor to import ideas from relational model-theoretic semantics (e.g., truth at indices, accessibility between indices) into the language via labelled and relational formulae. On this approach, the meaning of these intensional operators is explained exclusively by appeal to the structure of proofs which are unaided by such means. The proof-theoretic framework chosen for this purpose is subatomic natural deduction, proposed by the author in earlier work. The resulting systems normalize and enjoy the subexpression property (a refinement of the subformula property). They can be used for the logical analysis of reasoning patterns which involve complex multi-agent belief constructions (e.g., reciprocating or universal beliefs).

16:30  SPEAKER: Marianna Girlando ABSTRACT. The logic of Conditional Beliefs (CDL) has been introduced by Board, Baltag and Smets to reason about knowledge and revisable beliefs in a multi-agent setting. Our aim is to develop standard internal calculi for this logic. As a preliminary result we propose an internal hypersequent calculus for it in the single-agent case. 
17:00  SPEAKER: Camillo Fiorentini ABSTRACT. The inverse method is a saturation-based theorem proving technique; it relies on a forward proof-search strategy and can be applied to cut-free calculi enjoying the subformula property. Here we apply this method to derive the unprovability of a goal formula G in Intuitionistic Propositional Logic. To this aim we design a forward calculus FRJ(G) for Intuitionistic unprovability which is suited to constructively ascertaining the unprovability of a formula G by providing a concise countermodel for it; in particular we prove that the generated countermodels have minimal height. Moreover, we clarify the role of the saturated database obtained as the result of a failed proof search in FRJ(G) by showing how to extract from such a database a derivation witnessing the Intuitionistic validity of the goal. 
16:00  SPEAKER: Mukund Raghothaman ABSTRACT. Building effective program analysis tools is a challenging endeavor: analysis designers must balance multiple competing objectives, including scalability, fraction of false alarms, and the possibility of missed bugs. Not all of these design decisions are optimal when the analysis is applied to a new program with different coding idioms, environment assumptions, and quality requirements. Furthermore, the alarms produced are typically accompanied by limited information such as their location and abstract counterexamples. We present a framework, Difflog, that fundamentally extends the deductive reasoning rules that underlie program analyses with numerical weights. Each alarm is now naturally accompanied by a score, indicating quantities such as the confidence that the alarm is a real bug, the anticipated severity, or expected relevance of the alarm to the programmer. To the analysis user, these techniques offer a lens by which to focus their attention on the most important alarms and a uniform method for the tool to interactively generalize from human feedback. To the analysis designer, these techniques offer novel ways to automatically synthesize analysis rules in a data-driven style. Difflog shows large reductions in false alarm rates and missed bugs in large, complex programs, and it advances the state-of-the-art in synthesizing non-trivial analyses. 
16:30  SPEAKER: Milan Cvitkovic ABSTRACT. A major challenge when using techniques from Natural Language Processing for supervised learning on computer program source code is that many words in code are neologisms. Reasoning over such an unbounded vocabulary is not something NLP methods are typically suited for. We introduce a deep model that contends with an unbounded vocabulary (at training or test time) by embedding new words as nodes in a graph as they are encountered and processing the graph with a Graph Neural Network. 
17:00  SPEAKER: Vadim Markovtsev ABSTRACT. Programmers make rich use of natural language in the source code they write through identifiers and comments. Source code identifiers are selected from a pool of tokens which are strongly related to the meaning, naming conventions, and context. These tokens are often combined to produce more precise and obvious designations. Such multipart identifiers account for 97% of all naming tokens in the Public Git Archive, the largest dataset of Git repositories to date. We introduce a bidirectional LSTM recurrent neural network to detect subtokens in source code identifiers. We trained that network on 41.7 million distinct splittable identifiers collected from 182,014 open source projects in the Public Git Archive, and show that it outperforms several other machine learning models. The proposed network can be used to improve upstream models which are based on source code identifiers, as well as to improve the developer experience by allowing code to be written without switching the keyboard case. 
17:30  Open Business Meeting for Future of Machine Learning for Programming 
16:00  SPEAKER: Alireza Ensan ABSTRACT. Automated decision making often requires solving difficult and primarily NP-hard problems. In many AI applications (e.g., planning, robotics, recommender systems), users can assist decision making by specifying their preferences over some domain of interest. To take preferences into account, we take a model-theoretic approach to both computations and preferences. Computational problems are characterized as model expansion, that is, the logical task of expanding an input structure to satisfy a specification. The uniformity of the model-theoretic approach allows us to link preferences and computations by introducing a framework of prioritized model expansion. The main technical contribution is an analysis of the impact of preferences on the computational complexity of model expansion. We also study an application of our framework for the case where some information about preferences is missing. Finally, we discuss how prioritized model expansion can be related to other preference-based declarative programming paradigms. 
16:30  A New Probabilistic Algorithm for Approximate Model Counting SPEAKER: Cunjing Ge ABSTRACT. Constrained counting is important in domains ranging from artificial intelligence to software analysis. There are already a few approaches for counting models over various types of constraints. Recently, hashing-based approaches have achieved success but still rely on solution enumeration. In the IJCAR version, a probabilistic approximate model counter is proposed, which is also a hashing-based universal framework, but with only satisfiability queries. A dynamic stopping criterion for the algorithm is presented, which has not been studied yet in previous work on hashing-based approaches. Although the algorithm lacks a theoretical guarantee, it works well in practice. In this paper, we further extend our approach to SMT(LIA) formulas with a new encoding technique. 
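To illustrate the general idea behind hashing-based model counting (a simplified sketch, not the algorithm of this paper): random XOR constraints partition the solution space into roughly equal cells, so the number of solutions surviving m constraints, scaled by 2^m, estimates the total count. Here a brute-force enumeration of a toy formula stands in for the satisfiability oracle:

```python
import random

def random_xor_constraints(n_vars, m, rng):
    # Each constraint: (list of variable indices, parity bit), meaning
    # the XOR of the chosen variables must equal the parity bit.
    return [([i for i in range(n_vars) if rng.random() < 0.5],
             rng.randrange(2)) for _ in range(m)]

def satisfies(assignment, constraints):
    return all(sum(assignment[i] for i in idxs) % 2 == parity
               for idxs, parity in constraints)

def estimate_count(solutions, n_vars, m, rng):
    # Count solutions surviving m random XOR constraints, then scale up:
    # each constraint halves the solution space in expectation.
    cs = random_xor_constraints(n_vars, m, rng)
    survivors = sum(1 for s in solutions if satisfies(s, cs))
    return survivors * (2 ** m)

# Toy "formula" over 12 variables: first 4 bits fixed to 0,
# so there are exactly 2^8 = 256 models.
def toy_models(n_vars=12):
    return [[0] * 4 + [(v >> i) & 1 for i in range(n_vars - 4)]
            for v in range(2 ** (n_vars - 4))]
```

Taking the median of several estimates with fresh random constraints tightens the result; real counters replace the enumeration by hashing-augmented SAT/SMT satisfiability queries.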
17:00  SPEAKER: Maria Vanina Martinez ABSTRACT. The entity resolution problem in traditional databases, also known as deduplication, seeks to map multiple virtual objects to their corresponding set of real-world entities. Though the problem is challenging, it can be tackled in a variety of ways by leveraging several simplifying assumptions, such as the fact that the multiple virtual objects appear as the result of name or attribute ambiguity, clerical errors in data entry or formatting, missing or changing values, or abbreviations. However, in cyber security domains the entity resolution problem takes on a whole different form, since malicious actors that operate in certain environments like hacker forums and markets are highly motivated to remain semi-anonymous; this is because, though they wish to keep their true identities secret from law enforcement, they also have a reputation to keep with their customers. The above simplifying assumptions cannot be made, and we therefore coin the term "adversarial deduplication" to refer to this setting. In this paper, we propose the use of probabilistic existential rules (also known as Datalog+/-) to model knowledge engineering solutions to this problem; we show that tuple-generating dependencies can be used to generate probabilistic deduplication hypotheses, and equality-generating dependencies can later be applied to leverage existing data towards grounding such hypotheses. The main advantage with respect to existing deduplication tools is that our model operates under the open-world assumption, and thus is capable of modeling hypotheses over unknown objects, which can later become known if new data becomes available. 
17:30  SPEAKER: Cunjing Ge ABSTRACT. We present VolCE, a tool for computing and estimating the size of the solution space of SMT(LA) constraints. VolCE supports the SMT-LIB format. It integrates SMT solving with volume computation/estimation and integer solution counting for convex polytopes. Several effective techniques are adopted, which enable the tool to deal with high-dimensional problem instances efficiently. 
16:00  Safety and Trust in Autonomous Driving 
16:00  Equivalence of chemical reaction networks in a CRN-to-DNA compiler framework SPEAKER: Stefan Badelt 
16:20  Using Transitivity and Modularity in Chemical Reaction Network Bisimulation 
16:40  Conservative approximations of polymers SPEAKER: Jerome Feret ABSTRACT. We propose a systematic approach to approximating the behaviour of models of polymer synthesis/degradation. Our technique consists in discovering time-dependent lower and upper bounds for the concentration of some patterns of interest. These bounds are obtained by approximating the state of the system by a hyperbox, with differential equations defining the evolution of the coordinates of each hyperface. The equation of each hyperface is obtained by pessimistically bounding the derivative with respect to the corresponding coordinate when the system state ranges over this hyperface. In order to synthesise these bounds, we use Kappa to describe our models of polymers. This provides symbolic equalities and inequalities which intensionally may be understood as algebraic constructions over patterns, and extensionally as sound properties about the concentrations of the biomolecular species that contain these patterns. 
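The hyperbox construction described above can be sketched as follows (our notation, not the paper's): for a system $\dot{x} = f(x)$ over pattern concentrations $x \in \mathbb{R}^n$, one maintains bounds $l_i(t) \le x_i(t) \le u_i(t)$ whose evolution pessimistically bounds $f_i$ over the corresponding hyperface of the box $[l(t), u(t)]$:

```latex
\dot{u}_i(t) = \max \{\, f_i(x) \mid l(t) \le x \le u(t),\ x_i = u_i(t) \,\}, \\
\dot{l}_i(t) = \min \{\, f_i(x) \mid l(t) \le x \le u(t),\ x_i = l_i(t) \,\}.
```

Since a trajectory can only leave the box through one of its hyperfaces, and the derivative on each hyperface is bracketed by these bounds, the box remains a sound over-approximation of the reachable states.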
17:00  Executable Biochemical Space for Specification and Analysis of Biochemical Systems SPEAKER: David Šafránek 
17:20  Formal Verification of NetworkBased Biocomputation Circuits SPEAKER: Hillel Kugler 
Off the beaten track
16:00  SPEAKER: Alexander Nutz ABSTRACT. We present a method that allows us to infer expressive invariants for programs that manipulate arrays and, more generally, data that are modeled using maps (including the program memory, which is modeled as a map over integer locations). The invariants can express, for example, that memory cells have changed their contents only at locations that have not been previously allocated by another procedure. The motivation for the new method stems from the fact that, although state-of-the-art SMT solvers are starting to be able to check the validity of more and more complex invariants, there is not much work yet on their automatic inference. We present our method as a static analysis over an abstract domain that we introduce, the map equality domain. 
16:30  SPEAKER: Sarah Winkler ABSTRACT. Logically constrained rewrite systems constitute a very general rewriting formalism that can capture simplification processes in various domains as well as computation in imperative programs. In both of these contexts, nontermination is a critical source of errors. We present new criteria, implemented in the tool Ctrl, to find loops in logically constrained rewrite systems. We illustrate the usefulness of these criteria in three example applications: to find loops in LLVM peephole optimizations, to detect looping program executions of C programs, and to establish nontermination of integer transition systems. 
17:00  Store Buffer Reduction in the Presence of Mixed-Size Accesses and Misalignment ABSTRACT. Naive programmers believe that a multithreaded execution of their program is some simple interleaving of steps of individual threads. To increase performance, modern Intel and AMD processors make use of store buffers, which cause unexpected behaviors that cannot be explained by the simple interleaving model. Programs that in the simple interleaving model obey one of various programming disciplines do not suffer from these unexpected behaviors in the presence of store buffers. These disciplines require that the program does not make use of several concrete features of modern processors, such as mixed-size/misaligned memory accesses and inter-processor interrupts. A common assumption is that this requirement is posed only to make the formal description and soundness proof of these disciplines tractable, but that the disciplines can be extended to programs that make use of these features with a lot of elbow grease and straightforward refinements of the programming discipline. In this paper we discuss several such features where that assumption is correct and two such features where it is not, namely mixed-size/misaligned accesses and inter-processor interrupts. We base our discussion on two programming disciplines from the literature. We present non-trivial extensions of the more efficient of the two programming disciplines that work for programs that use these features. Our work is based directly on the 500-page PhD thesis of the author, which includes a formal treatment of the extensions and a detailed soundness proof. 
16:00  Termination of Lambda-Pi modulo rewriting using the size-change principle SPEAKER: Guillaume Genestier ABSTRACT. The Size-Change Principle was first introduced to study the termination of first-order functional programs. In this work, we show that it can also be used to study the termination of higher-order rewriting in a system of dependent types extending LF. 
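As background on the size-change principle itself (a minimal first-order sketch, unrelated to the dependent-type setting of the talk): each recursive call is abstracted by a size-change graph recording which argument positions strictly decrease ('>') or do not increase ('>='); the program terminates if, in the composition closure of these graphs, every idempotent graph has a strictly decreasing self-edge. All names below are our own:

```python
def compose(g1, g2):
    # Sequential composition of size-change graphs: an edge a -> c
    # exists if some b connects a -> b in g1 and b -> c in g2;
    # it is strict ('>') if either step is strict.
    best = {}
    for a, l1, b in g1:
        for b2, l2, c in g2:
            if b != b2:
                continue
            lab = '>' if '>' in (l1, l2) else '>='
            if best.get((a, c)) != '>':
                best[(a, c)] = lab
    return frozenset((a, lab, c) for (a, c), lab in best.items())

def closure(graphs):
    # Saturate the set of graphs under composition; termination is
    # guaranteed since only finitely many graphs exist.
    s = set(graphs)
    while True:
        new = {compose(g, h) for g in s for h in s} - s
        if not new:
            return s
        s |= new

def sct_terminates(graphs):
    # SCT criterion: every idempotent graph in the closure must
    # carry a strict self-edge a > a on some argument position.
    return all(any(a == c and lab == '>' for a, lab, c in g)
               for g in closure(graphs) if compose(g, g) == g)
```

For example, the two recursive calls of the Ackermann function yield the graphs {m > m} and {m >= m, n > n}, which the criterion accepts, whereas a function that merely swaps its arguments, with graph {x >= y, y >= x}, is rejected.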
16:30  Improving Static Dependency Pairs for Higher-Order Rewriting SPEAKER: Carsten Fuhs ABSTRACT. We revisit the static dependency pair method for termination of higher-order term rewriting. In this extended abstract, we generalise the applicability criteria for the method. Moreover, we propose a static dependency pair framework based on an extended notion of computable dependency chains that harnesses the computability-based reasoning used in the soundness proof of static dependency pairs. This allows us to propose a new termination proving technique to use in combination with static dependency pairs: the static subterm criterion. 