CADE-28: THE 28TH INTERNATIONAL CONFERENCE ON AUTOMATED DEDUCTION
LFMTP 2021 ON FRIDAY, JULY 16TH


09:00-10:00 Session 36A: LFMTP Workshop: Meta-Theory Reasoning
Location: ZoomRoom 3
09:00
SMLtoCoq: Automated Generation of Coq Specifications and Proof Obligations from SML Programs with Contracts
PRESENTER: Laila El-Beheiry

ABSTRACT. Formally reasoning about functional programs is supposed to be straightforward and elegant; in practice, however, it is not typically done as a matter of course. Reasoning in a proof assistant requires "reimplementing" the code in that tool, which is far from trivial. SMLtoCoq provides an automatic translation of SML programs and function contracts into Coq. Programs are translated into Coq specifications, and function contracts into theorems, which can then be formally proved. Using the Equations plugin and other well-established Coq libraries, we are able to translate side-effect-free SML programs containing partial functions, structures, functors, and records, among other features. Additionally, we provide a Coq version of many parts of SML’s basis library, so that calls to these libraries are kept almost as is.
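
As a hypothetical illustration of the kind of input the tool targets (written here in OCaml as a close stand-in for SML, with an invented comment-style contract rather than SMLtoCoq's actual contract syntax):

    (* Illustration only: SMLtoCoq consumes SML, not OCaml; the contract
       annotation below is a made-up comment-style rendering. *)

    (* abs : int -> int
       ENSURES: the result is non-negative. *)
    let abs (x : int) : int =
      if x < 0 then -x else x

    (* A translation in the spirit of SMLtoCoq would turn the body into a Coq
       definition and the contract into a proof obligation roughly of the form
       forall x : Z, 0 <= abs x, to be proved interactively. *)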

09:30
Countability of Inductive Types Formalized in the Object-Logic Level
PRESENTER: Xiwei Wu

ABSTRACT. The set of finite lists of integers and the set of binary trees with integer labels are both countably infinite. Many inductively defined types also have countably many elements. In this paper, we formalize the syntax of first-order inductive definitions in Coq and prove them countable, under some side conditions. Instead of writing a proof generator in a meta-language, we develop an axiom-free proof in the Coq object logic. Our proof is a dependently typed Coq function from the syntax of an inductive definition to the countability of the type. Based on this proof, we provide a Coq tactic to automatically prove the countability of concrete inductive types. We also develop Coq libraries for countability and for the syntax of inductive definitions, which are valuable in their own right.
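
The flavour of witness involved can be illustrated, far more modestly and outside Coq, by an explicit injection of an inductive type into the natural numbers; the OCaml sketch below is our own toy example (using a Cantor pairing function), not the paper's formalization.

    (* Toy illustration: countability of an inductive type witnessed by an
       injection into the naturals (ignoring machine-int overflow). *)

    type tree = Leaf | Node of tree * int * tree

    (* Cantor pairing: an injective (in fact bijective) map N * N -> N. *)
    let pair (m : int) (n : int) : int = (m + n) * (m + n + 1) / 2 + n

    (* Fold the integers injectively into the naturals. *)
    let nat_of_int (z : int) : int = if z >= 0 then 2 * z else (-2) * z - 1

    (* Injective encoding of trees: Leaf is 0, a Node is 1 + a paired code. *)
    let rec encode (t : tree) : int =
      match t with
      | Leaf -> 0
      | Node (l, x, r) -> 1 + pair (encode l) (pair (nat_of_int x) (encode r))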

10:00-10:30 Break
10:30-12:00 Session 37D: LFMTP Workshop: Foundations
Location: ZoomRoom 3
10:30
Automating Induction by Reflection

ABSTRACT. Despite recent advances in automating theorem proving in full first-order theories, inductive reasoning still poses a serious challenge to state-of-the-art theorem provers. The reason is that, in first-order logic, induction requires an infinite number of axioms, which cannot be given as input to a computer-aided theorem prover that requires a finite input. Mathematical practice is to specify these infinite sets of axioms as axiom schemes. Unfortunately, such schematic definitions cannot be formalized in first-order logic and are therefore not supported as inputs for first-order theorem provers. In this work we introduce a new method, inspired by the field of axiomatic theories of truth, that makes it possible to express schematic inductive definitions in the standard syntax of multi-sorted first-order logic. Further, we test the practical feasibility of the method with state-of-the-art theorem provers, comparing it to the solvers’ native techniques for handling induction.
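
For orientation, the usual first-order induction scheme contributes one axiom per formula, whereas a reflection-style alternative replaces the scheme by a single axiom that quantifies over (codes of) formulas through a truth-style predicate T; the rendering below is ours and not necessarily the paper's exact formulation.

    \[
      \big(\varphi(0) \land \forall x\,(\varphi(x) \to \varphi(s(x)))\big) \to \forall x\,\varphi(x)
      \qquad \text{(one axiom for each formula } \varphi\text{)}
    \]

    \[
      \forall p\, \Big( \big(T(p,0) \land \forall x\,(T(p,x) \to T(p,s(x)))\big) \to \forall x\, T(p,x) \Big)
    \]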

11:00
Interacting safely with an unsafe environment

ABSTRACT. We give a presentation of Pure type systems in which contexts need not be well-formed, and show that this presentation is equivalent to the usual one. The main motivation for this presentation is that, when we extend Pure type systems with computation rules, as in the logical framework Dedukti, we want to be able to declare constants before the computation rules that are needed to check the well-typedness of their types.
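
As a rough illustration (ours, not the paper's exact rules), the contrast is between a variable rule that only extends well-formed contexts and one that merely looks up a declaration in an arbitrary context, deferring the check that the declared type is well-typed to where it is actually needed:

    \[
      \frac{\Gamma \vdash A : s}{\Gamma,\, x{:}A \vdash x : A}\ \ (x \text{ fresh})
      \qquad \text{vs.} \qquad
      \frac{(x{:}A) \in \Gamma}{\Gamma \vdash x : A}
    \]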

11:30
Foundations of the Squirrel meta-logic for reasoning over security protocols
PRESENTER: David Baelde

ABSTRACT. The formal verification of security protocols can be carried out in two categories of models. Symbolic models, pioneered by Dolev and Yao, represent messages by first-order terms and attacker capabilities by inference rules or equational theories. This approach allows automatic verification, notably using techniques from rewriting and logic. The computational model, in contrast, represents messages as bitstrings and attackers as probabilistic polynomial-time (PTIME) Turing machines. It is the standard model for provable security in cryptography, but formal verification in that model remains challenging. Bana and Comon have recently proposed a compelling approach to deriving guarantees in the computational model, which they call the computationally complete symbolic attacker (CCSA). It is based on first-order logic, with complex axioms derived from cryptographic assumptions. Verification in the original CCSA approach involves a translation of protocol traces into first-order terms that necessitates bounding the length of traces. The generated goals are difficult for humans to process and, so far, cannot be handled automatically either. We have proposed a refinement of the approach based on a meta-logic which conveniently replaces the translation step. We have implemented this refined approach in an interactive theorem prover, Squirrel, and validated it on a first set of case studies. In this paper, we present some improvements of the foundations of the Squirrel meta-logic and of its proof system, which are required as we consider more complex case studies. We extend our model to support memory cells in order to allow the analysis of security protocols that manipulate states. Moreover, we adapt the notion of trace model, which provides the semantics of our meta-logic formulas, to enable more natural and expressive modelling. We also present a generalized proof system which allows more protocol properties to be derived.
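
Roughly, and stated in the base CCSA logic rather than in Squirrel's meta-logic syntax, security goals become first-order formulas over terms that model the attacker's observations, related by an indistinguishability predicate whose axioms encode the cryptographic assumptions:

    \[
      u_1, \dots, u_n \;\sim\; v_1, \dots, v_n
    \]

read as: no probabilistic polynomial-time attacker can distinguish the two sequences of messages with non-negligible probability.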

12:00-12:30 Break
13:30-14:00 Session 40: Foundations: MetaCoq
Location: ZoomRoom 3
13:30
Translating Formalizations of Type Theories from Intrinsic to Extrinsic Style
PRESENTER: Navid Roux

ABSTRACT. Type theories can be formalized in the intrinsically typed (hard) or the extrinsically typed (soft) style. In large libraries of type-theoretical features, both styles are often present, which can lead to code duplication and integration issues.

We define an operator that systematically translates a hard-typed formulation into the corresponding soft-typed one. Even though this translation is known in principle, a number of subtleties make it more difficult than naively expected. Importantly, our translation preserves modularity, i.e., it maps structured sets of hard-typed features to correspondingly structured soft-typed ones.

We implement our operator in the MMT system and apply it to a library of type-theoretical features.
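
The hard/soft distinction itself can be illustrated in miniature (this OCaml sketch is ours, not the MMT operator, and uses a GADT as a stand-in for intrinsic typing):

    (* Intrinsic ("hard") style: the datatype admits only well-typed terms. *)
    type _ itm =
      | IZero : int itm
      | ITrue : bool itm
      | IIf   : bool itm * 'a itm * 'a itm -> 'a itm

    (* Extrinsic ("soft") style: raw terms plus a separate typing judgement. *)
    type etm = EZero | ETrue | EIf of etm * etm * etm
    type ety = TInt | TBool

    (* of_ty t ty holds exactly when the raw term t has type ty. *)
    let rec of_ty (t : etm) (ty : ety) : bool =
      match t, ty with
      | EZero, TInt -> true
      | ETrue, TBool -> true
      | EIf (c, t1, t2), ty -> of_ty c TBool && of_ty t1 ty && of_ty t2 ty
      | _, _ -> false

A translation in the spirit of the paper maps features expressed in the first style to their counterparts in the second, where typing is an explicit relation.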

14:00-14:30 Break
14:30-16:00 Session 41C: LFMTP Workshop: Formalization
Location: ZoomRoom 3
14:30
Adelfa: A System for Reasoning about LF Specifications
PRESENTER: Mary Southern

ABSTRACT. We present a system called Adelfa that provides mechanized support for reasoning about specifications developed in the Edinburgh Logical Framework or LF. Underlying Adelfa is a new logic in which typing judgements in LF serve as atomic formulas and quantification is permitted over contexts and terms that appear in such formulas. Contexts, which constitute type assignments to uniquely named variables that are modelled using the technical device of nominal constants, are characterized in the logic by context schemas that describe their inductive structure. We present these formulas and an associated semantics before sketching a proof system for constructing arguments that are sound with respect to the semantics. We then outline the realization of this proof system in Adelfa and illustrate its use through a few example proof developments. We conclude the paper by relating Adelfa to existing systems for reasoning about LF specifications.
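
To give a flavour of such formulas (in a schematic notation of ours, not Adelfa's concrete syntax), a subject-reduction property for an LF encoding of a simply typed language might be stated with LF typing judgements as the atoms:

    \[
      \forall E, V, T, D_1, D_2.\;
      \{\, \vdash D_1 : \mathit{eval}\ E\ V \,\} \supset
      \{\, \vdash D_2 : \mathit{of}\ E\ T \,\} \supset
      \exists D_3.\; \{\, \vdash D_3 : \mathit{of}\ V\ T \,\}
    \]

with quantification over contexts, constrained by a context schema, appearing analogously when open terms are involved.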

15:00
A logical framework with a graph meta-language
PRESENTER: Bruno Cuconato

ABSTRACT. We conjecture that the relative unpopularity of logical frameworks among practitioners is partly due to their complex meta-languages, which often demand both programming skills and theoretical knowledge of the meta-language in question before they can be used fruitfully. We present ongoing work on a logical framework with a meta-language based on graphs. A simpler meta-language leads to a shallower embedding of the object language, but hopefully also to easier implementation and usage. A graph-based logical framework also opens up interesting possibilities for time and space performance, through the use of heavily optimized graph databases as backends and of proof-compression algorithms geared towards graphs. Deductive systems can be specified using simple domain-specific languages that build on top of the graph database's query language, and there is support for both interactive (through a web-based interface) and semi-automatic (following user-defined tactics specified by a domain-specific language) proof modes. We have so far implemented nine systems for propositional logics, including Fitch-style and backwards-directed natural deduction systems for intuitionistic and classical logic (with the classical systems reusing the rules of the intuitionistic ones), and a Hilbert-style system for the modal logic K.
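
As a hypothetical sketch (ours, not the system's actual schema) of the kind of structure such a backend might store, a proof can be recorded as a labelled graph whose nodes carry judgements and whose edges record rule applications:

    (* Hypothetical proof-graph representation; field and rule names are invented. *)
    type node = { id : int; judgement : string }                (* e.g. "|- A" *)
    type edge = { rule : string; premise : int; conclusion : int }
    type proof_graph = { nodes : node list; edges : edge list }

    (* An and-introduction step: premises "|- A" and "|- B" yield "|- A /\ B". *)
    let example : proof_graph =
      { nodes =
          [ { id = 1; judgement = "|- A" }
          ; { id = 2; judgement = "|- B" }
          ; { id = 3; judgement = "|- A /\\ B" } ]
      ; edges =
          [ { rule = "and-intro"; premise = 1; conclusion = 3 }
          ; { rule = "and-intro"; premise = 2; conclusion = 3 } ] }

In a graph database, queries over such a structure (e.g. finding all open premises) would be expressed in the database's query language, with the proposed DSLs layered on top.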

15:30
Towards a Semantic Framework for Policy Exchange in the Internet

ABSTRACT. The major issue addressed in autonomous data source management, notably within the framework of data exchange, is the variety of heterogeneous data schemas, i.e., the forms the data must take. But the semantics that give meaning to the factual data can be autonomous as well. In certain contexts, such as Internet routing, which lacks a central authority, both the data items (distributed routing tables that collectively constitute a global path) and the semantic data (policies that determine the selection of routing paths) are independently managed by the networks (autonomous systems, or ASes) participating in the Internet. A long-standing networking problem is inflexible policy routing due to the lack of policy-sharing mechanisms. This paper seeks to extend the classic data exchange framework to semantic data as a means to manage distributed, autonomous routing policies in the Internet. Specifically, we make the case for knowledge exchange, and use policy exchange as a driving problem (a special case) to understand the semantic framework of knowledge exchange: we identify several unique challenges, and illustrate how answer set programming (originally developed for knowledge representation with negation and non-monotonic reasoning) and transformation based on the residue method (originally developed for semantic query optimization) may serve as a point solution to the specific policy exchange problem.