FSCD 2020: INTERNATIONAL CONFERENCE ON FORMAL STRUCTURES FOR COMPUTATION AND DEDUCTION
PROGRAM FOR THURSDAY, JULY 2ND


13:00-14:00 Session 4: Invited Speaker: S. Ronchi Della Rocca. (all times are in CEST timezone - UTC+2)
13:00
A foundational lambda calculus for probabilistic computation

ABSTRACT. In order to model higher-order probabilistic computation, a natural approach is to take the λ-calculus as the general paradigm and to enrich it with a probabilistic construct. The simplest and most concrete way to do so is to equip the untyped λ-calculus with an operator ⊕, which models flipping a fair coin. The resulting calculus is, however, non-confluent, as was observed early on. We define a new probabilistic λ-calculus that can play the role of a foundational calculus for probabilistic computation: indeed, it satisfies all the basic properties we expect of such a calculus, namely confluence and standardization. This result is reached thanks to a suitable restriction of the reduction rule dealing with probabilistic choice. Moreover, the notion of solvability is extended to the probabilistic setting, and it is characterized in the calculus by an intersection type assignment system.
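The non-confluence mentioned above can be seen concretely. In the following sketch (illustrative only, not the paper's calculus), we model the fair-coin operator ⊕ by enumerating both outcomes with weight 1/2, and compare two reduction orders on the term (λx. x == x)(0 ⊕ 1): duplicating the value after flipping (call-by-value) versus duplicating the term and flipping twice (call-by-name).

```python
import itertools

def cbv_distribution():
    # Call-by-value: evaluate 0 oplus 1 once, then duplicate the value.
    dist = {}
    for v in (0, 1):                 # one coin flip, each branch weight 1/2
        result = (v == v)            # the duplicated *value*: always True
        dist[result] = dist.get(result, 0.0) + 0.5
    return dist

def cbn_distribution():
    # Call-by-name: duplicate the *term* 0 oplus 1, then flip independently.
    dist = {}
    for v1, v2 in itertools.product((0, 1), repeat=2):  # two flips, 1/4 each
        result = (v1 == v2)
        dist[result] = dist.get(result, 0.0) + 0.25
    return dist

print(cbv_distribution())  # {True: 1.0}
print(cbn_distribution())  # {True: 0.5, False: 0.5}
```

The two strategies produce different result distributions from the same term, which is exactly why an unrestricted ⊕ breaks confluence.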

14:00-14:30 Coffee Break
14:30-16:30 Session 5A: Grammars, Automata and Decision Trees
14:30
Adaptive Non-linear Pattern Matching Automata
PRESENTER: Rick Erkens

ABSTRACT. Efficient pattern matching is fundamental for practical term rewrite engines. By preprocessing the given patterns into a finite deterministic automaton, the matching patterns can be decided in a single traversal of the relevant parts of the input term. Most automaton-based techniques are restricted to linear patterns, in which each variable occurs at most once, and require an additional post-processing step to check so-called variable consistency. However, we show that interleaving the variable-consistency and pattern-matching phases can reduce the number of steps required to find a match. We take the adaptive pattern matching automata introduced by Sekar et al. and extend them with consistency checks. We prove that the resulting deterministic pattern matching automaton is correct, and that its evaluation is more efficient than two-phase approaches.
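The interleaving idea can be illustrated with a naive matcher (a hypothetical sketch, not the paper's automaton construction): the binding check for a repeated variable is performed the moment the variable is revisited, rather than in a separate post-processing pass, so a mismatch such as f(A, B) against f(x, x) fails early.

```python
def match(pattern, term, bindings=None):
    """Return a substitution if `term` matches `pattern`, else None.
    Variables are lowercase strings; applications are tuples ('f', arg, ...)."""
    if bindings is None:
        bindings = {}
    if isinstance(pattern, str) and pattern.islower():       # variable
        if pattern in bindings:                              # inline
            return bindings if bindings[pattern] == term else None  # consistency
        bindings[pattern] = term
        return bindings
    if isinstance(pattern, tuple) and isinstance(term, tuple) \
            and len(pattern) == len(term) and pattern[0] == term[0]:
        for p, t in zip(pattern[1:], term[1:]):
            if match(p, t, bindings) is None:
                return None
        return bindings
    return bindings if pattern == term else None

# The non-linear pattern f(x, x) matches f(A, A) but not f(A, B):
print(match(('f', 'x', 'x'), ('f', 'A', 'A')))  # {'x': 'A'}
print(match(('f', 'x', 'x'), ('f', 'A', 'B')))  # None
```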

15:00
Size-Preserving Translations from Order-(n+1) Word Grammars to Order-n Tree Grammars
PRESENTER: Kazuyuki Asada

ABSTRACT. Higher-order grammars have recently been studied actively in the context of automated verification of higher-order programs. Asada and Kobayashi have previously shown that, for any order-(n+1) word grammar, there exists an order-n grammar whose frontier language coincides with the language generated by the word grammar. Their translation, however, blows up the size of the input word grammar, which inhibited complexity-preserving reductions from decision problems on word grammars to those on tree grammars. In this paper, we present a new translation from order-(n+1) word grammars to order-n tree grammars that is size-preserving in the sense that the size of the output tree grammar is polynomial in the size of the input word grammar. The new translation and its correctness proof are arguably much simpler than the previous translation and proof.

15:30
Unital Anti-unification: Type and Algorithms

ABSTRACT. Unital theories are characterized by the property that some function symbols have (left and right) unit elements. We address three problems related to unital anti-unification (AU). First, we prove that when the term signature contains at least two unital functions, anti-unification is of type zero (nullary), by showing that there exists an AU problem which does not have a minimal complete set of generalizations. Next, we consider two special cases, the linear variant (computed generalizations are linear) and the one-unital fragment (problems are formulated over a language with one unital symbol), and design AU algorithms for them. The algorithms are terminating, sound, and complete, and return tree grammars from which sets of generalizations can be constructed. For the linear variant, the language of generalizations generated by the grammar is finite. In the one-unital fragment, the language might be infinite, but it contains a finite minimal complete set of generalizations. This means that both linear and one-unital AU are finitary. We also design an AU algorithm for the unrestricted case; it terminates and returns a tree grammar which produces an infinite set of generalizations.
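For contrast with the unital setting above, plain syntactic anti-unification (Plotkin's least general generalization) is straightforward and always yields a single generalization; it is the equation f(x, e) = x for a unit e that makes the set of generalizations blow up and motivates the grammar-based representation. A minimal sketch of the syntactic case (variable names x1, x2, ... are illustrative):

```python
def lgg(s, t, store=None, counter=None):
    """Least general syntactic generalization of terms s and t.
    Applications are tuples ('f', arg, ...); everything else is atomic."""
    if store is None:
        store, counter = {}, [0]
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        return (s[0],) + tuple(lgg(a, b, store, counter)
                               for a, b in zip(s[1:], t[1:]))
    if s == t:
        return s
    if (s, t) not in store:            # the same disagreement pair is
        counter[0] += 1                # always mapped to the same variable
        store[(s, t)] = f"x{counter[0]}"
    return store[(s, t)]

# g(f(a), a) and g(f(b), b) generalize to g(f(x1), x1):
print(lgg(('g', ('f', 'a'), 'a'), ('g', ('f', 'b'), 'b')))
```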

This nullarity proof can be extended to other equational theories such as associative-unital, commutative-unital, and their combination, when multiple function symbols may have these properties. However, our argument fails for idempotent-unital theories. We end with a list of open questions derived from this as well as earlier work.

16:00
Undecidability of Semi-unification on a Napkin *** Best paper by a junior researcher

ABSTRACT. Semi-unification (unification combined with matching) was proven undecidable by Kfoury, Tiuryn, and Urzyczyn in the 1990s. The original argument reduces Turing machine immortality via Turing machine boundedness to semi-unification. The latter part is the most technically challenging, involving several intermediate models of computation.

This work presents a novel, simpler reduction from Turing machine boundedness to semi-unification. In contrast to the original argument, we directly translate boundedness to solutions of semi-unification and vice versa. In addition, the reduction is mechanized in the Coq proof assistant. Taking advantage of the simpler proof, the mechanization is comparatively short and fully constructive.
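To make the problem concrete, recall what a semi-unification instance asks: an inequation s ≤ t is solved by a substitution σ if some further "matching" substitution ρ sends σ(s) to σ(t), i.e. ρ(σ(s)) = σ(t). The following sketch (illustrative, with a hypothetical term encoding) checks one candidate solution by hand.

```python
def subst(term, sigma):
    """Apply substitution sigma to a term (tuples = applications)."""
    if isinstance(term, tuple):
        return (term[0],) + tuple(subst(a, sigma) for a in term[1:])
    return sigma.get(term, term)

# Inequation: x <= f(x). The substitution sigma = {x -> y} solves it,
# because rho = {y -> f(y)} sends sigma(x) = y to sigma(f(x)) = f(y).
sigma = {"x": "y"}
rho = {"y": ("f", "y")}
lhs = subst(subst("x", sigma), rho)      # rho(sigma(x))
rhs = subst(("f", "x"), sigma)           # sigma(f(x))
print(lhs == rhs)                        # True
```

The undecidability result says there is no algorithm deciding, for arbitrary systems of such inequations, whether a solving σ exists.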

14:30-16:30 Session 5B: Categorical Semantics and Structures
14:30
Towards Constructive Hybrid Semantics
PRESENTER: Sergey Goncharov

ABSTRACT. With hybrid systems becoming ever more pervasive, the underlying semantic challenges emerge in their entirety. The need for principled semantic foundations has been recognised previously in the case of discrete computation and discrete data, with subsequent implementations in programming languages and proof assistants. Hybrid systems, by contrast, do not fit directly into the classical semantic paradigms, due to the presence of quite specific "non-programmable" features, such as Zeno behaviour and the inherent, indispensable reliance on an underlying notion of continuous time. Here, we analyse the phenomenon of hybrid semantics from a constructive viewpoint. In doing so, we propose a monad-based semantics, generic over a given ordered monoid representing the time domain, hence abstracting over the monoid of constructive reals. We implement our construction as a higher inductive-inductive type in the recent cubical extension of the Agda proof assistant, making significant use of state-of-the-art advances in homotopy type theory. We show that classically, i.e. under the axiom of choice, the constructed monad admits an alternative characterization in terms of directed chain completion.

15:00
String diagrams for optics

ABSTRACT. Optics are a data representation for compositional data access, with lenses as a popular special case. Hedges has presented a diagrammatic calculus for lenses, but in a way that does not generalize to other classes of optic. We present a calculus that works for all optics, not just lenses; this is done by embedding optics into their presheaf category, which naturally features string diagrams. We apply our calculus to the common case of lenses, and explore how the laws of optics manifest in this setting.
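Lenses, the special case highlighted above, can be rendered concretely as get/put pairs that compose sequentially (a hypothetical Python sketch, not the paper's diagrammatic calculus):

```python
class Lens:
    def __init__(self, get, put):
        self.get = get          # S -> A
        self.put = put          # (S, A) -> S

    def compose(self, other):
        """Sequential composition: focus through self, then through other."""
        return Lens(
            get=lambda s: other.get(self.get(s)),
            put=lambda s, a: self.put(s, other.put(self.get(s), a)),
        )

# A lens focusing on the first component of a pair.
fst = Lens(get=lambda s: s[0], put=lambda s, a: (a, s[1]))

inner = fst.compose(fst)                 # focus ((x, y), z) on x
print(inner.get(((1, 2), 3)))            # 1
print(inner.put(((1, 2), 3), 9))         # ((9, 2), 3)

# The well-known lens laws hold for fst:
s, a = (1, 2), 7
assert fst.put(s, fst.get(s)) == s       # GetPut
assert fst.get(fst.put(s, a)) == a       # PutGet
```

The diagrammatic calculus of the paper gives a graphical language for exactly this kind of composition, and for optics beyond the lens case.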

15:30
A Profunctorial Scott Semantics

ABSTRACT. In this paper, we study the bicategory of profunctors with the free finite coproduct pseudo-comonad and show that it constitutes a model of linear logic that generalizes the Scott model. We formalize the connection between the two models as a change of base for enriched categories which induces a pseudo-functor preserving all the linear logic structure. We prove that morphisms in the co-Kleisli bicategory correspond to strongly finitary functors (functors preserving sifted colimits) between presheaf categories. We further show that this model provides solutions to recursive type equations, yielding 2-dimensional models of the pure lambda calculus, and we exhibit a fixed-point operator on terms which allows for the study of recursively defined terms.

16:00
Comprehension and quotient structures in the language of 2-categories

ABSTRACT. Lawvere observed in his celebrated work on hyperdoctrines that the set-theoretic schema of comprehension can be expressed in the language of categorical logic, as a comprehension structure on the functor p: E -> B defining the hyperdoctrine. In this paper, we formulate and study a strictly ordered hierarchy of three notions of comprehension structure on a given functor p: E -> B, which we call (i) comprehension structures, (ii) comprehension structures with section, and (iii) comprehension structures with image. Our approach is 2-categorical, and we thus formulate the three levels of comprehension structure on a general morphism p: E -> B in a 2-category K. This conceptual point of view on comprehension structures enables us to revisit the work by Fumex, Ghani and Johann on the duality between comprehension structures and quotient structures on a given functor p: E -> B. In particular, we show how to lift the comprehension and quotient structures on a functor p: E -> B to the categories of algebras or coalgebras associated to endofunctors F_E: E -> E and F_B: B -> B of interest, in order to interpret reasoning by induction and coinduction in the traditional language of categorical logic, formulated in an appropriate 2-categorical way.

16:30-17:00 Coffee Break
17:00-18:30 Session 6A: Theorem Proving Algorithms and Constraint Solving
17:00
Constraint Solving over Multiple Similarity Relations
PRESENTER: Temur Kutsia

ABSTRACT. Similarity relations are reflexive, symmetric, and transitive fuzzy relations. They help to make approximate inferences, replacing the notion of equality. Similarity-based unification has been investigated quite intensively as a core computational method for approximate reasoning and declarative programming. In this paper we take a step forward, considering the solving of constraints over several similarity relations instead of a single one. Multiple similarities pose challenges to constraint solving, since we can no longer rely on the transitivity property. Existing methods for unification with fuzzy proximity relations (reflexive, symmetric, non-transitive relations) do not provide a solution that adequately reflects the particularities of dealing with multiple similarities simultaneously. To address this problem, we develop a constraint solving algorithm for multiple similarity relations, prove its termination, soundness, and completeness properties, and discuss applications.
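The loss of transitivity mentioned above is easy to see in a toy example (all names and degrees hypothetical): each relation is reflexive, symmetric, and transitive on its own, but constraints mixing two relations relate terms that are similar in neither.

```python
def degree(rel, x, y):
    """Similarity degree of x and y in relation `rel` (0.0 if unrelated)."""
    if x == y:
        return 1.0                       # reflexivity
    return rel.get((x, y), rel.get((y, x), 0.0))  # symmetry

R1 = {("a", "b"): 0.8}                   # a ~ b in R1, to degree 0.8
R2 = {("b", "c"): 0.9}                   # b ~ c in R2, to degree 0.9

# The constraint set {x =_R1 a, x =_R2 c} is solved by x = b, with
# overall degree the minimum over the individual constraints:
sol = min(degree(R1, "b", "a"), degree(R2, "b", "c"))
print(sol)                               # 0.8

# Yet a and c are similar in neither relation, so no transitivity-style
# shortcut between them is available:
print(degree(R1, "a", "c"), degree(R2, "a", "c"))   # 0.0 0.0
```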

17:30
A Type Checker for a Logical Framework with Union and Intersection Types - System description

ABSTRACT. We present the syntax, semantics, and typing rules of Bull, a prototype theorem prover based on the ∆-framework, i.e. a fully-typed lambda-calculus decorated with union and intersection types, as described in previous papers by the authors. Bull also implements a subtyping algorithm for the Type Theory Ξ of Barbanera-Dezani-de'Liguoro. Bull has a command-line interface where the user can declare axioms and terms, perform computations, and use basic terminal-style features such as error pretty-printing, subexpression highlighting, and file loading. Moreover, it can typecheck a proof or normalize it. These terms can be incomplete; the typechecking algorithm therefore uses unification to try to construct the missing subterms. Bull uses the syntax of Berardi's Pure Type Systems to improve the compactness and modularity of the kernel. Abstract and concrete syntax are mostly aligned, and similar to the concrete syntax of Coq. Bull uses a higher-order unification algorithm for terms, while typechecking and partial type inference are done by a bidirectional refinement algorithm, similar to the one found in Matita and Beluga. The refinement can be split into two parts: the essence refinement and the typing refinement. Binders are implemented using the commonly used de Bruijn indices. We have defined a concrete language syntax that allows users to write ∆-terms, defined the reduction rules and an evaluator, and implemented from scratch a refiner which does partial typechecking and type reconstruction. We have experimented with Bull on classical examples from the intersection and union type literature, such as the ones formalized by Pfenning with his Refinement Types in LF. We hope that this research vein will prove useful for experimenting, in a proof-theoretical setting, with forms of polymorphism alternative to Girard's parametric one.

18:00
Efficient Full Higher-Order Unification *** Best paper by junior researchers

ABSTRACT. We developed a procedure to enumerate complete sets of higher-order unifiers based on work by Jensen and Pietrzykowski. Our procedure removes many redundant unifiers by carefully restricting the search space and tightly integrating decision procedures for fragments that admit a finite complete set of unifiers. We identify a new such fragment and describe an algorithm for computing its unifiers. Our unification procedure is implemented in the Zipperposition theorem prover. Experimental evaluation shows a clear advantage over Jensen and Pietrzykowski's procedure.

17:00-18:00 Session 6B: Program Specification
17:00
Hierarchy Builder: algebraic hierarchies made easy in Coq with Elpi - System description
PRESENTER: Cyril Cohen

ABSTRACT. It is nowadays customary to organize libraries of machine checked proofs around hierarchies of algebraic structures. One influential example is the Mathematical Components library on top of which the long and intricate proof of the Odd Order Theorem could be fully formalized.

Still, building algebraic hierarchies in a proof assistant such as Coq requires a lot of manual labor and often a deep expertise in the internals of the prover. Moreover, according to our experience, making a hierarchy evolve without causing breakage in client code is equally tricky: even a simple refactoring such as splitting a structure into two simpler ones is hard to get right.

In this paper we describe HB, a high-level language to build hierarchies of algebraic structures and to make these hierarchies evolve without breaking user code. The key concepts are those of factory, builder, and abbreviation, which let the hierarchy developer describe an actual interface for their library. Behind that interface, the developer can provide appropriate code to ensure backward compatibility. We implement the HB language in the hierarchy-builder add-on for the Coq system, using the Elpi extension language.

17:30
Modules over monads and operational semantics
PRESENTER: Ambroise Lafont

ABSTRACT. This paper is a contribution to the search for efficient and high-level mathematical tools to specify and reason about (abstract) programming languages or calculi. Generalising the reduction monads of Ahrens et al., we introduce operational monads, thus covering new applications such as the pi-calculus, Positive GSOS specifications, and the big-step, simply-typed, call-by-value lambda-calculus. Finally, we design a notion of signature for operational monads that covers all our examples.