CPP 2016: THE FIFTH INTERNATIONAL CONFERENCE ON CERTIFIED PROGRAMS AND PROOFS
PROGRAM FOR MONDAY, JANUARY 18TH

09:00-10:00 Session 1: Invited talk by Harvey Friedman

Invited talk by Harvey Friedman, Ohio State University

Title: Perspectives on Formal Verification

Abstract: I will discuss the importance, uses, and future directions of formal verification from the point of view of a mathematical foundationalist. These include issues of certainty, proof structure, and (finitary) completeness and decidability.

10:30-12:00 Session 2: Verifying Imperative Programs

10:30
Higher-order Representation Predicates in Separation Logic

ABSTRACT. In Separation Logic, representation predicates are used to describe mutable data structures, by establishing a relationship between the entry point of the structure, the piece of heap over which this structure spans, and the logical model associated with the structure. When a data structure is polymorphic, such as in the case of a container, its representation predicate needs to be parameterized not just by the type of the items stored in the structure, but also by the representation predicates associated with these items. Such higher-order representation predicates can be used in particular to control whether containers should own their items. In this paper, we present, through a collection of practical examples, solutions to the challenges associated with reasoning about accesses into data structures that own their elements.
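
For illustration only (this sketch is not taken from the paper, and the field names hd and tl are assumed), a higher-order representation predicate for mutable lists can be written as follows, where the item predicate R relates a logical item x to the value v stored in a cell:

    \[ \mathrm{MList}\;R\;\mathit{nil}\;p = [\,p = \mathit{null}\,] \]
    \[ \mathrm{MList}\;R\;(x :: L)\;p = \exists v\,q.\;\; p \mapsto \{\mathit{hd} = v;\ \mathit{tl} = q\} \,\star\, R\;x\;v \,\star\, \mathrm{MList}\;R\;L\;q \]

Instantiating R with a pure equality between x and v describes a list of plain values, while instantiating it with another representation predicate describes a container that owns its items.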

11:00
A Unified Coq Framework for Verifying C Programs with Floating-Point Computations

ABSTRACT. We provide concrete evidence that floating-point computations in C programs can be verified in a homogeneous verification setting based on Coq alone: we evaluate the practicality of combining the formal semantics of CompCert Clight with the Flocq formal specification of IEEE 754 floating-point arithmetic to verify properties of floating-point computations in C programs. To this end, we develop a framework that automatically computes real-number expressions of C floating-point computations, with rounding error terms, along with their correctness proofs. We apply our framework to the complete analysis of an energy-efficient C implementation of a radar image processing algorithm, for which we provide a certified bound on the total noise introduced by floating-point rounding errors and by energy-efficient approximations of square root and sine.
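
As background, the per-operation bound that such a rounding-error analysis accumulates is the standard IEEE 754 model, stated here for binary64 with rounding to nearest and assuming no underflow or overflow (general context, not the paper's exact statement):

    \[ \mathrm{fl}(x \odot y) = (x \odot y)(1 + \delta), \qquad |\delta| \le 2^{-53}, \qquad \odot \in \{+, -, \times, /\}. \]

Composing such error terms across a program is what yields a real-number expression of the computation together with a certified bound on the total rounding noise.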

11:30
Refinement Based Verification of Imperative Data Structures
SPEAKER: Peter Lammich

ABSTRACT. In this paper we present a top-down approach to verified imperative data structures based on stepwise refinement. Our approach is modular in the sense that already verified data structures can be used in the construction of more complex ones. Moreover, our data structures can be used to verify real algorithms. Our tool chain supports refinement down to executable code in various programming languages and is fully implemented in Isabelle/HOL, such that its trusted code base consists only of the inference kernel and the code generator of Isabelle/HOL.

As a case study, we verify an indexed heap data structure, and use it to generate an efficient verified implementation of Dijkstra's algorithm.
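
As a rough illustration of the interface involved (a hypothetical OCaml sketch, not the Isabelle/HOL artifact from the paper), an indexed heap provides a priority queue over node indices with a decrease-key operation, which is what Dijkstra's algorithm requires:

    (* Hypothetical signature of an indexed heap: a priority queue over node
       indices supporting decrease-key. A refinement-based development would
       relate an abstract specification of these operations to an efficient
       imperative implementation. *)
    module type INDEXED_HEAP = sig
      type t
      val create       : int -> t                  (* empty heap over n node indices *)
      val insert       : t -> int -> int -> unit   (* insert a node with a priority *)
      val decrease_key : t -> int -> int -> unit   (* lower the priority of a node *)
      val pop_min      : t -> int * int            (* remove and return (node, priority) *)
      val is_empty     : t -> bool
    end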

14:00-15:30 Session 3: Design and Implementation of Theorem Provers

14:00
The Vampire and the FOOL

ABSTRACT. This paper presents new features recently implemented in the theorem prover Vampire, namely support for first-order logic with a first-class boolean sort (FOOL) and for polymorphic arrays. In addition to having a first-class boolean sort, FOOL also contains if-then-else and let-in expressions. We argue that the presented extensions facilitate reasoning-based program analysis, both by increasing the expressivity of first-order reasoners and through gains in efficiency.
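
To convey the flavour of FOOL (an invented example, not one from the paper), boolean-sorted terms, if-then-else and let-in may occur directly inside formulas:

    \[ \forall x{:}\mathit{int}.\;\; \mathsf{let}\; y = f(x) \;\mathsf{in}\; \big(\mathsf{if}\; p(y) \;\mathsf{then}\; g(y) = a \;\mathsf{else}\; g(y) = b\big) \]

Here p(y) is a term of the boolean sort used as a condition, something a translation to plain first-order logic would otherwise have to encode away.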

14:30
Improving automation in interactive theorem provers by efficient encoding of lambda-abstractions

ABSTRACT. Hammers are tools for employing external automatic theorem provers (ATPs) to improve automation in formal proof assistants. Strong automation can greatly ease the task of developing formal proofs. An essential component of any hammer is the translation from the logic of a proof assistant to the format of an ATP. The formalisms of state-of-the-art ATPs are usually first-order, so some method for eliminating lambda-abstractions is needed. We present an experimental comparison of several combinatory abstraction algorithms for HOL(y)Hammer -- a hammer for HOL Light. The algorithms are compared on the HOL Light standard library extended with a library for multivariate analysis. We succeeded in developing algorithms that outperform both lambda-lifting and the simple Schönfinkel algorithm used in Sledgehammer for Isabelle/HOL. This visibly improves the ATPs' success rate on translated problems, thus enhancing automation in proof assistants.
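
For reference, the basic Schönfinkel translation mentioned above eliminates a lambda-abstraction using the S, K and I combinators; the algorithms compared in the paper are refinements of this scheme:

    \[ \lambda x.\,x = I, \qquad \lambda x.\,t = K\,t \;\;(x \text{ not free in } t), \qquad \lambda x.\,(s\;t) = S\,(\lambda x.\,s)\,(\lambda x.\,t). \]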

15:00
Mizar Environment for Isabelle

ABSTRACT. The Isabelle/Isar language was heavily inspired by the Mizar style; however, it has differed in many ways from the beginning and has evolved in directions different from the syntax of the Mizar language. These differences were mostly motivated by the particular needs of integration with Isabelle at a given time, in particular the need to make various proof tactics and other techniques associated with the LCF style available.

In this paper we explore the possibility of emulating the Mizar environment as closely as possible inside the Isabelle logical framework. We introduce adaptations to the Isabelle/FOL object logic that correspond to the meta-logic of Mizar, as well as Isar inner-syntax notations that correspond to those of the Mizar language. We show how meta-level types can be used to differentiate between the syntactic categories of the Mizar language, such as sets and Mizar types including modes and attributes, and show how they interact with the basic constructs of Tarski-Grothendieck set theory. We discuss Mizar definitions and provide simple abbreviations that allow the introduction of Mizar predicates, functions, attributes and modes using the Pure definition syntax. We finally consider the definite and indefinite description operators in Mizar and their use to introduce definitions by "means" and "equals". We demonstrate the usability of the environment on a sample Mizar-style formalization, with cluster inferences and "by" steps performed manually.

16:00-18:00 Session 4: Mathematics

16:00
A Modular, Efficient Formalisation of Real Algebraic Numbers

ABSTRACT. This paper presents a construction of real algebraic numbers with executable arithmetic operations in Isabelle/HOL. Instead of relying on verified resultants, arithmetic operations on real algebraic numbers are based on a procedure that decides the sign of a bivariate polynomial at a real algebraic point. The modular design allows the safe use of fast external code. This work can serve as the basis for decision procedures that rely on real algebraic numbers.
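
As background on the objects involved (a standard representation, not necessarily the paper's exact encoding), a real algebraic number is typically given by a defining polynomial together with an isolating interval:

    \[ \alpha \;\sim\; (p, [l, u]), \qquad p \in \mathbb{Q}[x],\; p(\alpha) = 0,\; \alpha \text{ the unique root of } p \text{ in } [l, u]. \]

Deciding the sign of a bivariate polynomial at a pair of such numbers then suffices to compare them and to validate the results of arithmetic operations.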

16:30
Formal proofs of transcendence for e and pi as an application of multivariate and symmetric polynomials

ABSTRACT. We describe the formalisation in Coq of a proof that the numbers e and pi are transcendental. This is the first formalized proof of transcendence for pi. This proof lies at the interface of two domains of mathematics that are often considered separately: calculus (real and elementary complex analysis) and algebra. For the work on calculus, we rely on the Coquelicot library, and for the work on algebra, we rely on the Mathematical Components library. Moreover, some elements of the formalized proof originate in the older library for real numbers included in the Coq distribution. The case of pi relies extensively on properties of multivariate polynomials, and this experiment was also an occasion to put a newly developed library for multivariate polynomials to the test. In the paper, we describe the various original aspects of the work: orchestrating a development based on several libraries, designing a new library for multivariate polynomials, and specialized developments carried out solely for this proof. Apart from achieving a landmark result, this work also teaches lessons about the evolution of modern formalized mathematics.
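
For context, the property being formalized is the standard one (stated here informally, not as the exact Coq statement): a number alpha is transcendental when it is a root of no nonzero polynomial with rational coefficients,

    \[ \forall p \in \mathbb{Q}[X].\;\; p \neq 0 \;\Rightarrow\; p(\alpha) \neq 0. \]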

17:00
Formalizing Jordan Normal Forms in Isabelle/HOL

ABSTRACT. In automated complexity analysis of term rewriting, estimating the growth rate of the values in A^n -- for a fixed matrix A and increasing n -- is of fundamental interest. This growth rate can be exactly characterized via A's Jordan normal form (JNF). We formalized this result in our library IsaFoR and our certifier CeTA, and thereby improve the support for certifying polynomial bounds derived by (untrusted) complexity analysis tools. To this end, we develop a new library for matrices that makes it convenient to work with block matrices. Besides the mentioned complexity result, we formalize the Gram-Schmidt orthogonalization algorithm and the Schur decomposition in order to prove the existence of JNFs. We also provide a uniqueness result for JNFs which allows us to compute Jordan blocks for individual eigenvalues. In order to determine eigenvalues automatically, we moreover formalize Yun's square-free factorization algorithm.
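
For intuition, the standard fact underlying this characterization (general background, not the formalized statement itself) is that if A = P J P^{-1} with J built from Jordan blocks J_k(lambda) of size k for eigenvalue lambda, then the entries of A^n are bounded by the block contributions:

    \[ \big|(A^n)_{ij}\big| \;\in\; O\Big( \max_{J_k(\lambda)} n^{\,k-1}\, |\lambda|^{\,n} \Big), \]

so the growth of A^n is polynomially bounded exactly when every eigenvalue has absolute value at most 1, with the degree governed by the sizes of the Jordan blocks of the eigenvalues on the unit circle.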

17:30
Formalization of a Newton series representation of polynomials

ABSTRACT. We formalize an algorithm that changes the representation of a polynomial to a Newton power series. This provides a way to efficiently compute polynomials whose roots are the sums or products of the roots of other polynomials, and hence provides a basic component for efficient computation with algebraic numbers. In order to achieve this, we formalize a notion of partial power series and develop an abstract theory of poles of fractions.
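
As background (a standard definition, not necessarily the paper's exact formalization), the Newton power series of a monic polynomial p of degree d with roots alpha_1, ..., alpha_d collects the power sums of its roots:

    \[ N_p(t) \;=\; \sum_{k \ge 0} \Big( \sum_{i=1}^{d} \alpha_i^{\,k} \Big)\, t^k. \]

In this representation, the power sums for products of roots factor as products of power sums, and sums of roots can be handled through series exponentials and logarithms, which is what avoids resultant computations.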