VSL 2014: VIENNA SUMMER OF LOGIC 2014
PAS ON THURSDAY, JULY 17TH, 2014


09:00-10:15 Session 64C: Opening + Invited Talk by Viorica Sofronie-Stokkermans
Location: FH, Seminarraum 138A
09:00
Hierarchical reasoning and quantifier elimination for the verification of parametric reactive and hybrid systems
10:15-10:45 Coffee Break
10:45-12:30 Session 66AN: Automated Debugging
Location: FH, Seminarraum 138A
10:45
An Attempt to Formalize Popper's Logic of Scientific Discovery
SPEAKER: Wei Li

ABSTRACT. In 1935, Karl Popper published his famous book "The Logic of Scientific Discovery." In this book he proposed a general procedure of scientific discovery containing the following key points. (1) A scientific theory consists of "conjectures" about domain knowledge. (2) Every conjecture may be disproved; that is, conclusions or predictions derived from it may be, and must be, subject to verification against the observations of human beings and scientific experiments on natural phenomena. (3) When the conclusions or predictions contradict the results of observations or experiments, the theory is refuted by facts. (4) In this case, one should abandon those conjectures in the current theory which lead to contradictions, propose new conjectures, and form a new theory. (5) Every theory obtained in this way is provisional: it may be refuted by facts at any time. Truth in science is reached in a process of unlimited exploration.

The above procedure is stated at the level of the philosophy of science. In this paper, we provide a mathematical-logical framework for it, consisting of the following four parts. (1) Representability theory: the theory of representing, in first-order languages, the basic concepts involved in the general procedure of scientific discovery, such as theory, conjecture, provability, and refutation by facts. (2) R-calculus: a revision calculus for deleting from the current theory those conjectures refuted by facts or experiments. It is a formal inference system over logical connectives and quantifiers, and it is sound, complete, and reachable. (3) The theory of version evolution: the theory of version sequences, their limits, proschemes, and the convergence, soundness, and independence of proschemes. (4) I-calculus: a calculus for forming conjectures. It is a formal inference system over contexts and involves a theory of rationality for conjecture calculus.

The above four parts constitute the mathematical framework for the logic of scientific discovery, which may be viewed as a formal description of "the logic of scientific discovery." As examples, we use this framework to give formal descriptions of Einstein's discovery of the special theory of relativity and Darwin's proposal of the theory of evolution. The difference between our mathematical framework and the general procedure of Popper's logic of scientific discovery, in the sense of the philosophy of science, is that the former can be implemented on a computer: the revision of scientific theories and the evolution of version sequences can be accomplished via human-computer interaction, and when the current theory is refuted by facts, the computer can propose schemes for rational revision.

11:30
A Model Theory for R-calculus without Cut
SPEAKER: unknown

ABSTRACT. In this paper, we propose a model theory for R-calculus without cut. First, the validity of R-transitions and R-transition sequences is defined using models of first-order languages. Then the reducibility of R-transition sequences is defined, and it is proved that every R-transition sequence that transits to an R-termination can be reduced to an irreducible R-transition sequence. Finally, the soundness and completeness of R-calculus are redefined, and R-calculus without cut is proved to remain sound and complete.

12:00
RCK: A Software Toolkit for R-calculus
SPEAKER: unknown

ABSTRACT. R-calculus is an inference system for formally deducing all possible maximal contractions of a theory in a first-order language. Based on the theory of R-calculus, we implemented a software toolkit, called RCK, to perform R-calculus on a computer. The toolkit enables users to obtain and verify maximal contractions of a given theory whenever the theory is refuted by given facts. We briefly introduce the framework of RCK and demonstrate how R-calculus can serve scientific discovery.
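
The RCK toolkit itself is not shown here, but the notion of a maximal contraction is easy to illustrate in a toy setting: a maximal subset of a theory that remains consistent with the refuting facts. The brute-force propositional sketch below is our own illustration (names and encoding are ours, and it is unrelated to RCK's actual implementation):

```python
from itertools import product, combinations

def consistent(formulas, atoms):
    """Brute-force satisfiability over truth tables (fine at toy sizes)."""
    return any(all(f(dict(zip(atoms, vals))) for f in formulas)
               for vals in product([False, True], repeat=len(atoms)))

def maximal_contractions(theory, facts, atoms):
    """All maximal subsets of `theory` (name -> formula) consistent with `facts`."""
    names = sorted(theory)
    found = []
    for k in range(len(names), -1, -1):          # largest subsets first
        for subset in combinations(names, k):
            if any(set(subset) <= set(m) for m in found):
                continue                          # already inside a maximal subset
            if consistent([theory[n] for n in subset] + facts, atoms):
                found.append(subset)
    return found

# Popper-style toy: the theory {p, p->q} is refuted by the observed fact not-q;
# the two maximal contractions keep either conjecture but not both.
theory = {"p": lambda e: e["p"],
          "p->q": lambda e: (not e["p"]) or e["q"]}
facts = [lambda e: not e["q"]]
```

Here the refuted theory has two maximal contractions, {p} and {p->q}, mirroring the idea that a revision calculus should offer all rational ways to retreat from a contradiction.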

13:00-14:30 Lunch Break
14:30-16:00 Session 75AM: Invited Talk by Bruno Buchberger
Location: FH, Seminarraum 138A
14:30
The Theorema Approach to Mathematics

ABSTRACT. In this talk I will give an overview of the main ideas of my "Theorema" approach to mathematics, which reflects the view of mathematics that has evolved over my life as a mathematician ("from 17 to 71"):

- Mathematics is the "art of explanation", the art of reducing the complex to the simple. Hence, mathematics is of highest relevance for the "wo-man on the street".

- Mathematics is in the center of the "innovation spiral" that starts from basic research on the nature of nature and goes all the way via technology to economy and welfare. Hence, mathematics is of highest relevance to our society.

- Mathematics is the meta-theory of itself. It can explain how we derive new mathematical knowledge and methods from available knowledge and methods. This is possible because mathematics is essentially formal. The application of mathematics to itself, i.e. to the mathematical exploration process, was hardly noticeable until the middle of the 20th century. Now, however, we can observe what "application of mathematics to itself" means in practice. This will be of highest relevance to the way we do mathematics (including software) in the future; i.e., the (semi-)automation of the mathematical exploration process will be of highest relevance to the "working mathematicians".

I think that predicate logic (in a nicely readable two-dimensional, even user-definable, syntax) is the natural frame for providing a version of mathematics that pleases the wo-man on the street, society, and the working mathematicians. And all effort and attention must go to automating natural reasoning in predicate logic, to please everybody with ever more attractive mathematics.

This is the reason why, around the middle of the 1990s, I launched the Theorema Project.

In the talk, I will give an overview of the current state of the project, which - due to my many services for "society" that took much of my time - is not yet at a satisfactory stage. However, I hope to explain a few highlights and to draw a picture from which the avenues to the future can be guessed.

Of course, Theorema is just one in a group of research projects that pursue the automation of mathematics. I am very optimistic, nervous, and excited to enjoy the dawn of a new era of mathematics that will put mathematics right into the center of modern society in a tangible, natural, artistic and surprising way.

15:00
A note on real quantifier elimination by virtual term substitution of unbounded degree

ABSTRACT. We describe work in progress on constructing a complete implementation of Weispfenning's virtual term substitution method for real closed field formulas of unbounded degree. We build upon a recent high-performance library for computing over non-Archimedean real closed fields using Thom's Lemma.

(Extended abstract)
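
Virtual term substitution itself is intricate, especially at unbounded degree. For intuition only, the linear, strict-inequality special case of real quantifier elimination can be handled by the classical Fourier-Motzkin procedure; the sketch below is ours and is not the authors' method (bounds affine in one parameter y, exact rational arithmetic):

```python
from fractions import Fraction as F

def fm_eliminate(lowers, uppers):
    """Fourier-Motzkin: eliminate x from a conjunction of l(y) < x and x < u(y).

    Each bound is affine in y, encoded as a pair (a, b) meaning a*y + b.
    Returns a list of pairs (c, d), each meaning c*y < d; their conjunction
    is equivalent to the existential statement over the reals.
    """
    out = []
    for (la, lb) in lowers:
        for (ua, ub) in uppers:
            # l(y) < u(y)  <=>  (la - ua)*y < ub - lb
            out.append((la - ua, ub - lb))
    return out

# Exists x ( y < x  and  x < 2y - 3 )  reduces to  -y < -3,  i.e.  y > 3.
result = fm_eliminate([(F(1), F(0))], [(F(2), F(-3))])
```

Virtual term substitution generalizes this idea by substituting symbolic "test points" (roots, roots plus an infinitesimal, minus infinity) into the formula, which is what lets it go beyond the linear case.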

16:00-16:30 Coffee Break
16:30-19:00 Session 79A: VSL Joint Award Ceremony 1
Location: MB, Kuppelsaal
16:30
Foundations and Technology Competitions Award Ceremony

ABSTRACT. The third round of the Kurt Gödel Research Prize Fellowships Program, under the title "Connecting Foundations and Technology," aims to support young scholars in the early stages of their academic careers by offering the highest fellowships in the history of logic, kindly supported by the John Templeton Foundation. Young scholars who are at most 40 years old at the commencement of the Vienna Summer of Logic (July 9, 2014) are eligible; one fellowship award of EUR 100,000 will be given in each of the following categories:

  • Logical Foundations of Mathematics,
  • Logical Foundations of Computer Science, and
  • Logical Foundations of Artificial Intelligence

The following three Boards of Jurors were in charge of choosing the winners:

  • Logical Foundations of Mathematics: Jan Krajíček, Angus Macintyre, and Dana Scott (Chair).
  • Logical Foundations of Computer Science: Franz Baader, Johann Makowsky, and Wolfgang Thomas (Chair).
  • Logical Foundations of Artificial Intelligence: Luigia Carlucci Aiello, Georg Gottlob (Chair), and Bernhard Nebel.

http://fellowship.logic.at/

17:30
FLoC Olympic Games Award Ceremony 1

ABSTRACT. The aim of the FLoC Olympic Games is to start a tradition in the spirit of the ancient Olympic Games, a Panhellenic sports festival held every four years in the sanctuary of Olympia in Greece, this time in the scientific community of computational logic. Every four years, as part of the Federated Logic Conference, the Games will gather the challenging disciplines from across computational logic in the form of solver competitions.

At the award ceremonies, the competition organizers will have the opportunity to present their competitions to the public and to award special prizes, the prestigious Kurt Gödel medals, to their successful competitors. This serves the main goal of the FLoC Olympic Games: to raise the visibility of the competitions associated with the conferences and workshops of the Federated Logic Conference during the Vienna Summer of Logic.

This award ceremony will host the

  • 3rd Confluence Competition (CoCo 2014);
  • Configurable SAT Solver Challenge (CSSC 2014);
  • Ninth Max-SAT Evaluation (Max-SAT 2014);
  • QBF Gallery 2014; and
  • SAT Competition 2014 (SAT-COMP 2014).

18:15
FLoC Closing Week 1
SPEAKER: Helmut Veith
16:30-18:30 Session 80G: Program Analysis
Location: FH, Seminarraum 138A
16:30
Successfully Coping with Scalability of Symbolic Analysis in Timing Analysis
SPEAKER: unknown

ABSTRACT. Symbolic analysis is known and valued for its strength in statically inferring non-trivial properties of programs. This strength, however, typically comes at high computational cost, which often prevents full path coverage for real-world applications.

In this presentation we revisit two examples of successfully coping with, and circumventing, the scalability issues that arise in straightforward applications of symbolic analysis to the timing analysis of real-time systems. In the first approach, we restrict symbolic analysis to program loops matching particular structural patterns, for which symbolic analysis reduces to solving a set of recurrence equations; this requires neither explicit symbolic path analysis nor path coverage of these loops. In the second approach, we focus pathwise symbolic analysis on a set of program paths, usually small and manageable, that other program analyses identify as crucial for the timing behaviour of the program. The two approaches have complementary, orthogonal applications in different stages of the timing analysis of safety-critical real-time systems and have proved useful in practice.
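
The first approach rests on a simple observation: for loops of a recognized shape, an iteration bound follows from a closed-form solution of a recurrence, with no path enumeration. A minimal sketch of that idea, entirely our own and not the authors' analysis, for a geometric loop:

```python
import math

def simulated_iterations(i0, n, c):
    """Explicitly execute `i = i0; while i < n: i = c * i` and count iterations
    (stands in for what exhaustive path-by-path analysis would pay for)."""
    i, count = i0, 0
    while i < n:
        i, count = c * i, count + 1
    return count

def recurrence_bound(i0, n, c):
    """Closed form from the recurrence i_k = c**k * i0: the loop bound is the
    smallest k with c**k * i0 >= n, i.e. ceil(log_c(n / i0)).
    (Float rounding can bite when n / i0 is an exact power of c.)"""
    if i0 >= n:
        return 0
    return math.ceil(math.log(n / i0) / math.log(c))
```

The closed form is evaluated in constant time, which is the point: a pattern-matched loop never needs its paths unrolled.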

17:00
Confluence of Pattern-Based Calculi with Finitary Matching
SPEAKER: unknown

ABSTRACT. The pattern calculus described in this paper integrates the functional mechanism of the lambda-calculus with the capabilities of finitary pattern matching algorithms. We propose a generic confluence proof, where the way pattern abstractions are applied in a non-deterministic calculus is axiomatized. Instances of the matching function are presented.

17:30
The Verification of Conversion Algorithms between Finite Automata

ABSTRACT. This paper presents three verified conversion algorithms between finite automata in Isabelle/HOL. The specifications of all algorithms are formalized directly, and Isabelle/HOL can generate executable programs from them. The correctness of each algorithm is fully verified from three perspectives: the loops in each algorithm terminate, the output of each algorithm is an automaton of the expected type, and the input and output of each algorithm accept the same language. The whole development comprises over 3000 lines of Isabelle/HOL code.
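
The verified Isabelle/HOL development is not reproduced here, but one conversion of the kind being verified, NFA to DFA via subset construction, can be sketched in plain, unverified Python (our own illustration; the paper's formalization and its proof obligations are not shown):

```python
def nfa_to_dfa(alphabet, delta, start, accept):
    """Subset construction for an NFA without epsilon moves.
    delta maps (state, symbol) to a set of successor states."""
    start_set = frozenset([start])
    dfa_delta, todo, seen = {}, [start_set], {start_set}
    while todo:                                   # terminates: finitely many subsets
        S = todo.pop()
        for a in alphabet:
            T = frozenset(q for s in S for q in delta.get((s, a), ()))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    # A subset-state accepts iff it contains some NFA accepting state.
    return dfa_delta, start_set, {S for S in seen if S & accept}

def dfa_accepts(dfa_delta, start, accepting, word):
    """Run the constructed DFA on a word."""
    S = start
    for a in word:
        S = dfa_delta[(S, a)]
    return S in accepting

# NFA accepting exactly the words over {0,1} that end in 1.
delta = {("q0", "0"): {"q0"}, ("q0", "1"): {"q0", "q1"}}
d, s0, acc = nfa_to_dfa({"0", "1"}, delta, "q0", {"q1"})
```

The paper's three proof obligations map onto this sketch directly: the while loop terminates (the powerset of states is finite), the output is a well-typed automaton, and input and output accept the same language, the last being the property one can only spot-check on sample words without a proof assistant.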

18:00
Source code profiling and classification for automated detection of logical errors

ABSTRACT. Research and industrial experience reveal that code reviews, as part of software inspection, may be the most cost-effective technique a team can use to reduce defects. Tools that automate code inspection mostly focus on detecting a priori known defect patterns and security vulnerabilities. Automated detection of logical errors, i.e. errors due to a faulty implementation of an application's functionality, is relatively uncharted territory. Automation can be based on profiling the intended behavior behind the source code. In this paper, we present a code profiling method based on token classification. Our method combines information flow analysis, the cross-checking of dynamic invariants with symbolic execution, and code classification heuristics using a fuzzy logic system. Our goal is to detect logical errors and exploitable vulnerabilities. The theoretical underpinnings and the practical implementation of our approach are discussed. We test APP_LogGIC, the tool that implements the proposed analysis, on two real-world applications. The results show that profiling the intended program behavior is feasible in diverse applications. We discuss in detail the heuristics used to overcome the problems of state-space explosion and of large data sets. Code metrics and test results demonstrate the effectiveness of the proposed approach. This paper extends work recently submitted to an international conference and currently under review; in this extended version we propose new classification mechanisms and give a detailed description of the source code classification techniques and heuristics used.