SCSS 2016: Papers with Abstracts

Abstract. Partial evaluation and staging are two well-known symbolic program manipulation techniques that generate efficient specialized code. On one hand, partial evaluation provides an automatic means to separate a program into two (or more) stages, but its behavior is perceived as hard to control. On the other hand, staging (or staged calculus) requires us to separate a program manually, but with full control over its behavior. In previous work, the first author introduced a framework that relates the two techniques, giving a unified view of them. In this paper, we extend the framework to handle cross-stage persistence (CSP) and show that the 2-level staging annotation obtained by the automatic separation is the best staging annotation in a system where CSP is allowed for base-type values only. In the presence of CSP for higher-type values, on the other hand, there is no single annotation that is better than all the others.

Abstract. This paper describes the formalization of the arithmetization of Euclidean geometry in the Coq proof assistant. As a basis for this work, Tarski's system of geometry was chosen for its well-known metamathematical properties. This work completes our formalization of the two-dimensional results contained in part one of Metamathematische Methoden in der Geometrie. We define the arithmetic operations geometrically and prove that they satisfy the properties of an ordered field. Then, we introduce Cartesian coordinates and provide an algebraic characterization of the main geometric predicates. In order to prove the characterization of the segment congruence relation, we provide a synthetic formal proof of two crucial theorems in geometry, namely the intercept theorem and Pythagoras' theorem. The arithmetization of geometry justifies the use of algebraic automated deduction methods in geometry.
We give an example of the use of this formalization by deriving, from Tarski's system of geometry, a formal proof of the nine-point circle theorem using Gröbner bases.

Abstract. We analyze the space complexity of monitoring streams of messages whose expected behavior is specified in a fragment of predicate logic; this fragment is the core of the LogicGuard specification language, which has been developed in an industrial context for the runtime monitoring of network traffic. The execution of the monitors is defined by an operational semantics for the stepwise evaluation of formulas, which requires preserving instances of the formulas in memory until their truth value can be determined. In the presented work, we analyze the number of instances that have to be preserved over time for a significant fragment of the core language that involves only "future-looking quantifiers"; this lays the foundations for the space analysis of the entire core language.

Abstract. A stepper is a tool that displays all the steps of a program's execution. To implement a stepper, we need to reconstruct each intermediate program from the current redex and the evaluation context. We regard evaluation contexts as delimited continuations and capture them using the control operators shift and reset. This enables us to implement a stepper concisely by writing an evaluator that is close to the standard big-step interpreter. Our implementation is a non-trivial application of shift and reset.

Abstract. We solve the Expression Compatibility Problem (ECP), a variation of the famous Expression Problem (EP) which, in addition to the classical EP concerns, takes into consideration the replacement, refinement, and borrowing of algebraic datatype (ADT) cases. ECP describes ADT cases as components and promotes ideas from Lightweight Family Polymorphism, Class Sharing, and the Expression Families Problem.
Our solution is based on a formal model for Component-Based Software Engineering that pertains to the Expression Problem. We provide the syntax, static semantics, and dynamic semantics of our model. We also show that our model can be used to solve the Expression Families Problem as well. Moreover, we show how to embed the model in Scala.

Abstract. We implemented a Mathematica module for CGA which includes functions to denote CGA elements and to compute several operations in CGA. We can draw the figure in 3D space corresponding to a CGA element. Our draw function uses Gröbner bases to simplify the equations of figures, and it can be used for figures of any dimension. One of our motivations is to realize a 3D origami system using our own CGA library. We follow the 2D computational origami system E-Origami-System developed by Ida et al. and formulated simple fold operations in 3D using CGA points and motions. Then, we proved some geometric theorems about origami properties by computing CGA equation formulas.

Abstract. We present a data structure to represent and manipulate large sets of (equal) terms (or expressions). Our initial and main motivation for this data structure is the simplification of expressions with respect to a formal theory, typically an equational one. However, it turns out that the data structure is also efficient for computing the congruence closure of a relation over a set of terms. We provide an abstract definition of the data structure, including a precise semantics, and we explain how to implement it efficiently. We prove the correctness of the proposed algorithms, with a complexity analysis and experimental results. We compare these algorithms with previous algorithms for computing the congruence closure, and we also sketch how we use the data structure to tackle the expression simplification problem.

Abstract.
Functional logic programming extends the functional programming style with two important features: the possibility to define non-deterministic operations with overlapping rules, and the use of logic variables in both defining rules and expressions to evaluate. Conditional constructor-based term rewrite systems (CB-CTRSs) emerged as a suitable model for functional logic programs, because they can be easily transformed into an equivalent program in a core language where computations can be performed more efficiently.
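The first of the two features just mentioned, non-deterministic operations defined by overlapping rules, can be imitated in a tiny sketch. This is an illustration only: it uses Python generators to stand in for non-deterministic choice and is not the CB-CTRS formalism itself.

```python
# Toy illustration of non-deterministic operations with overlapping rules,
# modeled here with Python generators (an assumption of this sketch,
# not the paper's formalism).

def coin():
    # Two overlapping rules for the same operation: coin -> 0, coin -> 1.
    yield 0
    yield 1

def double(values):
    # Applying an operation to a non-deterministic argument yields
    # one result per possible choice.
    for v in values:
        yield v + v

print(list(double(coin())))  # prints [0, 2]
```

Enumerating the generator corresponds to collecting all values a non-deterministic expression can rewrite to.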
We consider a recent proposal by Antoy and Hanus of translating CB-CTRSs into an equivalent class of programs in which computation can be performed by mere rewriting. Their computational model has the limitation of computing only ground answer substitutions for equations with strict semantics, interpreted as joinability to a value.
We propose two adjustments of their computational model, which are capable of computing non-ground answers.

Abstract. We have implemented a certified Wang tiling program for tiling a rectangular region using a brick corner Wang tile set. A brick corner Wang tile set is a special Wang tile set introduced by A. Derouet-Jourdan et al. in computer graphics in 2015 to model wall pattern textures. We have implemented a tiling algorithm using the Coq proof assistant and proved its correctness. This correctness assures the existence of a tiling with any brick corner Wang tile set for any size of rectangle. The essential points of our proof are the existence of a tiling for a $2 \times 2$ rectangle and a simple induction process. Since the brick corner Wang tiles form a class of infinitely many tile sets, the proof is not straightforward and involves many conditional branches. The certification with Coq assures that no conditions are missing.

Abstract. PLCA is a framework for qualitative spatial reasoning using symbolic objects and relationships between them. To show the construction of a PLCA expression, the second and third authors introduced inductive constructions. They also proved that expressions obtained by inductive constructions are planar (planarity) and that planar PLCA expressions can be obtained by inductive constructions (realizability). The former was proven with the Coq proof assistant, but the latter was proven with pen and paper.
We are currently proving the latter with Coq. We have located some oversights in the original inductive constructions and the proof. In this paper, we report these oversights, a reformalization of the inductive constructions, and the modified proofs. We prove planarity and the base case of realizability with Coq, and the induction step of realizability with Coq together with a pen-and-paper proof.

Abstract. In many areas, some geometry problems cannot be solved using geometry alone and are treated by means of algebraic tools. However, geometric properties can still be employed to simplify the system of equations. This allows either to speed up the treatment or, more radically, to make the treatment possible at all. In this article we illustrate this approach with a family of toy examples. In all these problems the goal is to determine whether there is a compass-and-straightedge construction of the three vertices of a triangle, knowing only three located points of this triangle. Algebraic tools, basically Galois theory, are needed to answer the question. But in many cases a geometric reasoning phase is required to provide a polynomial algebraic system that computer algebra software can handle within an acceptable time, despite the exponential complexity of the underlying algorithms.

Abstract. If you find yourself in a corridor of a standard maze, a sure and easy way to escape is to simply pick the left (or right) wall, and then follow it along its twists and turns and around the dead-ends until you eventually arrive at the exit. But what happens when you cannot tell left from right? What if you cannot tell North from South? What if you cannot judge distances, and have no idea what it means to follow a wall in a given direction?
The possibility of escape in these circumstances is suggested in the statement of an unproven theorem given in David Hilbert's celebrated Foundations of Geometry, in which he effectively claimed that a standard maze could be fully navigated using axioms and concepts based solely on the relations of points lying on lines in a specified order.
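The classical wall-following strategy can be sketched in a few lines on a grid maze. This is a toy illustration only; it relies on exactly the notions of "left" and distance that the ordered-geometry setting above forbids, and it is not the paper's algorithm.

```python
# Left-hand-rule maze escape on a grid (toy illustration; the paper's
# setting forbids the metric notions of "left" and "step" used here).

def follow_left_wall(maze, start, goal):
    """maze is a set of open cells (row, col); returns True on escape."""
    DIRS = [(0, 1), (-1, 0), (0, -1), (1, 0)]    # E, N, W, S
    pos, heading = start, 0
    for _ in range(4 * len(maze)):               # bound suffices for small mazes
        if pos == goal:
            return True
        # Prefer turning left, then straight, then right, then back.
        for turn in (1, 0, -1, 2):
            d = (heading + turn) % 4
            nxt = (pos[0] + DIRS[d][0], pos[1] + DIRS[d][1])
            if nxt in maze:
                pos, heading = nxt, d
                break
    return pos == goal

corridor = {(0, 0), (0, 1), (0, 2)}
print(follow_left_wall(corridor, (0, 0), (0, 2)))  # prints True
```

On a simply connected maze this strategy always reaches an exit on the outer wall; the challenge the abstract describes is recovering such a guarantee without any of the directional and metric vocabulary used here.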
We discuss our algorithm for this surprisingly challenging version of the maze navigation problem, and our HOL Light verification of its correctness from Hilbert's axioms.

Abstract. Nominal rewriting (Fernández, Gabbay & Mackie, 2004; Fernández & Gabbay, 2007) is a framework that extends first-order term rewriting by a binding mechanism based on the nominal approach (Gabbay & Pitts, 2002; Pitts, 2003). In this paper, we investigate confluence properties of nominal rewriting, following the study of orthogonal systems in (Suzuki et al., 2015), but here we treat systems in which overlaps of the rewrite rules exist. First we present an example where the choice of bound variables (atoms) of the rules affects the joinability of the induced critical pairs. Then we give a detailed proof of the critical pair lemma, and illustrate some of its applications, including confluence results for non-terminating systems.

Abstract. At ISSAC 2004, Schost and D. introduced a transformation of triangular lexicographic Gröbner bases generating a radical ideal of dimension zero, which significantly reduces the bit-size of the coefficients. The case where the triangular lexicographic Gröbner basis does not generate a radical ideal is far more complicated. This work treats the case of n=2 variables, and to some extent the case of n=3 variables. It resorts to an extra operation, squarefree factorization; nevertheless, this operation has a low complexity cost. But as soon as n>2, the lack of a simple and efficient gcd-like operation over non-reduced rings prevents us from undertaking meaningful algorithmic considerations. An implementation in Maple for the case n=2 confirms the expected reduction of the coefficient size.
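To give a flavor of the triangular lexicographic shape and the squarefree operation discussed in the last abstract, here is a small computer-algebra sketch. It uses SymPy instead of Maple purely as an accessible illustration, and the zero-dimensional ideal below is an invented toy example, not taken from the paper.

```python
# Toy illustration: a lexicographic Groebner basis of a zero-dimensional
# ideal comes out in triangular shape (the last polynomial involves only
# the last variable). SymPy is used here only for illustration.
from sympy import symbols, groebner, sqf_part

x, y = symbols('x y')

# A zero-dimensional ideal: x**2 = 1 and x = y**2.
G = groebner([x**2 - 1, y**2 - x], x, y, order='lex')
print(G.exprs)  # triangular: one polynomial involving x, then one in y alone

# Squarefree factorization is the extra operation the abstract mentions;
# sqf_part strips repeated factors from a polynomial.
sf = sqf_part((y - 1)**2 * (y + 1))
print(sf)  # the squarefree part of (y - 1)**2 * (y + 1)
```

Here the lex basis is {x - y**2, y**4 - 1}: eliminating x leaves a univariate polynomial in y, the triangular structure that the coefficient-size transformation above operates on.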

