
10:30-11:00 Session 1: Opening and Homage
Location: room Arrabida
The Legacy of Stefania Gnesi: From Software Engineering to Formal Methods and Tools, and Back
From Dynamic Programming to Programming Science: Some Recollections in Honour of Stefania Gnesi

ABSTRACT. Stefania Gnesi graduated from Pisa with an M.Sc. thesis on the correspondence between operational and denotational semantics for dynamic programming. She then moved to logics for proving properties of concurrent programs and to the associated tools. She succeeded, as few others have, in combining the theoretical, methodological and experimental aspects of programming into a programming science.

11:00-11:15 Session 2: Software Engineering
Location: room Arrabida
Multi-modelling and Co-simulation in the Engineering of Cyber-Physical Systems: Towards the Digital Twin
PRESENTER: John Fitzgerald

ABSTRACT. Ensuring the dependability of Cyber-Physical Systems (CPSs) poses challenges for model-based engineering, stemming from the semantic heterogeneity of the models of computational, physical and human processes, and from the range of stakeholders involved. We argue that delivering such dependability requires a marriage of multi-disciplinary models developed during design with models derived from real operational data. Assets developed during design thus become the basis of a learning digital twin, able to support decision making both in redesign and in responsive operation. Starting from an open integrated toolchain leveraging formal models for CPS design, we consider the extension of this concept towards digital twins. A small example inspired by agricultural robotics illustrates some of the opportunities for research and innovation in delivering digital twins that contribute to dependability.

11:15-12:30 Session 3: Formal Methods and Tools
Location: room Arrabida
Innovating Medical Image Analysis via Spatial Logics
PRESENTER: Mieke Massink

ABSTRACT. Current computer-assisted medical imaging for the planning of radiotherapy requires high-level mathematical and computational skills. These are often paired with the case-by-case integration of highly specialised technologies. The lack of modularity at the right level of abstraction in this field hinders research, collaboration and the transfer of expertise among medical physicists, engineers and technicians. In recent work, we have proposed the introduction of spatial logics and spatial model checking in medical imaging, and demonstrated the technology in a case study devoted to the contouring of brain tumours. Our longer-term aim is to provide an open platform introducing declarative medical image analysis. This will provide domain experts with a convenient and very concise way to specify contouring and segmentation operations, grounded on the solid mathematical foundations of Topological Spatial Logics. In this work, we show preliminary spatial model checking results for the automatic identification of specific brain tissues in a healthy brain, and discuss a selection of challenges for spatial model checking in medical imaging.
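As a rough, self-contained illustration of the kind of spatial primitive such logics build on (a toy sketch under our own assumptions, not the authors' tooling), the "near" modality of topological spatial logics can be read on a pixel grid as a one-step dilation:

```python
def near(image):
    """One-step dilation on a boolean pixel grid: a pixel satisfies
    near(phi) if it satisfies phi itself or has a 4-adjacent neighbour
    that does. This mirrors the closure operator that topological
    spatial logics interpret on images."""
    rows, cols = len(image), len(image[0])
    out = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and image[nr][nc]:
                    out[r][c] = True
    return out

# A single "tumour" pixel in the centre; near() grows it by one step.
phi = [[False, False, False],
       [False, True,  False],
       [False, False, False]]
print(near(phi))
```

A real spatial model checker combines operators like this (reachability, surroundedness, distance) into full formulas evaluated over every voxel of a scan.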

Formal Methods in Designing Critical Cyber-Physical Systems
PRESENTER: Matteo Rossi

ABSTRACT. Cyber-physical Systems (CPS) are increasingly applied in critical contexts, where they have to support safe and secure operations, often subject to stringent timing requirements. Typical examples are scenarios involving automated living or working spaces in which humans operate, or human-robot collaborations (HRC) in modern manufacturing. Formal methods have been traditionally investigated to support modeling and verification of critical systems. In this paper, we review some of the main new challenges arising in the application of formal methods to modeling and verification of CPS. We do that by presenting two case studies (emergency response in a smart city and a smart manufacturing system), reflecting past work of the authors, from which some general lessons are distilled.

Automata-Based Behavioural Contracts with Action Correlation
PRESENTER: Davide Basile

ABSTRACT. The rigorous design of Service-Oriented Computing (SOC) applications has been identified as one of the primary research challenges for the next 10 years. Many foundational theories for SOC have since been defined, but they often rely on mechanisms different from those of real-world SOC technologies, hindering actual service modelling and verification. In this paper, we propose a novel automata-based formalism of service contracts equipped with a mechanism, inspired by current web service technologies, that exploits correlation data to drive service interactions, and with formal foundations enabling reasoning about service correctness.

Logical Support for Bike-Sharing System Design

ABSTRACT. Automated bicycle-sharing systems are a prominent example of reconfigurable cyber-physical systems for which the locality (i.e., particular position and movement) and connectivity of their elements (i.e., physical or virtual interaction and information flow) are central to the way in which they operate. These features motivate us to study bicycle systems from the perspective of actor networks – a newly developed framework that we have proposed for dealing with cyber-physical-system protocols in which the so-called networks or systems comprise elements that are no longer limited to programs, but can also include humans and physical artefacts as actors. In this paper, we present an actor-network-based approach to modelling bike-sharing systems (BSS) and to reasoning about information-flow properties in the context of BSS. As in our previous work on actor networks, logical support is obtained by means of a two-stage hybridisation process that spans three logical levels: a base level, which in this case is propositional and captures the attributes that the actors may have; a second level that captures the locality and connectivity of actors in a given configuration of the network; and a third level that corresponds to the possible interactions between actors, and captures the dynamics of the system in terms of network reconfigurations. To illustrate the information-flow properties that can be checked using this framework, we provide an actor-network specification of a BSS, and use a recently developed tool for hybridised logics to highlight an information-flow vulnerability of the system, which we then correct.

A Generic Dynamic Logic with Applications to Interaction-Based Systems
PRESENTER: Martin Wirsing

ABSTRACT. We propose a generic dynamic logic with the usual diamond and box modalities over structured actions. Instead of using regular expressions of actions, our logic is parameterised by the form of the actions, which can be given by an arbitrary language for complex, structured actions. In particular, our logic can be instantiated by languages that describe complex interactions between system components. We study two instantiations of our logic for specifying global behaviours of interaction-based systems: one on the basis of global session types and the other one using UML sequence diagrams. Moreover, we show that our proposed generic logic, and hence all its instantiations, satisfies bisimulation invariance and a Hennessy-Milner theorem.

14:00-14:30 Session 4: Requirements Engineering / Natural Language Processing
Location: room Arrabida
Ambiguity in Requirements Engineering: Towards a Unifying Framework
PRESENTER: Paola Spoletini

ABSTRACT. Requirements engineering (RE) is one of the most human-centric and communication-intensive disciplines within the software engineering field. As such, activities in RE, such as requirements elicitation, modelling and documentation, make extensive use of the most human of the communication codes, which is natural language (NL). NL is flexible, in that it can be used and understood by different stakeholders, but it also suffers from ambiguity. Several works have studied the problem of ambiguity in RE. However, most of them focused either on written NL, as in requirements documents, or on oral NL, as in requirements elicitation interviews. In this paper, we review the main contributions in the field, and we provide the basis for a unifying perspective on ambiguity in RE.

Looking Inside the Black Box: Core Semantics Towards Accountability of Artificial Intelligence

ABSTRACT. Recent advances in artificial intelligence raise a number of concerns. Among the challenges to be addressed by researchers, accountability of artificial intelligence solutions is one of the most critical. This paper focuses on artificial intelligence applications using natural language to investigate whether the core semantics defined for a large-scale natural language processing system could assist in addressing accountability issues. Core semantics aims to obtain a full interpretation of the content of natural language texts, representing both implicit and explicit knowledge, using only the 'subj-action-(obj)' structure and causal, temporal, spatial and personal links. The first part of the paper offers a summary of the difficulties to be addressed and of the reasons why representing the meaning of a natural language text is relevant for artificial intelligence accountability. In the second part, as a proof of concept for the application of such a knowledge representation to support accountability, a detailed example of the analysis obtained with a prototype system is illustrated. While only preliminary, the results give some new insights and indicate that the provided knowledge representation can be used to support accountability, looking inside the box.

14:30-15:00 Session 5: Software Product Lines
Location: room Arrabida
A Decade of Featured Transition Systems

ABSTRACT. Variability-intensive systems (VIS) form a large and heterogeneous class of systems whose behaviour can be modified by enabling or disabling predefined features. Variability mechanisms allow the adaptation of software to the needs of its users and the environment. However, VIS verification and validation (V&V) is challenging: the combinatorial explosion of the number of possible behaviours and undesired feature interactions are amongst such challenges. To tackle them, Featured Transition Systems (FTS) were proposed a decade ago to model and verify the behaviours of VIS. In an FTS, each transition is annotated with a combination of features determining which variants can execute it. An FTS can thus model all possible behaviours of a given VIS. This compact model enabled us to create efficient V&V algorithms that take advantage of the behaviours shared amongst features, reducing the V&V effort by several orders of magnitude. In this chapter, we cover the formalism, its applications and the role it can play in supporting the quality assurance of the next generation of VIS powered by artificial intelligence techniques.
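The core idea, transitions guarded by feature expressions with each variant's behaviour obtained by projection, can be sketched in a few lines (our own toy coffee-machine example, not the authors' tooling; the feature names and guards are invented for illustration):

```python
# An FTS transition: (source, action, target, feature guard). The guard
# is a predicate over the selected feature set; a variant may take the
# transition only if its features satisfy the guard.
fts = [
    ("idle",     "brew",  "brewing",  lambda f: True),          # base behaviour
    ("idle",     "froth", "frothing", lambda f: "Milk" in f),   # Milk variants only
    ("brewing",  "done",  "idle",     lambda f: True),
    ("frothing", "done",  "idle",     lambda f: "Milk" in f),
]

def project(fts, features):
    """Derive the plain transition system of one variant by keeping
    only the transitions whose guard the feature selection satisfies."""
    return [(s, a, t) for (s, a, t, guard) in fts if guard(features)]

basic = project(fts, set())        # 2 transitions: brew/done only
deluxe = project(fts, {"Milk"})    # 4 transitions: frothing enabled
print(len(basic), len(deluxe))
```

Family-based V&V algorithms avoid ever enumerating these projections, analysing the annotated model once for all variants.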

Towards Model Checking Product Lines in the Digital Humanities: An Application to Historical Data
PRESENTER: Tiziana Margaria

ABSTRACT. Rapid developments in computing techniques and database systems have aided the digitization of, and access to, various historical (big) data, with significant challenges of analysis and interoperability. The Death and Burial Data, Ireland project aims to build a Big Data interoperability framework loosely based on the Knowledge Discovery in Databases (KDD) process to integrate Civil Registration of Death data with other data types collated in Ireland from 1864 to 1922. For our project, we resort to a DTD product line to represent and manage various representations and enrichments of the data. Well-formed documents serve as contracts between a provider (of the data set) and its customers (the researchers that consult them). We adopt Context-Free Modal Transition Systems as a formalism to specify product lines of Document Type Definitions. The goal is then to proceed to product line verification using the M3C context-free model checking techniques to ensure that they are fit for purpose, and later to implement and manage the corresponding family of data models and processes in DIME, leveraging its flexible DyWA data management layer to define and efficiently manage the interoperable historical data framework for future use. The resulting hierarchical product line verification will allow our hDB platform to act as a high-quality service provider for digital humanities researchers, providing them with a wide range of tailored applications implementing the KDD process, whose essential business rules are easily checked by a standard DTD-checker.

15:30-16:45 Session 6: Formal Verification
Location: room Arrabida
A Systematic Approach to Programming and Verifying Attribute-Based Communication Systems
PRESENTER: Rocco De Nicola

ABSTRACT. A methodology is presented for the systematic development of systems of many components that interact by relying on predicates over attributes that they themselves mutually expose. The starting point is AbC (Attribute-based Communication), a novel process calculus introduced for modelling collective-adaptive systems. It is shown how to refine the model by introducing a translator from AbC into UML-like state machines that can be analyzed by UMC. In order to execute the specification, another translator is introduced that maps AbC terms into ABEL, a domain-specific framework that offers faithful AbC-style programming constructs built on top of Erlang. It is also shown how the proposed methodology can be used to assess relevant properties of systems and to automatically obtain an executable program for a non-trivial case study.

On the Prediction of Smart Contracts’ Behaviours
PRESENTER: Cosimo Laneve

ABSTRACT. Smart contracts are pieces of software stored on the blockchain that control the transfer of digital currencies between parties under certain conditions. In this paper we analyze the behaviour of smart contracts and their interaction with external agents in order to predict their optimal behaviour. We define a core language of programs with a minimal set of smart contract primitives, formalized as a process calculus, and we describe the whole system as a parallel composition of smart contracts and human actors. We also study the system behaviour and compute the maximum profit for each actor by solving Presburger constraints.

Hunting Superfluous Locks with Model Checking
PRESENTER: Radu Mateescu

ABSTRACT. Parallelization of existing sequential programs to increase their performance and exploit recent multi- and many-core architectures is a challenging but inevitable effort. One increasingly popular parallelization approach is based on OpenMP, which enables the designer to annotate a sequential program with constructs specifying the parallel execution of code blocks. These constructs are then interpreted by the OpenMP runtime, which assigns blocks to threads running on a parallel architecture. Although this scheme is very flexible and not (very) intrusive, it presents the risk of introducing synchronization errors, such as data races on shared variables. In this paper, we propose an iterative methodology to assist OpenMP parallelization by using formal methods and verification. In each iteration, potential data races are identified by applying a lockset analysis to the OpenMP program, which computes the set of shared variables that are potentially not protected by locks. To avoid the insertion of superfluous locks, an abstract, action-based formal model of the OpenMP program is extracted and analyzed using the ACTL on-the-fly model checker of the CADP formal verification toolbox. We describe the methodology, compare it with existing work, and illustrate its practical use.
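The lockset idea can be sketched as the classic intersection over the locks held at each access (our own Eraser-style simplification, not the paper's OpenMP-specific analysis; the trace below is invented):

```python
def lockset_analysis(accesses):
    """accesses: iterable of (variable, frozenset of locks held at the
    access). A variable's candidate lockset is the intersection of the
    locks held at every access to it; an empty intersection flags a
    potential data race (no single lock consistently protects it)."""
    candidates = {}
    for var, held in accesses:
        candidates[var] = candidates.get(var, held) & held
    return {var: ls for var, ls in candidates.items() if not ls}

# 'x' is always protected by L1; 'y' is accessed under two different
# locks, so no common lock protects it -> reported as a potential race.
trace = [("x", frozenset({"L1"})), ("x", frozenset({"L1", "L2"})),
         ("y", frozenset({"L1"})), ("y", frozenset({"L2"}))]
print(lockset_analysis(trace))  # {'y': frozenset()}
```

Because this over-approximates (a reported variable may still be race-free for other reasons), the paper's methodology follows it with model checking to avoid inserting superfluous locks.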

Formal Verification of Railway Timetables - Using the UPPAAL Model Checker

ABSTRACT. This paper considers the challenge of validating railway timetables and investigates how formal methods can be used for that. The paper presents a reconfigurable, formal model which can be configured with a timetable for a railway network, properties of that network, and various timetabling parameters (such as station and line capacities, headways, and minimum running times) constraining the allowed train behaviour. The formal model describes the system behaviour of trains driving according to the given railway timetable. Model checking can then be used to check that driving according to the timetable does not lead to illegal system states. The method has successfully been applied to a real-world case study: a timetable for 12 trains at Nærumbanen in Denmark.

An Axiomatization of Strong Distribution Bisimulation for a Language with a Parallel Operator and Probabilistic Choice
PRESENTER: Erik P. de Vink

ABSTRACT. In the setting of a simple process language featuring non-deterministic choice and a parallel operator on the one hand and probabilistic choice on the other hand, we propose an axiomatization capturing strong distribution bisimulation. Contrary to other process equivalences for probabilistic languages, in this paper distributions rather than states are the leading ingredients for building the semantics and accompanying equational theory, for which we establish soundness and completeness.

17:15-17:45 Session 7: Applications
Location: room Arrabida
Modelling of Railway Signalling System Requirements by Controlled Natural Languages: A Case Study

ABSTRACT. For a long time, the railway sector has been a source of inspiration for generations of researchers challenged to develop models and tools to analyze safety and reliability. Threats came mainly from within, due to occasional faults in hardware components. With the advent of smart trains, railway safety is instead venturing into cybersecurity, and the railway sector is more and more compelled to protect assets from threats against components that rely on information & communication technology. Therefore, instruments developed for security engineering can be helpful to railway safety. We explore the use of one of them, the Controlled Natural Language for Data Sharing Agreement (CNL4DSA), which we use to express unambiguously a few exemplifying signal management system requirements. Since CNL4DSA enables the automatic generation of enforceable access control policies, our exercise can be seen as a first step towards an implementation of the security-by-design principle in railway signalling management system engineering.

How Formal Methods Can Contribute to 5G Networks
PRESENTER: Laura Panizo

ABSTRACT. Communication networks have been one of the main drivers of formal methods since the 70's. The dominant role of software in the new 5G mobile communication networks will again bring a relevant application area for formal models and techniques like model checking, model-based testing or runtime verification. This chapter introduces some of these novel application areas, specifically for Software Defined Networks (SDN) and Network Function Virtualization (NFV). Our proposals focus on automated methods to create formal models that satisfy a given set of requirements for SDN and NFV.

17:45-18:15 Session 8: Homages and Closing
Location: room Arrabida
A collection of video homages
The last word is for the honoree...