
09:00 | Probabilistic Hyperproperties ABSTRACT. Four decades ago, Lamport used the notion of trace properties as a means to specify the correctness of individual executions of concurrent programs. This notion was later formalized and classified by Alpern and Schneider into safety and liveness properties. Temporal logics like LTL and CTL were built on these efforts to give formal syntax and semantics to requirements on trace properties. Subsequently, verification algorithms were developed to reason about individual executions of a system. However, it turns out that many interesting requirements are not trace properties. For example, important information-flow security policies (e.g. noninterference, observational determinism) or service level agreements (e.g. mean response time, percentage uptime) cannot be expressed as properties of individual execution traces of a system. Rather, they are properties of sets of execution traces, also known as hyperproperties. Temporal logics such as HyperLTL and HyperCTL∗ have been proposed to provide a unifying framework to express and reason about hyperproperties. This talk is devoted to a special class of hyperproperties: we ask what hyperproperties are in the context of systems with random behavior. We will discuss which probabilistic relations between independent executions of a system are relevant, how we can formally express them in a temporal logic, and how we can decide the truth of such statements. |
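The two-trace flavour of such policies can be illustrated with a minimal sketch: checking observational determinism over a finite set of recorded traces. The trace encoding and the function name are invented for illustration; real hyperproperty checking works on temporal-logic specifications, not finite trace sets.

```python
# Observational determinism is a hyperproperty: it relates PAIRS of traces,
# so no single trace can violate it on its own. Trace encoding is illustrative.

def observationally_deterministic(traces):
    """Holds if any two traces that agree on low-security inputs
    also agree on low-security outputs."""
    for t1 in traces:
        for t2 in traces:
            if t1["low_in"] == t2["low_in"] and t1["low_out"] != t2["low_out"]:
                return False  # this trace PAIR witnesses an information leak
    return True

# Two traces that may differ in high inputs but agree on low outputs: secure.
secure = [
    {"low_in": [0], "low_out": [1, 1]},
    {"low_in": [0], "low_out": [1, 1]},
]
# Same low inputs but different low outputs: high data influences observations.
leaky = [
    {"low_in": [0], "low_out": [1, 1]},
    {"low_in": [0], "low_out": [0, 1]},
]
```

Note that every individual trace in `leaky` is unremarkable on its own; only the set of traces reveals the violation, which is exactly why such policies escape classical trace-property frameworks.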

09:50 | PRESENTER: Ichiro Hasuo ABSTRACT. Solving stochastic games with the reachability objective is a fundamental problem, especially in quantitative verification and synthesis. For this purpose, bounded value iteration (BVI) attracts attention as an efficient iterative method. However, BVI's performance is often impeded by the costly end component (EC) computation that is needed to ensure convergence. Our contribution is a novel BVI algorithm that conducts, in addition to the local propagation by the Bellman update that is typical of BVI, global propagation of upper bounds that is not hindered by ECs. To conduct global propagation in a computationally tractable manner, we construct a weighted graph and solve the widest path problem in it. Our experiments show the algorithm's performance advantage over previous BVI algorithms that rely on EC computation. |
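The local Bellman propagation that the abstract contrasts with global propagation can be sketched on a toy example. The chain, state names, and convergence threshold below are illustrative assumptions; real BVI must additionally deal with end components, which this toy chain deliberately lacks.

```python
# Minimal sketch of bounded value iteration (BVI) for reachability on a
# small Markov chain: lower and upper bounds are both tightened by local
# Bellman updates until they meet.

P = {  # state -> list of (probability, successor)
    "s0": [(0.5, "s1"), (0.5, "s2")],
    "s1": [(1.0, "goal")],
    "s2": [(0.5, "goal"), (0.5, "sink")],
}
# Bounds on the probability of reaching "goal"; absorbing states are fixed.
lower = {"s0": 0.0, "s1": 0.0, "s2": 0.0, "goal": 1.0, "sink": 0.0}
upper = {"s0": 1.0, "s1": 1.0, "s2": 1.0, "goal": 1.0, "sink": 0.0}

def bellman(v, s):
    """One local Bellman update: average the bound over successors."""
    return sum(p * v[t] for p, t in P[s])

for _ in range(100):
    for s in P:
        lower[s] = bellman(lower, s)
        upper[s] = bellman(upper, s)
    if max(upper[s] - lower[s] for s in P) < 1e-6:
        break  # bounds have met: the reachability value is certified
```

In games with end components, the upper bounds computed this way can get stuck above the true value; the talk's widest-path construction is precisely a way to push upper bounds down globally without first collapsing ECs.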

10:10 | Safe Sampling-Based Planning for Stochastic Systems with Uncertain Dynamics PRESENTER: Thom Badings ABSTRACT. Autonomous systems that operate in safety-critical settings must be able to behave safely under stochastic disturbances and imprecise model parameters. Due to their uncertain and stochastic nature, such autonomous systems may be captured by probabilistic programs. A typical reach-avoid planning problem is to reach a desirable goal state while avoiding certain unsafe states. However, due to these disturbances and imprecise parameters, the outcome of an action by the controller of the autonomous system may be uncertain. Moreover, the set of safe states is typically non-convex, making it infeasible in general to find global solutions to reach-avoid problems. In this research, we employ abstraction-based methods to solve reach-avoid problems for continuous-state systems with partially unknown and/or stochastic dynamics. Given a partial model specification and a reach-avoid problem, we abstract the system into a discrete-state model that captures the uncertainty about the dynamics by probabilistic transitions between discrete states. We resort to sampling techniques to compute probably approximately correct (PAC) bounds on these transition probabilities, which we capture in a so-called interval Markov decision process (iMDP). We synthesize policies using a tailored version of the probabilistic model checker PRISM for iMDPs. We then convert the policy into a feedback controller for the continuous system, for which we give high-confidence guarantees on the probability that the reach-avoid problem is satisfied. With a diverse set of benchmarks, we show the robustness and practical applicability of our method for solving realistic reach-avoid problems. |
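The idea of PAC interval bounds on a transition probability can be illustrated with a simple Hoeffding-style confidence interval. The sampling model, constants, and function name are assumptions for illustration; the authors' actual construction may use tighter sample-based bounds than this.

```python
# Sketch: a PAC interval for one iMDP transition probability, estimated
# from n samples of where the noisy dynamics send the system.
import math
import random

def pac_interval(hits, n, delta):
    """With probability >= 1 - delta, the true transition probability lies
    in [p_hat - eps, p_hat + eps], where eps = sqrt(ln(2/delta) / (2n))
    (Hoeffding's inequality for Bernoulli samples)."""
    p_hat = hits / n
    eps = math.sqrt(math.log(2 / delta) / (2 * n))
    return max(0.0, p_hat - eps), min(1.0, p_hat + eps)

random.seed(0)
n = 10_000
# Simulated noisy dynamics: the successor lands in the target region w.p. 0.3.
hits = sum(random.random() < 0.3 for _ in range(n))
lo, hi = pac_interval(hits, n, delta=0.01)  # a [lo, hi] interval for the iMDP
```

Collecting one such interval per abstract transition yields exactly the interval-valued transition function an iMDP needs, and the per-transition confidence levels compose into the overall high-confidence guarantee.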

11:00 | Distribution Testing and Probabilistic Programming: A Match Made in Heaven ABSTRACT. The field of distribution testing seeks to test whether a given distribution satisfies a property of interest. While there have been deep technical developments in distribution testing, the field has largely been confined to theoretical studies. I will discuss our work on developing practical algorithmic frameworks based on distribution testing, and their relationship to probabilistic program verification. In particular, I will show that the problem of measuring the distance between straight-line probabilistic programs is #P-hard. As expected, this theoretical hardness shouldn't deter us, and I will discuss our success in using distribution-testing-based frameworks to design efficient samplers. |
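The distance notion at play can be made concrete with a naive plug-in estimate of total variation distance between two samplers over a small finite support. The sampler names and threshold are illustrative; real distribution testers achieve far better sample complexity than this brute-force comparison, which is part of what makes the field interesting.

```python
# Sketch: empirically comparing two samplers via the total variation (TV)
# distance between their empirical distributions.
import random
from collections import Counter

def empirical_tv(sampler_a, sampler_b, support, n=20_000):
    """Plug-in estimate of TV distance from n samples of each sampler."""
    ca = Counter(sampler_a() for _ in range(n))
    cb = Counter(sampler_b() for _ in range(n))
    return 0.5 * sum(abs(ca[x] - cb[x]) / n for x in support)

random.seed(1)
fair = lambda: random.randrange(6)  # a correct uniform-die sampler
# A buggy sampler that returns 0 half the time (true TV to uniform ~ 0.42).
biased = lambda: 0 if random.random() < 0.5 else random.randrange(6)

d_same = empirical_tv(fair, fair, range(6))    # should be near 0
d_diff = empirical_tv(fair, biased, range(6))  # should be clearly positive
```

For straight-line probabilistic programs, even computing such distances exactly is #P-hard, as the abstract states, which motivates testing-style frameworks that certify samplers from samples alone.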

11:50 | A Quantitative Verification Infrastructure ABSTRACT. Following the success of deductive verification infrastructures for non-probabilistic programs like Viper and Boogie, we talk about our proposal for a deductive verification infrastructure for probabilistic programs. Our quantitative intermediate verification language enables modular reasoning about lower and upper bounds on expected values of programs. In our framework, we can encode and verify different proof rules for weakest pre-expectation semantics. As a proof of concept, we have created a deductive verifier for probabilistic programs on top of our framework. It supports reasoning about lower bounds on weakest liberal pre-expectations and about upper bounds on weakest pre-expectations. For loops, we use encodings of the Park induction and k-induction proof rules. |
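For readers unfamiliar with the rule named at the end, the standard statement of Park induction in the weakest pre-expectation calculus is sketched below; the notation follows the usual wp literature and is not necessarily the tool's concrete syntax.

```latex
% wp of a loop as a least fixed point, and Park induction as an upper-bound rule.
\Phi_f(X) \;=\; [G] \cdot \mathrm{wp}\llbracket \mathit{body} \rrbracket(X)
              \;+\; [\neg G] \cdot f,
\qquad
\mathrm{wp}\llbracket \texttt{while}\,(G)\,\{\mathit{body}\} \rrbracket(f)
  \;=\; \mathsf{lfp}\,\Phi_f .

% Park induction: any superinvariant I bounds the loop's wp from above.
\Phi_f(I) \;\sqsubseteq\; I
\quad\Longrightarrow\quad
\mathsf{lfp}\,\Phi_f \;\sqsubseteq\; I .
```

Dually, since weakest liberal pre-expectations are greatest fixed points, subinvariants yield the lower bounds the verifier supports for wlp.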

12:10 | Inferring Expected Runtimes and Sizes PRESENTER: Eleanore Meyer ABSTRACT. We present a modular approach to automatically infer upper bounds on the expected runtimes of probabilistic integer programs. To this end, our approach computes bounds on the runtimes of program parts and on the sizes of their variables in an alternating way. To evaluate its power, we implemented our approach in a new version of our open-source tool KoAT. |
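The kind of bound such an analysis certifies can be checked empirically on a tiny probabilistic integer program. The program and the bound below are illustrative, not KoAT's input syntax or output.

```python
# Toy probabilistic integer program (pGCL-style):
#   while x > 0: { x := x - 1 } [1/2] { x := x }
# Each iteration decrements x with probability 1/2, so each unit of x costs
# 2 loop iterations in expectation: expected runtime = 2 * x0.
import random

def run(x0, rng):
    """Execute the program once, counting loop iterations."""
    steps, x = 0, x0
    while x > 0:
        steps += 1
        if rng.random() < 0.5:
            x -= 1
    return steps

rng = random.Random(42)
x0 = 5
trials = 20_000
mean = sum(run(x0, rng) for _ in range(trials)) / trials
bound = 2 * x0  # the (here exact) expected-runtime bound in the initial value
```

The alternation the abstract mentions shows up even here: bounding the loop's runtime requires knowing how large `x` can be when the loop is reached, and bounding variable sizes in later program parts requires runtime bounds for earlier ones.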

Lunches will be held in Taub hall.

14:00 | Quantifying Leakage in Privacy Risk Analysis using Probabilistic Programming |

14:50 | Deterministic stream-sampling for probabilistic programming: semantics and verification PRESENTER: William Smith ABSTRACT. Probabilistic programming languages rely fundamentally on some notion of sampling, and this is doubly true for probabilistic programming languages which perform Bayesian inference using Monte Carlo techniques. Verifying samplers, that is, proving that they generate samples from the correct distribution, is crucial to the use of probabilistic programming languages for statistical modelling and inference. However, the typical denotational semantics of probabilistic programs is incompatible with deterministic notions of sampling, which is problematic considering that most statistical inference is performed using pseudorandom number generators. We present an operational and denotational semantics for probabilistic programming, in terms of operations on samplers, that allows for deterministic generation of random variables, and which distinguishes between samplers (programs) and the probability distributions which they target. We also introduce a sound calculus for demonstrating that samplers target a desired distribution, justifying common approaches to constructing samplers, which can be applied to produce samplers that are correct by construction. |
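The distinction between a sampler (a program) and the distribution it targets can be illustrated with a minimal sketch: a deterministic pseudorandom stream driving an inverse-transform sampler. The stream and sampler interfaces here are invented for illustration and are not the paper's calculus.

```python
# A sampler as a deterministic function of a pseudorandom stream: the same
# seed reproduces the exact same draws, yet the sampler still "targets"
# a probability distribution (here, Bernoulli(p)).
import random

def uniform_stream(seed):
    """Deterministic stream of uniform [0, 1) values."""
    rng = random.Random(seed)
    while True:
        yield rng.random()

def bernoulli_sampler(p, stream):
    """Inverse-transform sampler targeting Bernoulli(p)."""
    return int(next(stream) < p)

stream = uniform_stream(seed=7)
draws = [bernoulli_sampler(0.25, stream) for _ in range(40_000)]
freq = sum(draws) / len(draws)  # should approach the target mass p = 0.25

# Determinism: re-running with the same seed reproduces the sample sequence.
stream2 = uniform_stream(seed=7)
draws2 = [bernoulli_sampler(0.25, stream2) for _ in range(40_000)]
```

A semantics that identifies the sampler with the distribution it denotes cannot express the fact that `draws == draws2`, which is exactly the gap the talk's sampler-level semantics addresses.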

15:10 | Probabilistic Guarded Kleene Algebra with Tests PRESENTER: Wojciech Rozowski ABSTRACT. We extend Guarded Kleene Algebra with Tests (GKAT) with probabilistic branching and loops to reason about imperative programs with probabilistic control flow. We introduce a coalgebraic operational semantics and a sound Salomaa-style axiomatisation of bisimulation equivalence. As an example, we show the correctness of a variant of the Knuth-Yao algorithm that simulates a three-sided die with a series of fair coin tosses. |
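The example program at the end of the abstract can be sketched directly: for the uniform distribution on three outcomes, the Knuth-Yao-style construction amounts to tossing two fair coins and restarting on one of the four results. Details of the construction here are illustrative, not the paper's formalisation.

```python
# Simulating a fair three-sided die with fair coin tosses: two tosses give
# four equally likely outcomes; one of them (probability 1/4) restarts the
# loop, leaving the other three equally likely.
import random

def three_sided_die(rng):
    while True:
        b0 = rng.random() < 0.5  # first fair coin toss
        b1 = rng.random() < 0.5  # second fair coin toss
        outcome = 2 * b0 + b1    # uniform on {0, 1, 2, 3}
        if outcome < 3:
            return outcome       # uniform on {0, 1, 2}
        # outcome == 3: reject and toss again

rng = random.Random(3)
counts = [0, 0, 0]
for _ in range(30_000):
    counts[three_sided_die(rng)] += 1  # each face should occur ~1/3 of runs
```

The probabilistic loop with a guarded restart is precisely the shape of program that GKAT extended with probabilistic branching is designed to reason about equationally.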