Tags: geometry of interaction, semantics of programming languages, TensorFlow
Abstract:
The popular library TensorFlow (TF) has familiarised the mainstream machine-learning community with programming-language concepts such as dataflow computing and automatic differentiation. It has also introduced some genuinely new syntactic and semantic programming concepts. In this paper we study one such new concept: a subtle but important feature of TensorFlow, the ability to extract and manipulate the state of a computation graph. This feature allows the convenient specification of parameterised models, freeing the programmer from much of the bureaucratic burden of managing them while still permitting the use of generic, model-independent search and optimisation algorithms. We study this new language feature, which we call 'graph abstraction', in the context of the call-by-value lambda calculus, using the recently developed Dynamic Geometry of Interaction formalism. We give a simple type system guaranteeing the safety of graph abstraction, and we also show the safety of critical language properties such as garbage collection and the beta law. The semantic model further suggests that the feature could be implemented reasonably efficiently in a general-purpose functional language.
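For illustration, a minimal TensorFlow sketch (TF2 eager style, with made-up names and toy data, not the paper's formalism) of the kind of workflow the abstract describes: the model's parameters are extracted as a plain list of variables and handed to a generic optimiser that knows nothing about the model's structure.

```python
import tensorflow as tf

# A parameterised model: its state lives in the graph as variables.
w = tf.Variable(1.0, name="w")
b = tf.Variable(0.0, name="b")

def model(x):
    return w * x + b

# Toy data for a least-squares fit of y = 2x + 1.
xs = tf.constant([0.0, 1.0, 2.0, 3.0])
ys = tf.constant([1.0, 3.0, 5.0, 7.0])

# The parameters are "abstracted" out of the model as a plain list,
# so a generic, model-independent optimiser can update them.
params = [w, b]
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(model(xs) - ys))
    grads = tape.gradient(loss, params)
    optimizer.apply_gradients(zip(grads, params))

print(w.numpy(), b.numpy())  # approximately 2.0 and 1.0
```

The optimisation loop never inspects the model itself; it only consumes the extracted parameter list and the gradients computed against it, which is the separation of concerns the paper formalises as graph abstraction.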