PCC 2016 Talks with Abstracts (PDF versions of 13 of them are accessible through the links on the titles)

See also the extra page for the invited speakers of PCC.

Lars Kristiansen. On Subrecursive Representability of Irrational Numbers
Abstract: We consider various ways to represent irrational numbers by subrecursive functions:
via Cauchy sequences, Dedekind cuts, trace functions, several variants of sum approximations and continued fractions. Let S be a class of subrecursive functions.
The set of irrational numbers that can be obtained with functions from S depends on the representation. We compare the sets obtained by the different representations.
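To make two of these representations concrete, here is a minimal Haskell sketch (our illustration, not from the talk; the function names and the choice of the square root of 2 are ours). The Cauchy representation maps n to a rational within 2^-n of the number; the Dedekind-cut representation decides membership in the lower cut.

import Data.Ratio ((%))

-- Cauchy-sequence representation: cauchySqrt2 n is a rational
-- within 2^-n of sqrt 2, obtained by bisecting [1,2] n times.
cauchySqrt2 :: Integer -> Rational
cauchySqrt2 n = go 1 2 n
  where
    go lo _  0 = lo
    go lo hi k =
      let mid = (lo + hi) / 2
      in if mid * mid <= 2 then go mid hi (k - 1) else go lo mid (k - 1)

-- Dedekind-cut representation: the set of rationals below sqrt 2,
-- given as a decision function.
cutSqrt2 :: Rational -> Bool
cutSqrt2 q = q < 0 || q * q < 2

main :: IO ()
main = do
  print (cauchySqrt2 20)            -- a rational within 2^-20 of sqrt 2
  print (map cutSqrt2 [1, 3 % 2])   -- [True,False]

Both functions are subrecursive in character; the point of the comparison in the talk is that, for a fixed class S, which irrationals are representable depends on which of these formats is used.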
Alessio Guglielmi. Report on a 3-Year Project on Efficient and Natural Proof Systems
Abstract: I propose to present generalised normalisation methods and to use example proof systems to illustrate the main points.
Andrea Aler Tubella and Alessio Guglielmi. Generalising Cut-Elimination through Subatomic Proof Systems
Abstract: In work presented at PCC 2015, we showed how, by considering atoms as self-dual, non-commutative, binary logical relations, we can reduce disparate rules such as introduction, contraction or cut to instances of a single linear rule scheme. We show how we exploit this regularity to study cut-elimination. We are able to generalise a cut-elimination procedure to a wide range of systems, and in the process gain valuable insight into why cut-elimination is such a prevalent phenomenon.
Ernst Zimmermann. Local confluence in Natural Deduction
Abstract: The talk aims to show that local confluence is
a key concept for the understanding of Natural Deduction.
The considerations are restricted to intuitionistic implication,
with hints to richer languages and substructural logics.
In Natural Deduction we are in a fine position concerning
local confluence: the reductions of many usual logical connectives are
locally confluent. Furthermore, certain subclasses of reductions
with nice properties can be defined, especially
subclasses which can be shown to terminate
and to preserve local confluence.
By Newman's lemma, this yields confluence of such subclasses of reductions.
For confluence of full reductions a commutation property is stated,
showing how a reduction commutes with its own subreduction.
Termination of full reductions can then be shown
by using a subreduction cover.
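For reference, Newman's lemma in standard rewriting notation:

\[
\text{If } \rightarrow \text{ is terminating and locally confluent }
(\,t_1 \leftarrow t \rightarrow t_2 \;\Rightarrow\; \exists u.\ t_1 \rightarrow^{*} u \text{ and } t_2 \rightarrow^{*} u\,),
\text{ then } \rightarrow \text{ is confluent.}
\]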
Lorenz Berger. Modelling the human lung
Abstract: Creating accurate computational models of the human body is difficult due to the many challenges of modelling, validation, and correct numerical implementation. We present a computational lung model that tightly couples a poroelastic model of lung tissue to an airway fluid network in order to model breathing. The poroelastic model approximates the porous structure of lung tissue by a continuum model. A naive discretization of the resulting model is numerically unstable; a stabilized finite element method is presented that discretizes the poroelastic equations and overcomes these instabilities. To demonstrate the robust coupling between the poroelastic medium and the airway fluid network, numerical simulations on a realistic lung geometry are presented. Remaining challenges in the validation and verification of the model are also discussed.
Sam Sanders. The unreasonable effectiveness of Nonstandard Analysis
Abstract: As suggested by the title, the topic of my talk is the vast computational content of classical Nonstandard Analysis. In particular, I will present a template CI which allows one to obtain effective theorems from theorems in ‘pure’ Nonstandard Analysis, i.e. only involving the nonstandard definitions (of continuity, compactness, Riemann integration, convergence, et cetera). This template CI has been applied successfully to the Big Five systems of Reverse Mathematics, the Reverse Mathematics zoo, and computability theory. The template CI often produces theorems of Bishop’s Constructive Analysis.


The framework for the template CI is Nelson's syntactic approach to Nonstandard Analysis, called internal set theory, and its fragments based on Gödel's T as introduced in [1]. Notable results are that, applying the template CI to theorems involving the nonstandard definitions of respectively continuity, compactness, and open set, the latter definitions are converted into the associated definitions from constructive or computable analysis (resp. continuity with a modulus, total boundedness, and effectively open set).

Finally, we establish that a theorem of Nonstandard Analysis has the same computational content as its ‘highly constructive’ Herbrandisation. Thus, we establish an ‘algorithmic two-way street’ between so-called hard and soft analysis, i.e. between the worlds of numerical and qualitative results. However, the study of the Herbrandisations of nonstandard theorems also leads to a new class of functionals (not involving Nonstandard Analysis) with rather strange properties. Chief among these new functionals is the special fan functional, which can be computed easily in intuitionistic mathematics, but cannot be computed by the Turing jump functional (²E) or even much stronger comprehension axioms. Similar functionals exist for most theorems from the Reverse Mathematics zoo.
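As a schematic illustration (our example; the abstract does not spell it out), the nonstandard definition of uniform continuity and the effective counterpart produced by the template read:

\[
\forall x, y\,(x \approx y \rightarrow f(x) \approx f(y))
\quad\leadsto\quad
\forall k\,\forall x, y\,\big(|x - y| < 2^{-\omega(k)} \rightarrow |f(x) - f(y)| < 2^{-k}\big),
\]

where the modulus of continuity \(\omega\) is extracted from the nonstandard proof.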
Rene Gazzari. Pure Proofs
Abstract: We intend to characterize, from a formal point of view, philosophical notions such as that of a "pure proof", a "simple proof" or an "elementary proof". After developing a good formal definition of these informal notions, we discuss the theorem that every proof may be transformed into a pure proof. We provide a partial solution and illustrate the difficulty of completing the proof.
Thomas Piecha. Atomic Systems in Proof-Theoretic Semantics
Abstract: In proof-theoretic semantics, the validity of atomic formulas, or atoms, is usually defined in terms of their derivability in atomic systems. Such systems can be sets of atoms, figuring as atomic axioms, or sets of atomic rules such as production rules. One can also allow for atomic rules that can discharge atomic assumptions, or even consider higher-level atomic rules that can discharge assumed atomic rules. In proof-theoretic semantics for minimal and intuitionistic logic, atomic systems are used as the base case in an inductive definition of validity.

We compare two approaches to atomic systems. The first approach is compatible with an interpretation of atomic systems as representations of states of knowledge. We present negative as well as positive completeness results for notions of validity based on this approach. The second approach takes atomic systems to be definitions of atomic formulas. The two views lead to different notions of derivability for atomic formulas, and consequently to different notions of proof-theoretic validity. In the first approach, validity is stable in the sense that for atomic formulas logical consequence and derivability coincide for any given atomic system. In the second approach this is not the case. This indicates that atomic systems as definitions, which determine the meaning of atomic sentences, might not be the proper basis for proof-theoretic validity, or conversely, that standard notions of proof-theoretic validity are not appropriate for definitional rule systems.
Thomas Powell. The computational content of Zorn's lemma
Abstract: Zorn's lemma is a well-known formulation of the axiom of choice, stating that any chain-complete partially ordered set has a maximal element. Certain theorems in mathematics can be given a particularly elegant proof using Zorn's lemma - a well-known example is the theorem that any ring with unity has a maximal ideal. In this talk I will focus on giving a computational interpretation to Zorn's lemma. More precisely, I will describe a new form of recursion which realizes the functional interpretation of certain restricted instances of Zorn's lemma.

There are two main motivating factors behind this work. The task of making constructive sense of Zorn's lemma is an interesting and challenging proof-theoretic problem in its own right. My emphasis here is on providing a natural realizer for the functional interpretation of the lemma which clearly reflects its computational content. This alone is a non-trivial task, as even in the weak cases of Zorn's lemma considered here such a realizer will necessarily be based on an extremely strong form of recursion, undefinable even in Goedel's system T. The second factor is that a direct computational interpretation of Zorn's lemma should enable us to extract intuitive programs from non-constructive proofs which rely on it. This in particular paves the way for a proof-theoretic analysis of several important theorems in abstract algebra and well-quasi-order theory that make use of choice in this form.

My talk builds on a number of recent studies which examine the constructive meaning of variants of Zorn's lemma, most importantly the work of U. Berger [1], who has given a direct and elegant modified realizability interpretation of a reformulation of the lemma known as open induction. The difference here is that I work in the alternative setting of Goedel's functional interpretation (which requires a different realizing term) and look towards giving a more general interpretation. Moreover, I emphasise the algorithmic behaviour of the realizer, linking it to my own recent research on giving learning-based realizers to the functional interpretation of classical principles [2]. The talk is very much about work in progress, and I aim to emphasise open problems and directions for future research.

[1] U. Berger: A computational interpretation of open induction. Proceedings of LICS 2004.

[2] T. Powell: Goedel's functional interpretation and the concept of learning. To appear in LICS 2016.
Alessio Santamaria. Substitution in Deep Inference via Atomic Flows
Abstract: This is a short talk in Alessio Guglielmi's group of talks. It describes the problem of substituting flows into other flows, as part of an effort to study the substitution of a derivation for the occurrences of a given atom in another derivation.
Birgit Elbl. Decomposing a labelled sequent calculus for the logic of subset spaces
Abstract: The logic of subset spaces SSL is a bimodal logic
introduced in [1] for formalising reasoning about
points and sets. Using the cut-free, one-sided, labelled sequent calculus
LSSL-p presented in [2], we prove results concerning the unimodal fragments in a
purely syntactic way. Furthermore, we relate the K-L-fragment of the calculus to
known systems for S5.

[1] A. Dabrowski, L.S. Moss, R. Parikh. Topological reasoning and the logic
of knowledge, Annals of Pure and Applied Logic 78 (1996), pp. 73--110.

[2] B. Elbl. A cut-free sequent calculus for the logic of subset spaces. Submitted.
Anton Setzer. Schemata for Proofs by Coinduction
Abstract: Proofs by induction are carried out by following schemata for induction,
which makes such proofs easier to carry out than directly using
the fact that the set of natural numbers is the least set
closed under zero and successor. So for proving all x. phi(x),
one does not first define A := { x in Nat | phi(x) }
and show that A is closed under 0 and successor. Instead, one uses
the schema of induction. Although applying the schema of induction
amounts to essentially the same as showing the closure properties of A,
the schema is much easier to use and to teach.
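For comparison, the schema of induction in question is the familiar one:

\[
\frac{\varphi(0) \qquad \forall x\,(\varphi(x) \rightarrow \varphi(x+1))}{\forall x\,\varphi(x)}
\]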

Proofs by coinduction usually follow directly the principle that the
coinductively defined set is the largest set satisfying its defining clauses.
For instance, to carry out a proof of bisimilarity,
one usually introduces a relation and shows that it is a bisimulation.
This makes proofs by coinduction cumbersome and difficult to teach.

In this talk we will introduce schemata for coinduction which
are similar to the schemata for induction. The use of
the coinduction hypothesis is made easier by
defining coinductively defined sets as largest sets allowing
observations, rather than as largest sets closed under
introduction rules.
For instance the set Stream of streams of natural numbers is the largest set
allowing observations
head : Stream -> Nat and tail : Stream -> Stream,
rather than being the largest set closed under
cons : Nat -> Stream -> Stream.

Based on this we will first introduce schemata for defining functions
by primitive corecursion or guarded recursion. This is dual
to the principle of primitive recursion. Then
we define schemata for coinductive proofs of equality.
Finally we introduce schemata for coinductively defined relations
such as bisimulation relations.
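As a small illustration of the observation-based view (a sketch in Haskell, our rendering; the names Stream, hd, tl, from, zipAdd and takeS are ours), guarded corecursion places each corecursive call directly under the constructor, so that every head/tail observation is answered in finitely many steps:

-- A stream is whatever supports the observations hd and tl (codata),
-- modelled here as a lazy Haskell record.
data Stream = Stream { hd :: Integer, tl :: Stream }

-- Guarded corecursion: the corecursive call sits directly under
-- the constructor.
from :: Integer -> Stream
from n = Stream n (from (n + 1))

-- Primitive corecursion for a pointwise sum, dual to primitive recursion.
zipAdd :: Stream -> Stream -> Stream
zipAdd s t = Stream (hd s + hd t) (zipAdd (tl s) (tl t))

-- Finite observations, used only to inspect examples.
takeS :: Int -> Stream -> [Integer]
takeS 0 _ = []
takeS n s = hd s : takeS (n - 1) (tl s)

main :: IO ()
main = print (takeS 5 (zipAdd (from 0) (from 1)))  -- [1,3,5,7,9]

A coinductive proof that two streams are equal then shows that they support the same observations, with the coinduction hypothesis usable only behind an observation, dually to the guardedness condition above.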

We will give examples of how to carry out coinductive proofs on
paper. These proofs will make use of the coinduction hypothesis,
subject to restrictions dual to those on the use of the
induction hypothesis in inductive proofs.
Peter Schuster, Davide Rinaldi and Daniel Wessel. Eliminating Disjunctions by Disjunction Elimination
Abstract: Completeness theorems, or contrapositive forms of Zorn's Lemma, are often invoked in elementary contexts in which the corresponding syntactical conservation theorems would suffice. We present a fairly universal form of such a syntactical conservation theorem for Horn sequents, using a remarkably versatile sandwich criterion given by Scott (1974).

We work with Scott's multi-conclusion entailment relations as extensions of Tarskian single-conclusion consequence relations. In a nutshell, the extra multi-conclusion axioms can be reduced to rules for the underlying single-conclusion relation that hold in all known mathematical instances. In practice this means folding up branchings of proof trees by referring to the given mathematical structures.
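For orientation (standard notation, not spelled out in the abstract): a multi-conclusion entailment relation in Scott's sense relates finite sets U, V of abstract statements and satisfies reflexivity, monotonicity and transitivity (cut):

\[
\frac{}{U \vdash V}\;(U \cap V \neq \emptyset)
\qquad
\frac{U \vdash V}{U, U' \vdash V, V'}
\qquad
\frac{U \vdash V, a \qquad U, a \vdash V}{U \vdash V}
\]

Single-conclusion consequence relations are recovered by restricting V to singletons.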

Applications include the separation and extension theorems known under the names of Krull-Lindenbaum, Artin-Schreier, Szpilrajn and -- this is work in progress -- Hahn-Banach. Related work has been done on individual instances: in locale theory (Mulvey-Pelletier 1991), dynamical algebra (Coste-Lombardi-Roy 2001, Lombardi 1997-98), formal topology (Cederquist-Coquand-Negri 1998) and proof theory (Coquand-Negri-von Plato 2004). Further motivation for our approach was given by Berger's celebrated paper at LICS 2004.
Dieter Spreen. Bitopological spaces and the continuity problem
Abstract: The continuity problem is the question of when effective (or Markov computable) maps between effectively given topological spaces are effectively continuous. It will be shown that this is always the case if the range of the map is effectively bi-regular. As will be shown, such spaces appear quite naturally in the context of this problem.
David Sherratt. Atomic Lambda Calculus and its connections with Sharing Graphs
Abstract: We present a research idea: to connect the atomic lambda calculus and sharing graphs through a formal graphical representation of the atomic lambda calculus, so that we may explore the area between full laziness and optimal reduction.
Benjamin Ralph. Decomposing First-Order Proofs using Deep Inference
Abstract: The deep-inference formalism, by allowing for very fine-grained inference steps and freer composition of proofs, has produced important results and innovations in various logics, especially classical propositional logic. A natural progression is to extend these insights to classical first-order logic (FOL) but, although a direct cut-elimination procedure has been provided, there has been no work so far that incorporates the many perspectives and techniques developed in the last ten years.
In the talk, I will outline a new cut-elimination procedure for FOL in deep inference, as well as a decomposition-style presentation of Herbrand’s Theorem, called a Herbrand Stratification, that is proved not as a corollary of cut-elimination but in tandem with it. In doing so, I hope to provide a different and perhaps better perspective on FOL normalisation, Herbrand’s Theorem, and their relationship. More concretely, there is good reason to believe that, as in propositional logic, this research can yield new results in proof complexity.
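For orientation, in its simplest form Herbrand's Theorem states that a purely existential formula is provable if and only if a finite disjunction of term instances is provable:

\[
\vdash \exists x\,\varphi(x)
\quad\Longleftrightarrow\quad
\vdash \varphi(t_1) \vee \cdots \vee \varphi(t_n) \ \text{ for some terms } t_1,\dots,t_n,
\]

where \(\varphi\) is quantifier-free. The Herbrand Stratification mentioned above presents this witnessing content as an explicit layer of the proof.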
Mizuhito Ogawa. Decidability by two semi-algorithms
Abstract: Recently, several interesting examples of decidability proofs consisting of two semi-algorithms have been given.
(1) Jerome Leroux, Vector addition system reachability problem, ACM POPL 2011.
(2) Helmut Seidl, Sebastian Maneth, Gregor Kemper, Equivalence of Deterministic Top-Down Tree-to-String Transducers is Decidable, IEEE FOCS 2015.
(3) Yuxi Fu, Checking Equality and Regularity for Normed BPA with Silent Moves, ICALP (2) 2013
Among them, (2) and (3) solve long-standing open problems, and (1) gives a new, simple proof of the decidability of reachability for vector addition systems (VAS), which can be traced back to E. W. Mayr (1981).
This presentation focuses on (1) and (2), which run two semi-algorithms: one tries to answer "yes" and the other tries to answer "no". The semi-algorithms answering "no" terminate, but without an effective bound, due to the non-constructive nature of the finite basis property for (1) and of the convergence of an ideal sequence in a Noetherian ring for (2).
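The overall scheme is easy to state (a schematic Haskell sketch, our illustration; real instances replace the stage-indexed tests by searches over certificates): if, on every input, at least one of the two semi-algorithms eventually answers, interleaving them stage by stage yields a total decision procedure.

-- A semi-algorithm, modelled as a stage-indexed test: True at stage n
-- means the search has succeeded within n steps.
type SemiAlg = Int -> Bool

-- Dovetail a "yes"-searcher and a "no"-searcher. Total on inputs for
-- which at least one of the two searches is guaranteed to succeed;
-- in general no constructive bound on the number of stages is available.
decide :: SemiAlg -> SemiAlg -> Bool
decide yes no = go 0
  where
    go n
      | yes n     = True
      | no n      = False
      | otherwise = go (n + 1)

-- Toy instance: is 10 even? Search for a witness 2*k = 10 ("yes")
-- against a witness 2*k+1 = 10 ("no").
main :: IO ()
main = print (decide (\n -> 2 * n == 10) (\n -> 2 * n + 1 == 10))  -- True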
Fredrik Nordvall Forsberg. A Type Theory for Comprehensive Parametric Polymorphism
Abstract: A class of models of System F is presented, where reflexive-graph-category structure for relational parametricity and fibrational models of impredicative polymorphism are combined. To achieve this, we modify the definition of fibrational model of impredicative polymorphism by adding one further ingredient to the structure: comprehension in the sense of Lawvere. Such comprehensive models, once further endowed with reflexive-graph-category structure, enjoy the expected consequences of parametricity. To prove this, we introduce a type theory extending System F with a sound and complete semantics with respect to the class of models considered. Proving the consequences of parametricity within this type theory requires new techniques since equality relations are not available, and standard arguments that exploit equality need to be reworked.

This is joint work with Alex Simpson and Neil Ghani.
Masahiko Sato. Proof theory of the lambda calculus
Abstract: We develop a proof theory of the lambda calculus where we study
the set of closed lambda terms by inductively defining the set
as a free algebra.

The novelty of the approach is that we construct and study
lambda calculus without using the notions of variables and alpha-equivalence.
In this approach we can study lambda terms as combinators and
obtain a clean proof of the Church-Rosser Theorem in the Minlog
proof assistant.

Page maintained by:

Ralph Matthes
https://www.irit.fr/~Ralph.Matthes/

Last modified: Sun May 1 21:51:26 CEST 2016