via Cauchy sequences, Dedekind cuts, trace functions, several variants of sum approximations and continued fractions. Let S be a class of subrecursive functions.

The set of irrational numbers that can be obtained with functions from S depends on the representation. We compare the sets obtained by the different representations.

a key concept for the understanding of Natural Deduction. The considerations are restricted to intuitionistic implication, with hints to richer languages and substructural logics.

In Natural Deduction we are in a fine position concerning local confluence: the reductions of many usual logical connectives are locally confluent. Furthermore, certain subclasses of reductions with nice properties can be defined, in particular subclasses that can be shown to terminate and to preserve local confluence. By Newman's lemma this yields confluence of such subclasses of reductions. For confluence of the full reductions a commutation property is stated, showing how a reduction commutes with its own subreduction, and termination of the full reductions can be shown by using a subreduction cover.
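The role of Newman's lemma can be illustrated on a toy terminating rewrite system (an editorial Python sketch; the rule set and all names are illustrative and have nothing to do with the Natural Deduction reductions of the talk): we check local confluence by joining one-step peaks, and termination then guarantees a unique normal form.

```python
from functools import lru_cache

# A tiny terminating string rewriting system: "ba" -> "ab".
# It terminates because each step decreases the number of inversions.
RULES = [("ba", "ab")]

def steps(s):
    """All one-step rewrites of s."""
    out = []
    for lhs, rhs in RULES:
        i = s.find(lhs)
        while i != -1:
            out.append(s[:i] + rhs + s[i + len(lhs):])
            i = s.find(lhs, i + 1)
    return out

@lru_cache(maxsize=None)
def normal_forms(s):
    """Set of normal forms reachable from s (finite, since the system terminates)."""
    succ = steps(s)
    if not succ:
        return frozenset({s})
    return frozenset().union(*(normal_forms(t) for t in succ))

def locally_confluent_at(s):
    """Check that every one-step peak t1 <- s -> t2 is joinable."""
    succ = steps(s)
    return all(normal_forms(t1) & normal_forms(t2)
               for t1 in succ for t2 in succ)

assert locally_confluent_at("babba")
# Termination + local confluence => confluence (Newman's lemma),
# hence the normal form is unique:
assert normal_forms("babba") == frozenset({"aabbb"})
```

Since the toy system is terminating, checking local confluence at every reachable word suffices for confluence; the assertions above probe exactly that on one example.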

Modelling the human lung

The framework for the template CI is Nelson's syntactic approach to Nonstandard Analysis, called internal set theory, and its fragments based on Gödel's T as introduced in [1]. Notable results are that, when the template CI is applied to theorems involving the nonstandard definitions of continuity, compactness, and open set, respectively, the latter definitions are converted into the associated definitions from constructive or computable analysis (resp. continuity with a modulus, total boundedness, and effectively open set).

Finally, we establish that a theorem of Nonstandard Analysis has the same computational content as its 'highly constructive' Herbrandisation. Thus, we establish an 'algorithmic two-way street' between so-called hard and soft analysis, i.e. between the worlds of numerical and qualitative results. However, the study of the Herbrandisations of nonstandard theorems also leads to a new class of functionals (not involving Nonstandard Analysis) with rather strange properties. Chief among these new functionals is the special fan functional, which can be computed easily in intuitionistic mathematics, but cannot be computed by the Turing jump functional (2E) or even much stronger comprehension axioms. Similar functionals exist for most theorems from the Reverse Mathematics zoo.

Atomic Systems in Proof-Theoretic Semantics

There are two main motivating factors behind this work. First, the task of making constructive sense of Zorn's lemma is an interesting and challenging proof-theoretic problem in its own right. My emphasis here is on providing a natural realizer for the functional interpretation of the lemma which clearly reflects its computational content. This alone is a non-trivial task, as even in the weak cases of Zorn's lemma considered here such a realizer will necessarily be based on an extremely strong form of recursion, undefinable even in Goedel's system T. The second factor is that a direct computational interpretation of Zorn's lemma should enable us to extract intuitive programs from non-constructive proofs which rely on it. This in particular paves the way for a proof-theoretic analysis of several important theorems in abstract algebra and well-quasi-order theory that make use of choice in this form.

My talk builds on a number of recent studies which examine the constructive meaning of variants of Zorn's lemma, most importantly the work of U. Berger [1], who has given a direct and elegant modified realizability interpretation of a reformulation of the lemma known as open induction. The difference here is that I work in the alternative setting of Goedel's functional interpretation (which requires a different realizing term) and look towards giving a more general interpretation. Moreover, I emphasise the algorithmic behaviour of the realizer, linking it to my own recent research on giving learning-based realizers to the functional interpretation of classical principles [2]. The talk is very much about work in progress, and I aim to emphasise open problems and directions for future research.

[1] U. Berger: A computational interpretation of open induction. Proceedings of LICS 2004.

[2] T. Powell: Goedel's functional interpretation and the concept of learning. To appear in LICS 2016.

introduced in [1] for formalising reasoning about points and sets. Using the cut-free, one-sided, labelled sequent calculus LSSL-p presented in [2], we prove results concerning the unimodal fragments in a purely syntactic way. Furthermore, we relate the K-L-fragment of the calculus to known systems for S5.

[1] A. Dabrowski, L. S. Moss, R. Parikh. Topological reasoning and the logic of knowledge. Annals of Pure and Applied Logic 78 (1996), pp. 73--110.

[2] B. Elbl. A cut-free sequent calculus for the logic of subset spaces. Submitted.

which makes it easier to carry out such proofs than by directly using the fact that the natural numbers are the least set closed under zero and successor. So, to prove all x. phi(x), one does not first define A := { x in Nat | phi(x) } and show that A is closed under 0 and successor. Instead, one uses the schema of induction. Although applying the schema of induction amounts to essentially the same as showing the closure properties of A, the schema is much easier to use and to teach.

Proofs by coinduction usually follow directly the principle that the coinductively defined set is the largest set satisfying its defining principles. For instance, to carry out a proof of bisimilarity, one usually introduces a relation and shows that it is a bisimulation relation. This makes proofs by coinduction cumbersome and difficult to teach.

In this talk we will introduce schemata for coinduction which are similar to the schemata for induction. The use of the coinduction hypothesis is made easier by defining coinductively defined sets as largest sets allowing observations, rather than as largest sets closed under introduction rules. For instance, the set Stream of streams of natural numbers is the largest set allowing the observations

head : Stream -> Nat and tail : Stream -> Stream,

rather than the largest set closed under

cons : Nat -> Stream -> Stream.

Based on this we will first introduce schemata for defining functions by primitive corecursion or guarded recursion; this is dual to the principle of primitive recursion. Then we define schemata for coinductive proofs of equality. Finally, we introduce schemata for coinductively defined relations such as bisimulation relations.

We will give examples of how to carry out coinductive proofs on paper. These proofs will make use of the coinduction hypothesis, subject to restrictions that are dual to those for the use of the induction hypothesis in inductive proofs.
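The observation-based view of streams can be made concrete in a few lines of Python (an editorial sketch, not part of the talk; the names Stream, from_n, zip_with and bisimilar_upto are ours): a stream is anything offering head and tail observations, functions are defined by guarded corecursion with the recursive call hidden under the tail observation, and bisimilarity can be tested to any finite depth.

```python
class Stream:
    """A stream given by its observations: head() and tail()."""
    def __init__(self, head, tail):
        self._head, self._tail = head, tail
    def head(self):
        return self._head()
    def tail(self):
        return self._tail()

def from_n(n):
    """n, n+1, n+2, ...  Guarded corecursion: the recursive call
    from_n(n + 1) is guarded under the tail observation."""
    return Stream(lambda: n, lambda: from_n(n + 1))

def zip_with(f, s, t):
    """Pointwise combination of two streams, again by guarded corecursion."""
    return Stream(lambda: f(s.head(), t.head()),
                  lambda: zip_with(f, s.tail(), t.tail()))

def take(s, n):
    """Observe the first n elements (for testing only)."""
    out = []
    for _ in range(n):
        out.append(s.head())
        s = s.tail()
    return out

def bisimilar_upto(s, t, n):
    """Finite approximation of bisimilarity: equal first n observations."""
    return take(s, n) == take(t, n)

nats = from_n(0)
evens = zip_with(lambda x, y: x + y, nats, nats)
assert take(evens, 5) == [0, 2, 4, 6, 8]
assert bisimilar_upto(evens, zip_with(lambda x, y: 2 * x, nats, nats), 10)
```

Note that the streams are never built from cons; everything about them is derived from the two observations, mirroring the "largest set allowing observations" reading above.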

Eliminating Disjunctions by Disjunction Elimination

We work with Scott's multi-conclusion entailment relations, extending Tarskian single-conclusion consequence relations. In a nutshell, the extra multi-conclusion axioms can be reduced to rules for the underlying single-conclusion relation that hold in all known mathematical instances. In practice this means folding up branchings of proof trees by referring to the given mathematical structures.

Applications include the separation and extension theorems known under the names of Krull-Lindenbaum, Artin-Schreier, Szpilrajn and -- this is work in progress -- Hahn-Banach. Related work has been done on individual instances: in locale theory (Mulvey-Pelletier 1991), dynamical algebra (Coste-Lombardi-Roy 2001, Lombardi 1997-8), formal topology (Cederquist-Coquand-Negri 1998) and proof theory (Coquand-Negri-von Plato 2004). Further motivation for our approach was given by Berger's celebrated paper in LICS 2004.

Bitopological spaces and the continuity problem

In the talk, I will give the outline of a new cut elimination procedure for FOL in deep inference, as well as a decomposition-style presentation of Herbrand’s Theorem called a Herbrand Stratification that is proved not as a corollary of cut elimination, but in tandem with it. In doing so, I hope to provide a different and perhaps better perspective on FOL normalisation, Herbrand’s Theorem, and their relationship. More concretely, there is good reason to believe that, as in propositional logic, this research can provide us with new results in proof complexity.

Decidability by two semi-algorithms

(1) Jerome Leroux, Vector addition system reachability problem, ACM POPL 2011.

(2) Helmut Seidl, Sebastian Maneth, Gregor Kemper, Equivalence of Deterministic Top-Down Tree-to-String Transducers is Decidable, IEEE FOCS 2015.

(3) Yuxi Fu, Checking Equality and Regularity for Normed BPA with Silent Moves, ICALP (2) 2013.

Among them, (2) and (3) solve long-standing open problems, and (1) gives a new, simple proof of the decidability of reachability for vector addition systems (VAS), which can be traced back to E. W. Mayr (1981).

This presentation focuses on (1) and (2), each of which runs two semi-algorithms, one trying to say "yes" and the other trying to say "no". The termination of the semi-algorithms saying "no" cannot be established constructively, due to the non-constructive nature of the finite basis property for (1) and of the convergence of an ideal sequence in a Noetherian ring for (2).
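The "two semi-algorithms" pattern can be sketched generically (an editorial Python illustration of the scheme only, not of the algorithms in (1) or (2); the perfect-square instance and all names are ours): interleave a procedure searching for a "yes" witness with one searching for a "no" witness, and stop as soon as either answers.

```python
def decide(yes_search, no_search):
    """Interleave two semi-algorithms, given as generators that yield
    None while still searching and a non-None value once they know.
    Terminates provided at least one of them eventually answers."""
    while True:
        if next(yes_search) is not None:
            return "yes"
        if next(no_search) is not None:
            return "no"

# Toy instance: is n a perfect square?
def square_yes(n):
    """Semi-algorithm for "yes": search for k with k*k == n."""
    k = 0
    while True:
        yield True if k * k == n else None
        k += 1

def square_no(n):
    """Semi-algorithm for "no": search for a termination certificate,
    here simply k with k*k > n (no square was hit below it)."""
    k = 0
    while True:
        yield True if k * k > n else None
        k += 1

assert decide(square_yes(49), square_no(49)) == "yes"
assert decide(square_yes(50), square_no(50)) == "no"
```

In the toy instance the "no" searcher has an obvious effective bound; the point of the abstracts above is precisely that for (1) and (2) the analogous bound exists only non-constructively.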

This is joint work with Alex Simpson and Neil Ghani.

Proof theory of the lambda calculus

the set of closed lambda terms by inductively defining the set as a free algebra. The novelty of the approach is that we construct and study the lambda calculus without using the notions of variables and alpha-equivalence. In this approach we can study lambda terms as combinators and can give a clean proof of the Church-Rosser Theorem in the Minlog proof assistant.

Page maintained by: Ralph Matthes
Last modified: Sun 1 May 2016, 21:51:26 CEST