Metamathematics is the mathematical analysis of the foundations of mathematics.

Around 1920 the mathematician David Hilbert called for all of mathematics to be placed on the basis of a complete and consistent system of axioms. This effort became known as Hilbert's program. For the analysis of the foundations of mathematics by mathematical methods he coined the term metamathematics (modeled on metaphysics).

Hilbert's program appeared to fail when Gödel's incompleteness theorems showed that no axiom system can meet all of Hilbert's demands. In particular, it is impossible to develop a consistent formal system of sufficient strength in which every true arithmetical statement is also provable.
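Gödel's first incompleteness theorem can be stated schematically as follows (a standard modern formulation, added here for reference; the symbols T and G_T are notational conveniences, not from the original text):

```latex
% First incompleteness theorem (schematic form):
% if T is a consistent, effectively axiomatized theory containing
% elementary arithmetic, then there is a sentence G_T such that
T \nvdash G_T
\qquad\text{and}\qquad
T \nvdash \neg G_T ,
% i.e. G_T is undecidable in T, although (on the standard reading)
% G_T is a true arithmetical statement.
```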

After consistency proofs for fragments of arithmetic by Leopold Löwenheim, Thoralf Albert Skolem, Jacques Herbrand and Mojżesz Presburger, Gerhard Gentzen achieved a consistency proof for first-order Peano arithmetic, although he had to use so-called transfinite induction. Common to all these proofs is that, in accordance with Gödel's second incompleteness theorem, they cannot be carried out within arithmetic itself.
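Gentzen's proof rests on transfinite induction up to the ordinal ε₀; the principle involved can be sketched as follows (standard notation, added here for illustration):

```latex
% \varepsilon_0 is the least ordinal fixed point of \alpha \mapsto \omega^{\alpha}:
\varepsilon_0 \;=\; \sup\{\,\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \dots\,\},
\qquad \omega^{\varepsilon_0} = \varepsilon_0 .
% Gentzen showed that transfinite induction up to \varepsilon_0 suffices
% to prove the consistency of PA; by Gödel's second theorem this
% induction principle cannot itself be proved in PA:
\mathsf{PA} \nvdash \mathrm{TI}(\varepsilon_0).
```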

Concerning decidability, there are important results by Alonzo Church, who was able to show the undecidability of quantifier logic of every order. The concept of recursion serves as the formal counterpart of computability.
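Church's undecidability result for the first-order case can be summarized as follows (a standard formulation, added here for clarity):

```latex
% Church (1936): the set of logically valid first-order sentences
\mathrm{Val} \;=\; \{\,\varphi \mid\ \models \varphi\,\}
% is undecidable: no recursive (computable) procedure determines
% membership in Val. By Gödel's completeness theorem, Val is
% nevertheless recursively enumerable: validity can be confirmed,
% though not refuted, by systematic proof search.
```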

Paul Lorenzen gave a consistency proof for ramified type theory in 1951. This proof establishes the consistency of parts of classical analysis. In his book published in 1962 he summarized metamathematics as the "mathematics of metatheories", where a metatheory is a (constructive or axiomatic) theory of axiomatic theories.

By using the ω-rule (infinite induction), K. Schütte obtained a complete semi-formal system ("semi-formalism") for arithmetic, and with it a consistency proof by embedding arithmetic into constructive mathematics, in the tradition of Gentzen.
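The ω-rule mentioned above can be displayed as an inference with infinitely many premises (standard notation, added here for illustration):

```latex
% The \omega-rule (infinite induction): from separate proofs of every
% numerical instance, infer the universal statement:
\frac{\ \vdash \varphi(0)\qquad \vdash \varphi(1)\qquad \vdash \varphi(2)\qquad \cdots\ }
     {\vdash \forall x\,\varphi(x)}
% Because a derivation may have infinitely many premises, the result is
% a semi-formal system rather than a formal system in Hilbert's
% finitary sense; this is what makes completeness attainable.
```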