Bayesian inference


Bayesian statistics is a branch of modern statistics in which inference is phrased in terms of three kinds of statements:

  • A: a statement about an unknown phenomenon (the hypothesis),
  • B: a statement about observations of the phenomenon (the data), and
  • C: a statement of additional background knowledge.

The probability P(A | C) is called the a priori (prior) probability, P(A | B, C) the a posteriori (posterior) probability, and P(B | A, C) the likelihood (in English historically also called the inverse probability).

Examples

Example of Laplace

Laplace derived the theorem of Thomas Bayes independently and applied it to narrow down the mass of Saturn and other planets:

  • A: the mass of Saturn lies in a certain interval,
  • B: data from observatories on the mutual perturbation of Jupiter and Saturn, and
  • C: the mass of Saturn must be neither so small that Saturn loses its rings nor so large that it destroys the solar system.

" Pour en donner quelques applications intéressantes, j'ai profité de l'immense travail que vient de terminer M. Bouvard sur les mouvemens de Jupiter et de Saturne, dont il a construit des tables très précises. Il a discuté avec le plus grand soin les oppositions et les quadratures de ces deux planètes, observées par Bradley et par les astronomes qui l'ont suivi jusqu'à ces dernières années; il en a conclu les corrections des élémens de leur mouvement, et leurs masses comparées à celle du Soleil, prise pour unité. Ses calculs lui donnent la masse de Saturne égale à la 3512e partie de celle du Soleil. En leur appliquant mes formules de probabilité, je trouve qu'il y a onze mille à parier contre un, que l'erreur de ce résultat n'est pas un centième de sa valeur, ou, ce qui revient à très peu près au même, qu'après un siècle de nouvelles observations ajoutées aux précédentes, et discutées de la même manière, le nouveau résultat ne différera pas d'un centième de celui de M. Bouvard. "

Translation: " To give some interesting applications of them, I have taken advantage of the immense work that M. Bouvard has just finished on the motions of Jupiter and Saturn, of which he has constructed very precise tables. He has discussed with the greatest care the oppositions and quadratures of these two planets, observed by Bradley and by the astronomers who have followed him up to these last years; from them he has deduced the corrections to the elements of their motion and their masses compared with that of the Sun, taken as the unit. His calculations give him the mass of Saturn as the 3512th part of that of the Sun. Applying my formulas of probability to them, I find that the odds are eleven thousand to one that the error of this result is not one hundredth of its value, or, what comes to very nearly the same thing, that after a century of new observations, added to the previous ones and discussed in the same manner, the new result will not differ by one hundredth from that of M. Bouvard. "
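Laplace's reasoning can be imitated with a small grid computation. Everything below is an illustrative sketch: the prior interval (statement C), the observed value, and the Gaussian measurement model are assumptions made for the example, not Laplace's actual numbers or method.

```python
import numpy as np

# C: background knowledge restricts the mass (in solar masses) to an interval;
# the bounds here are invented purely for illustration.
m = np.linspace(1 / 5000, 1 / 2500, 2001)
prior = np.ones_like(m)                      # flat prior on the allowed interval

# B: an observed value with assumed ~1% Gaussian measurement noise.
obs = 1 / 3512
sigma = 0.01 * obs
likelihood = np.exp(-0.5 * ((m - obs) / sigma) ** 2)

# A: the posterior distribution over the mass, by Bayes' theorem.
post = prior * likelihood
post /= post.sum()                           # normalize on the grid
mean = float((m * post).sum())
print(f"posterior mean is about 1/{1 / mean:.0f} solar masses")
```

Because the assumed prior is flat and the interval is wide compared with the noise, the posterior concentrates near the observed value, mirroring Laplace's narrow error bound.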

Application to hypothesis testing

The appeal of Bayesian statistics can be seen, for example, in hypothesis testing. If E is the observed event and H0 the null hypothesis, then traditional statistics computes the probability P(E | H0), i.e. how likely it is to observe E if the null hypothesis is true. If this probability is too small (less than the predetermined significance level), the null hypothesis is rejected.

What one would actually like to know, however, is how likely the null hypothesis is to be true given that the event E has been observed. Using Bayes' theorem, this can be calculated as

  P(H0 | E) = P(E | H0) · P(H0) / P(E),

with

  • P(H0), the a priori probability of H0 (the assessment before the data),
  • P(E | H0), the conditional probability of E given that the null hypothesis H0 is true; viewed as a function of H0 it is called the likelihood function,
  • P(E), the unconditional probability of E. It is a normalizing constant, sometimes referred to as the evidence. By the law of total probability it can be calculated as P(E) = P(E | H0) · P(H0) + P(E | H1) · P(H1), where H1 is the alternative hypothesis to H0,
  • P(H0 | E), the posterior probability of H0 given E (the updated assessment).
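The quantities above can be combined in a short numerical sketch. All probabilities below are assumed values chosen for illustration; the point is only the mechanics of the update.

```python
# Illustrative Bayesian hypothesis test; every probability here is an assumption.
p_h0 = 0.5           # prior P(H0)
p_h1 = 1 - p_h0      # prior P(H1) of the alternative hypothesis
p_e_given_h0 = 0.03  # likelihood P(E | H0)
p_e_given_h1 = 0.30  # likelihood P(E | H1)

# Law of total probability: P(E) = P(E|H0) P(H0) + P(E|H1) P(H1)
p_e = p_e_given_h0 * p_h0 + p_e_given_h1 * p_h1

# Bayes' theorem: P(H0 | E) = P(E | H0) P(H0) / P(E)
p_h0_given_e = p_e_given_h0 * p_h0 / p_e
print(round(p_h0_given_e, 4))  # → 0.0909
```

In this made-up example P(E | H0) = 0.03 lies below a 5% significance level, so a classical test would reject H0; yet the posterior P(H0 | E) is about 0.09, which shows that the two questions are genuinely different.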

Derivation

In his article of 1946, Cox states three postulates from which Bayesian statistics can be derived:

  • the plausibility of a statement is a real number and depends on the information available about the statement,
  • plausibilities must agree with common sense (for example, greater evidence for a statement yields greater plausibility), and
  • plausibilities must be consistent: if a plausibility can be derived in several ways, every derivation must yield the same value.

From these postulates, the mathematical foundations of Bayesian statistics can be derived:

  P(A ∧ B | C) = P(A | B ∧ C) · P(B | C)   (product rule)
  P(A | B) + P(¬A | B) = 1                 (sum rule)

with A, B, C arbitrary propositions, S the certain statement (P(S | C) = 1), and ¬A the negation of the statement A.
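The product and sum rules can be verified numerically on any finite joint distribution. The 2×2 table of probabilities below is an arbitrary example, with the background knowledge C left implicit:

```python
# Joint probabilities over two propositions A and B (arbitrary example, sums to 1).
p = {(True, True): 0.2, (True, False): 0.3,
     (False, True): 0.1, (False, False): 0.4}

def prob(pred):
    """Probability of the event selected by a predicate over (A, B)."""
    return sum(v for (a, b), v in p.items() if pred(a, b))

p_a = prob(lambda a, b: a)
p_b = prob(lambda a, b: b)
p_ab = prob(lambda a, b: a and b)
p_a_given_b = p_ab / p_b  # conditional probability P(A | B)

# Product rule: P(A and B) = P(A | B) * P(B)
assert abs(p_ab - p_a_given_b * p_b) < 1e-12
# Sum rule: P(A | ...) + P(not A | ...) = 1 (here with no extra conditioning)
assert abs(p_a + prob(lambda a, b: not a) - 1) < 1e-12
```

Such a check is of course no substitute for Cox's derivation; it only illustrates that ordinary probability tables obey the two rules.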
