Law of large numbers

The laws of large numbers are certain mathematical theorems of probability theory.

In their simplest form, these theorems state that the relative frequency of a random outcome generally stabilizes around the theoretical probability of that outcome when the underlying random experiment is repeated again and again under the same conditions. The commonly used formulation that the relative frequency "approaches the probability more and more closely" is misleading, since outliers can still occur even after a large number of repetitions. The convergence is therefore not monotonic.

Formally, the laws are statements about the convergence of the arithmetic mean of random variables. They are usually divided into "strong" laws (almost sure convergence) and "weak" laws (convergence in probability).

Example: tossing a coin

The probability that a coin lands heads on a single toss is ½. The more often the coin is tossed, the less likely it becomes that the proportion of tosses showing heads (i.e., the relative frequency of the event "heads") deviates from the theoretical probability ½ by more than any given amount. On the other hand, it is quite likely that the absolute difference between the number of heads and half the total number of tosses grows.

In particular, these laws of large numbers do not say that an event which has so far occurred less often than expected must somehow "make up its deficit" and therefore occur more frequently in the future. This is a common fallacy among roulette and lottery players, who believe that the "overdue" number must now catch up in order to restore the statistical uniform distribution. There is, therefore, no law of compensation.

An example: suppose a sequence of coin tosses begins with "heads", "tails", "heads", "heads". Then "heads" has come up three times and "tails" once; "heads" has, so to speak, a lead of two tosses. After these four tosses the relative frequency of "heads" is ¾ and that of "tails" is ¼. If after a further 96 tosses the totals stand at 47 times "tails" to 53 times "heads", then the lead of "heads" after 100 tosses is even greater than after four tosses. However, the relative gap between "heads" and "tails" has shrunk considerably, and so has the difference between the relative frequency of "heads" and its expected value, which is precisely the statement of the law of large numbers. The value 53/100 = 0.53 is much closer to the expected value 0.5 than ¾ = 0.75 is.
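The coin-tossing example can be illustrated by simulation. The following is a minimal sketch; the number of tosses, the checkpoints, and the seed are arbitrary choices. It shows the relative frequency of heads settling near ½, while the absolute lead of one side need not shrink.

```python
import random

random.seed(1)  # arbitrary seed, chosen only to make the illustration reproducible

n_tosses = 100_000
heads = 0
for n in range(1, n_tosses + 1):
    heads += random.random() < 0.5  # count a "heads" with probability 1/2
    if n in (4, 100, 10_000, 100_000):
        rel_freq = heads / n           # relative frequency of heads
        abs_lead = abs(2 * heads - n)  # |#heads - #tails|
        print(f"n={n:>7}  relative frequency={rel_freq:.4f}  absolute lead={abs_lead}")
```

The relative frequency column approaches 0.5, while the absolute lead fluctuates and tends to grow in magnitude over long runs.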

Weak law for relative frequencies

The simplest case of a law of large numbers, the weak law for relative frequencies, is the main result of Jacob Bernoulli's Ars Conjectandi (1713). A random experiment with exactly two outcomes, called success and failure (that is, a Bernoulli experiment), is repeated n times independently. If p denotes the probability of success in a single trial, then the number H_n of successes is binomially distributed with parameters n and p. Its expected value is E(H_n) = np and its variance is Var(H_n) = np(1 − p).

For the relative frequency h_n = H_n/n it follows that E(h_n) = p and Var(h_n) = p(1 − p)/n. Applying the Chebyshev inequality to h_n gives

P(|h_n − p| ≥ ε) ≤ Var(h_n)/ε² = p(1 − p)/(nε²)

for every ε > 0. Since the right-hand side of the inequality converges to zero as n → ∞, it follows that

lim_{n→∞} P(|h_n − p| ≥ ε) = 0.

That is, for every ε > 0, however small, the probability that the relative frequency of successes does not lie in the interval (p − ε, p + ε) converges to zero as the number of trials approaches infinity.
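The Chebyshev bound can be checked numerically. The following is a minimal sketch; the parameter values p = 0.5 and ε = 0.1 and the sample sizes are arbitrary choices. It computes the exact tail probability for a binomial distribution and compares it with the bound p(1 − p)/(nε²).

```python
from math import comb

def binom_pmf(n, p, k):
    # probability of exactly k successes in n independent Bernoulli(p) trials
    return comb(n, k) * p**k * (1 - p)**(n - k)

def tail_prob(n, p, eps):
    # exact P(|H_n/n - p| >= eps) by summing the binomial pmf over the tails
    return sum(binom_pmf(n, p, k) for k in range(n + 1)
               if abs(k / n - p) >= eps)

p, eps = 0.5, 0.1
for n in (10, 100, 1000):
    exact = tail_prob(n, p, eps)
    chebyshev = p * (1 - p) / (n * eps**2)
    print(f"n={n:>4}  exact tail={exact:.6f}  Chebyshev bound={chebyshev:.4f}")
```

For small n the bound is loose (it can even exceed 1), but both the exact tail probability and the bound tend to zero as n grows, as the weak law asserts.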

Weak law of large numbers

One says that a sequence of random variables X_1, X_2, ... with finite expected values (see also Lp space) satisfies the weak law of large numbers if for all positive numbers ε:

lim_{n→∞} P(|(1/n) Σ_{i=1}^{n} (X_i − E(X_i))| > ε) = 0,

that is, if the arithmetic means of the centered random variables converge to 0 in probability. In the common case that all X_i have the same expected value μ, this is equivalent to saying that the arithmetic means (1/n) Σ_{i=1}^{n} X_i converge to μ in probability.
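The defining condition can be probed by simulation. The following is a minimal sketch; the distribution (uniform on [0, 1]), the tolerance ε = 0.05, the sample sizes, the number of Monte Carlo runs, and the seed are all arbitrary choices. It estimates the probability that the centered arithmetic mean deviates by more than ε, and this estimate shrinks toward 0 as n grows.

```python
import random

random.seed(0)  # arbitrary seed for reproducibility

def exceed_prob(n, eps, runs=2000):
    """Monte Carlo estimate of P(|mean of n centered U(0,1) draws| > eps)."""
    count = 0
    for _ in range(runs):
        s = sum(random.random() - 0.5 for _ in range(n))  # centered draws
        if abs(s / n) > eps:
            count += 1
    return count / runs

for n in (10, 100, 1000):
    print(f"n={n:>5}  estimated P(|mean| > 0.05) = {exceed_prob(n, 0.05):.3f}")
```

Note that this only illustrates convergence in probability: for each fixed n it estimates one tail probability, and these probabilities decrease as n increases.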

There are various conditions under which the weak law of large numbers holds. It holds, for example, if the random variables have finite variances that are bounded by a common constant and are pairwise uncorrelated, i.e., Cov(X_i, X_j) = 0 for i ≠ j.

The weak law of large numbers of Khinchin requires, as a condition for convergence in probability, that the random variables form a sequence of independent and identically distributed random variables with finite expected value.

The proofs of the above theorems can each be carried out using the Chebyshev inequality.

Strong Law of Large Numbers

One says that a sequence of random variables X_1, X_2, ... satisfies the strong law of large numbers if:

P(lim_{n→∞} (1/n) Σ_{i=1}^{n} (X_i − E(X_i)) = 0) = 1,

that is, if the arithmetic means of the centered random variables converge to 0 almost surely. If all X_i have the same expected value μ, this is equivalent to saying that the arithmetic means (1/n) Σ_{i=1}^{n} X_i converge to μ almost surely.

Assuming that fourth moments exist, the strong law of large numbers for sequences of independent, identically distributed random variables can be proved in a relatively elementary way with the help of the Markov inequality. The difficulties lie rather in the interpretation of the formal statements that are the subject of the strong law of large numbers.

The strong law of large numbers implies the weak law of large numbers. A strong law of large numbers holds, for example, if the sequence consists of independent and identically distributed random variables (Kolmogorov's second law of large numbers). One form of the strong law of large numbers for dependent random variables is the ergodic theorem.

Interpretation of the formal statements

Unlike classical sequences as studied in analysis, probability theory usually cannot make an absolute statement about the convergence of a sequence of random outcomes. The reason is that, for example, in a series of dice rolls, outcome sequences such as 6, 6, 6, ... cannot be ruled out. For such a sequence of outcomes, the arithmetic means formed from it would not converge to the expected value 3.5. However, the strong law of large numbers states that the event in which the arithmetic means do not converge to the expected value 3.5 has probability 0. Such an event is called an almost impossible event.
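The dice example can be illustrated by simulation. The following is a minimal sketch; the number of rolls, the checkpoints, and the seed are arbitrary choices. The running arithmetic mean of simulated fair-die rolls approaches the expected value 3.5, even though exceptional sequences such as 6, 6, 6, ... are not logically impossible, only of probability 0.

```python
import random

random.seed(42)  # arbitrary seed for a reproducible illustration

n_rolls = 1_000_000
total = 0
for n in range(1, n_rolls + 1):
    total += random.randint(1, 6)  # one roll of a fair six-sided die
    if n in (10, 1_000, 1_000_000):
        print(f"n={n:>8}  running mean={total / n:.4f}")
```

The printed running means drift toward 3.5 as n grows, which is the almost sure convergence asserted by the strong law for this i.i.d. sequence.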

The subject of the laws of large numbers is, for a given sequence of random variables X_1, X_2, ..., the sequence of arithmetic means of the centered random variables:

(1/n) Σ_{i=1}^{n} (X_i − E(X_i)).

Because of the problems described, the formal characterization of the convergence of this sequence to the value 0 must start not only from an arbitrarily small prescribed tolerance distance ε, as for an ordinary sequence of numbers, but also from an arbitrarily small prescribed tolerance probability δ. The statement of the weak law of large numbers then means that for any prescribed tolerance distance ε and tolerance probability δ, for a sufficiently large chosen index n, a deviation exceeding ε occurs with probability at most δ. The strong law of large numbers, by contrast, refers to the event that any one of the deviations occurring from an index n onward exceeds the tolerance distance ε.


History of the laws of large numbers

A law of large numbers was first formulated by Jakob Bernoulli in 1689, although it was published only posthumously in 1713. Bernoulli called his version of the weak law of large numbers the golden theorem. The first version of a strong law of large numbers, for the special case of a coin toss, was published in 1909 by Émile Borel. In 1917, Francesco Cantelli was the first to prove a general version of the strong law of large numbers.

The history of the strong law of large numbers reached a certain conclusion with the theorem proved by N. Etemadi in 1981. Etemadi's theorem establishes the validity of the strong law of large numbers under the assumption that the random variables are integrable (i.e., have a finite expected value), are identically distributed, and are pairwise independent. The existence of a variance is not required.
