Bernstein inequalities (probability theory)

Bernstein's inequality is a concentration estimate in probability theory and was proved by Sergei Bernstein. It can be seen as a refinement of Hoeffding's inequality: the estimate is improved by adding an additional condition on the variance of the random variables.

The inequality holds for arbitrarily distributed bounded random variables with zero mean and bounded variance.


Theorem

Let $(\Omega, \mathcal{A}, P)$ be a probability space. Let $B$ and $\sigma$ be positive real numbers and $n$ a natural number. Let $X_1, \dots, X_n$ be independent random variables with $\mathbb{E}[X_i] = 0$, $\mathbb{E}[X_i^2] \le \sigma^2$ and $|X_i| \le B$ almost surely for all $i$.

Then, for all $\tau > 0$:

$$P\left( \frac{1}{n} \sum_{i=1}^{n} X_i \ \ge\ \sqrt{\frac{2\sigma^2 \tau}{n}} + \frac{2B\tau}{3n} \right) \ \le\ e^{-\tau}.$$
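As a small illustration (function name and example values are my own, not from the source), the deviation term $\sqrt{2\sigma^2\tau/n} + 2B\tau/(3n)$ appearing in the theorem can be computed directly:

```python
import math

def bernstein_deviation(sigma2: float, B: float, n: int, tau: float) -> float:
    """Deviation eps such that, by the Bernstein inequality,
    P(mean of the X_i >= eps) <= exp(-tau)."""
    return math.sqrt(2.0 * sigma2 * tau / n) + 2.0 * B * tau / (3.0 * n)

# Example: sigma^2 = 1, B = 1, n = 50 observations, tau = 1
# (failure probability e^{-1}, about 0.37).
print(bernstein_deviation(1.0, 1.0, 50, 1.0))  # 0.2 + 2/150, about 0.2133
```

Note that the second term decays like $1/n$ while the first decays like $1/\sqrt{n}$, so for large $n$ the variance term dominates.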

Lemma

For all $x \ge 0$:

$$(1+x)\ln(1+x) - x \ \ge\ \frac{x^2}{2 + \frac{2x}{3}}.$$

A proof of the lemma and a more detailed proof of the theorem can be found in the proof archive.
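The lemma's inequality, $(1+x)\ln(1+x) - x \ge \frac{x^2}{2 + 2x/3}$ for $x \ge 0$, can be sanity-checked numerically; a minimal sketch (function names are mine):

```python
import math

def lhs(x: float) -> float:
    # Left-hand side of the lemma: (1+x) ln(1+x) - x
    return (1.0 + x) * math.log(1.0 + x) - x

def rhs(x: float) -> float:
    # Right-hand side of the lemma: x^2 / (2 + 2x/3)
    return x * x / (2.0 + 2.0 * x / 3.0)

# Spot-check the inequality on a grid of x in [0, 20].
grid_ok = all(lhs(k / 100.0) >= rhs(k / 100.0) for k in range(0, 2001))
print(grid_ok)  # True
```

Both sides agree up to third order at $x = 0$ (their Taylor expansions both start $x^2/2 - x^3/6$), which is why the resulting Bernstein bound is tight for small deviations.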

Proof (theorem)

This proof follows the approach of "Support Vector Machines" (see References).

For simplicity, we write $S_n := \frac{1}{n} \sum_{i=1}^{n} X_i$ and set, for $\varepsilon > 0$,

$$\tau := \frac{n\varepsilon^2}{2\sigma^2 + \frac{2B\varepsilon}{3}}.$$

After rearranging (solving for $\varepsilon$ and using $\sqrt{a+b} \le \sqrt{a} + \sqrt{b}$), the theorem follows once it is shown that for all $\varepsilon > 0$:

$$P\left( S_n \ge \varepsilon \right) \ \le\ \exp\left( -\frac{n\varepsilon^2}{2\sigma^2 + \frac{2B\varepsilon}{3}} \right).$$

Now this inequality is shown. We apply the Markov inequality to $e^{t n S_n}$ for a $t > 0$ ($t$ is chosen specifically later):

$$P\left( S_n \ge \varepsilon \right) = P\left( e^{t n S_n} \ge e^{t n \varepsilon} \right) \ \le\ e^{-t n \varepsilon}\, \mathbb{E}\left[ \exp\left( t \sum_{i=1}^{n} X_i \right) \right].$$

Since the random variables are independent by assumption, product and expectation can be interchanged. Expanding the exponential function into its power series, which is dominated by an integrable majorant, also lets us interchange expectation and summation. With $\mathbb{E}[X_i] = 0$ and the condition $\mathbb{E}[X_i^2] \le \sigma^2$ it follows:

$$\mathbb{E}\left[ e^{t X_i} \right] = 1 + \sum_{k=2}^{\infty} \frac{t^k\, \mathbb{E}[X_i^k]}{k!}.$$

With the estimate $\mathbb{E}[X_i^k] \le \sigma^2 B^{k-2}$ for $k \ge 2$ (which follows from $|X_i| \le B$ almost surely) one obtains:

$$\mathbb{E}\left[ e^{t X_i} \right] \ \le\ 1 + \frac{\sigma^2}{B^2} \sum_{k=2}^{\infty} \frac{(tB)^k}{k!} = 1 + \frac{\sigma^2}{B^2}\left( e^{tB} - 1 - tB \right).$$

By the inequality $1 + x \le e^x$, valid for all $x \in \mathbb{R}$, one obtains:

$$P\left( S_n \ge \varepsilon \right) \ \le\ \exp\left( \frac{n\sigma^2}{B^2}\left( e^{tB} - 1 - tB \right) - t n \varepsilon \right).$$

One sets:

$$t := \frac{1}{B} \ln\left( 1 + \frac{\varepsilon B}{\sigma^2} \right),$$

which gives, with $x := \frac{\varepsilon B}{\sigma^2}$:

$$P\left( S_n \ge \varepsilon \right) \ \le\ \exp\left( -\frac{n\sigma^2}{B^2} \left( (1+x)\ln(1+x) - x \right) \right).$$

From the lemma (above) it follows, with $x = \frac{\varepsilon B}{\sigma^2}$:

$$P\left( S_n \ge \varepsilon \right) \ \le\ \exp\left( -\frac{n\sigma^2}{B^2} \cdot \frac{x^2}{2 + \frac{2x}{3}} \right) = \exp\left( -\frac{n\varepsilon^2}{2\sigma^2 + \frac{2B\varepsilon}{3}} \right). \qquad \Box$$

Application

Problem 1

Suppose $\sigma^2$, $B$ and $n$ are known.

What, at most, is the probability that the average $\frac{1}{n} \sum_{i=1}^{n} X_i$ exceeds a given value $k$?

Solving the (rearranged) Bernstein inequality yields:

$$P\left( \frac{1}{n} \sum_{i=1}^{n} X_i \ge k \right) \ \le\ \exp\left( -\frac{n k^2}{2\sigma^2 + \frac{2Bk}{3}} \right).$$

The probability that the average exceeds the value $k$ is at most $\exp\left( -\frac{n k^2}{2\sigma^2 + \frac{2Bk}{3}} \right)$.
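A minimal sketch of this computation (the function name and the concrete numbers are my own assumptions): evaluating the bound $\exp\left(-\frac{nk^2}{2\sigma^2 + 2Bk/3}\right)$ for given parameters.

```python
import math

def problem1_bound(sigma2: float, B: float, n: int, k: float) -> float:
    """Bernstein upper bound on P(mean of the X_i >= k)."""
    return math.exp(-n * k * k / (2.0 * sigma2 + 2.0 * B * k / 3.0))

# Assumed example values: sigma^2 = 1, B = 1, n = 100, k = 0.3.
p = problem1_bound(1.0, 1.0, 100, 0.3)
print(p)  # roughly 0.0167: the mean exceeds 0.3 with probability below 1.7%
```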

Problem 2

Suppose $\sigma^2$, $B$, $n$ and $\tau$ are known.

It should hold:

$$P\left( \frac{1}{n} \sum_{i=1}^{n} X_i \ge k \right) \ \le\ e^{-\tau}.$$

Calculate $k$ using the Bernstein inequality. By the theorem, one may take:

$$k = \sqrt{\frac{2\sigma^2 \tau}{n}} + \frac{2B\tau}{3n}.$$
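A sketch of this second problem (function name and example values are assumptions of mine): compute $k = \sqrt{2\sigma^2\tau/n} + 2B\tau/(3n)$, here parameterized by a failure probability $\delta = e^{-\tau}$.

```python
import math

def problem2_k(sigma2: float, B: float, n: int, delta: float) -> float:
    """k from the Bernstein inequality so that P(mean >= k) <= delta."""
    tau = math.log(1.0 / delta)  # delta = e^{-tau}
    return math.sqrt(2.0 * sigma2 * tau / n) + 2.0 * B * tau / (3.0 * n)

# Assumed example: sigma^2 = 1, B = 1, n = 100, delta = 5%.
k = problem2_k(1.0, 1.0, 100, 0.05)
print(k)  # about 0.265
```

As expected, $k$ shrinks as $n$ grows: more realizations allow a tighter deviation at the same confidence level.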

Problem 3

Let $\sigma^2$ and $B$ be known.

How many realizations $n$ are actually required so that, for given values of $k$ and $\tau$, it holds that $P\left( \frac{1}{n} \sum_{i=1}^{n} X_i \ge k \right) \le e^{-\tau}$?

You need at least

$$n \ \ge\ \frac{\tau \left( 2\sigma^2 + \frac{2Bk}{3} \right)}{k^2}$$

realizations.
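The sample-size requirement $n \ge \tau\left(2\sigma^2 + \frac{2Bk}{3}\right)/k^2$ can be sketched as follows (function name is mine; the example values are assumed):

```python
import math

def problem3_n(sigma2: float, B: float, k: float, tau: float) -> int:
    """Smallest n with exp(-n k^2 / (2 sigma^2 + 2 B k / 3)) <= exp(-tau)."""
    return math.ceil(tau * (2.0 * sigma2 + 2.0 * B * k / 3.0) / (k * k))

# Assumed example: sigma^2 = 1, B = 1, k = 0.2, tau = 2.5
# (i.e. failure probability e^{-2.5}, about 8%).
print(problem3_n(1.0, 1.0, 0.2, 2.5))  # 134
```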

Example

As random variables, consider coin tosses. The $i$-th coin toss is denoted by $X_i$; we set $X_i = 1$ for heads and $X_i = -1$ for tails.

What is the probability that the mean value after $n$ tosses exceeds the value $0.2$?

Since heads and tails each occur with probability 50%, the expected value is $\mathbb{E}[X_i] = 0$. Since the random variables can take only the values $-1$ and $1$, $B = 1$ can be chosen.

Moreover $\mathbb{E}[X_i^2] = 1$, so $\sigma^2 = 1$ can be selected.

Now select $k = 0.2$; here $n$ is the number of flips. By Bernstein's inequality:

$$P\left( \frac{1}{n} \sum_{i=1}^{n} X_i > 0.2 \right) \ \le\ \exp\left( -\frac{n \cdot 0.2^2}{2 + \frac{2 \cdot 0.2}{3}} \right) = \exp\left( -\frac{3n}{160} \right).$$

So, for example, at $50$ tosses:

$$P\left( \frac{1}{50} \sum_{i=1}^{50} X_i > 0.2 \right) \ \le\ e^{-15/16} \approx 0.39.$$

For the mean to exceed $0.2$, heads would have to come up at least $31$ times in the $50$ tosses.

That is, the probability of obtaining heads at least $31$ times in $50$ tosses is less than $e^{-15/16} \approx 0.39$.
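As a sanity check (the simulation parameters and seed are my own choices), the bound $e^{-15/16} \approx 0.39$ for $50$ tosses can be compared against a simulated fair coin:

```python
import math
import random

random.seed(0)

n, k, trials = 50, 0.2, 20000
# Bernstein bound with sigma^2 = 1, B = 1: exp(-n k^2 / (2 + 2k/3))
bound = math.exp(-n * k * k / (2.0 + 2.0 * k / 3.0))

# Simulate: heads = +1, tails = -1; count runs whose mean exceeds k.
hits = 0
for _ in range(trials):
    total = sum(random.choice((-1, 1)) for _ in range(n))
    if total / n > k:  # mean > 0.2, i.e. at least 31 heads out of 50
        hits += 1

empirical = hits / trials
print(bound, empirical)  # bound about 0.3916; the empirical frequency is far smaller
```

The gap between the bound and the simulated frequency illustrates that Bernstein's inequality is a worst-case estimate over all distributions satisfying the assumptions, not a sharp value for the fair coin.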
