Chernoff bound

In probability theory, the Chernoff inequality, named after Herman Chernoff, describes an upper bound for the probability that a sequence of independent Bernoulli trials deviates from its expected number of successes.

The Chernoff inequality is a versatile and widely used tool in the analysis of randomized algorithms in computer science.

Theorem

Let $X_1, \ldots, X_n$ be a sequence of independent Bernoulli trials with $P(X_i = 1) = p_i$ and $P(X_i = 0) = 1 - p_i$. Accordingly, $X := X_1 + \ldots + X_n$ describes the number of successes ($X_i = 1$) and $\mu := E[X] = p_1 + \ldots + p_n$ the expected number of successes of the experiment. Then for every $\delta > 0$

    $P(X > (1 + \delta)\mu) < \left( \frac{e^{\delta}}{(1 + \delta)^{(1 + \delta)}} \right)^{\mu}$

and, for every $0 < \delta \leq 1$,

    $P(X > (1 + \delta)\mu) < e^{-\frac{\mu \delta^2}{3}}.$
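Both bounds are straightforward to evaluate numerically. The following is a minimal sketch (the helper names chernoff_bound and chernoff_bound_simple are illustrative, not from the source):

    import math

    def chernoff_bound(mu, delta):
        # First bound: P(X > (1+delta)*mu) < (e^delta / (1+delta)^(1+delta))^mu
        return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

    def chernoff_bound_simple(mu, delta):
        # Second bound, valid for 0 < delta <= 1: P(X > (1+delta)*mu) < exp(-mu*delta^2/3)
        return math.exp(-mu * delta ** 2 / 3)

    print(chernoff_bound(50, 0.4))         # ~0.029
    print(chernoff_bound_simple(50, 0.4))  # ~0.070

The second bound is weaker but often easier to work with; for $\mu = 50$ and $\delta = 0.4$ it gives roughly 0.070 instead of 0.029.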

Proof

Let $t > 0$ first be an arbitrary constant. In the following, to simplify notation, denote by $Y := e^{tX}$ a new random variable. Due to the monotonicity of the map $x \mapsto e^{tx}$, it then follows that

    $P(X > (1 + \delta)\mu) = P\left(Y > e^{t(1 + \delta)\mu}\right) \leq \frac{E[Y]}{e^{t(1 + \delta)\mu}},$

where the last estimate follows from Markov's inequality. Now, since $1 + x \leq e^x$ for all real $x$, each individual factor satisfies

    $E\left[e^{tX_i}\right] = p_i e^t + (1 - p_i) = 1 + p_i(e^t - 1) \leq e^{p_i(e^t - 1)},$

and thus, by the independence of the $X_i$,

    $E[Y] = E\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} E\left[e^{tX_i}\right] \leq \prod_{i=1}^{n} e^{p_i(e^t - 1)} = e^{\mu(e^t - 1)}.$

From this it follows that

    $P(X > (1 + \delta)\mu) \leq \frac{e^{\mu(e^t - 1)}}{e^{t(1 + \delta)\mu}}.$

Choosing $t = \ln(1 + \delta) > 0$ turns the right-hand side into $\frac{e^{\mu\delta}}{(1 + \delta)^{(1 + \delta)\mu}}$, which proves the first inequality.
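This choice of $t$ is not arbitrary: it minimizes the exponent $\mu(e^t - 1) - t(1 + \delta)\mu$ of the right-hand side. A quick numerical check (the values $\mu = 5$ and $\delta = 0.4$ are chosen purely for illustration):

    import math

    mu, delta = 5.0, 0.4

    def exponent(t):
        # Exponent of the bound: mu*(e^t - 1) - t*(1 + delta)*mu
        return mu * (math.exp(t) - 1) - t * (1 + delta) * mu

    # The minimum over a fine grid agrees with the closed form t = ln(1 + delta)
    grid = [k / 1000 for k in range(1, 2001)]
    print(min(grid, key=exponent))  # ~0.336
    print(math.log(1 + delta))      # ~0.336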

Consider now $0 < \delta \leq 1$. Then we have

    $P(X > (1 + \delta)\mu) < \left( \frac{e^{\delta}}{(1 + \delta)^{(1 + \delta)}} \right)^{\mu} = e^{\mu\left(\delta - (1 + \delta)\ln(1 + \delta)\right)}.$

For part of the exponent of the right-hand term, set

    $f(\delta) := \delta - (1 + \delta)\ln(1 + \delta) + \frac{\delta^2}{3}.$

One can show by means of curve sketching and Taylor series expansion that $f(\delta) \leq 0$ always holds on $(0, 1]$, that is, $\delta - (1 + \delta)\ln(1 + \delta) \leq -\frac{\delta^2}{3}$. Due to the monotonicity of the exponential function, $e^{\mu\left(\delta - (1 + \delta)\ln(1 + \delta)\right)} \leq e^{-\frac{\mu \delta^2}{3}}$. Together with the first estimate, the assertion follows.
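The claim $f(\delta) \leq 0$ is also easy to check numerically; a minimal sketch sampling the interval $(0, 1]$:

    import math

    def f(delta):
        # f(delta) = delta - (1 + delta)*ln(1 + delta) + delta^2/3
        return delta - (1 + delta) * math.log(1 + delta) + delta ** 2 / 3

    # Sample (0, 1] at step 0.001; every value should be non-positive
    values = [f(k / 1000) for k in range(1, 1001)]
    assert max(values) <= 0
    print(max(values))  # approaches 0 near delta = 0 but never becomes positive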

Variants

  • A common variant of the Chernoff inequality can be formulated in terms of the standard deviation. Let $X_1, \ldots, X_n$ be discrete, independent random variables with $E[X_i] = 0$ and $|X_i| \leq 1$. Denote by $\sigma^2$ the variance of $X := X_1 + \ldots + X_n$. Then for each $0 \leq \lambda \leq 2\sigma$:

    $P(|X| \geq \lambda\sigma) \leq 2e^{-\frac{\lambda^2}{4}}$

    A Monte Carlo sanity check of this variant is sketched below.
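As an illustration (the setup is my own, not from the source), Rademacher variables $X_i \in \{-1, +1\}$ satisfy the assumptions, with $\sigma^2 = n$:

    import math
    import random

    n, trials, lam = 100, 100_000, 2.0
    sigma = math.sqrt(n)  # Var(X) = n for a sum of n Rademacher variables

    # Empirical frequency of |X| >= lam * sigma
    hits = sum(
        abs(sum(random.choice((-1, 1)) for _ in range(n))) >= lam * sigma
        for _ in range(trials)
    )
    print(hits / trials)                # ~0.046 empirically
    print(2 * math.exp(-lam ** 2 / 4))  # bound: ~0.736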

Examples

  • Consider the following question: how likely is it to obtain the result "tails" at least seven times in ten tosses of a fair coin? The coin tosses constitute Bernoulli trials with $p_i = \frac{1}{2}$, so $\mu = 5$, and "at least seven" corresponds to $\delta = \frac{2}{5}$. Thus, by the first Chernoff inequality:

    $P(X \geq 7) \leq \left( \frac{e^{2/5}}{(7/5)^{7/5}} \right)^{5} \approx 0.70$
  • Modify the above example only slightly and ask instead: how likely is it to obtain the result "tails" at least seventy times in a hundred tosses of a fair coin? Here $\mu = 50$ with $\delta = \frac{2}{5}$ as before, and the first Chernoff bound immediately proves to be much stronger:

    $P(X \geq 70) \leq \left( \frac{e^{2/5}}{(7/5)^{7/5}} \right)^{50} \approx 0.029$

    Both calculations are reproduced in the sketch below.
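The following sketch recomputes both bounds and compares them with the exact binomial tail probabilities (the helper names are illustrative):

    import math

    def chernoff_bound(mu, delta):
        # First Chernoff inequality: bound on P(X > (1+delta)*mu)
        return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

    def exact_tail(n, k):
        # Exact P(X >= k) for X ~ Bin(n, 1/2)
        return sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n

    print(chernoff_bound(5, 2/5), exact_tail(10, 7))     # ~0.70  vs ~0.17
    print(chernoff_bound(50, 2/5), exact_tail(100, 70))  # ~0.029 vs ~3.9e-05

The bound 0.70 for ten tosses is weak (the exact probability is about 0.17), while for a hundred tosses the bound drops to about 0.029.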