Cramér–Rao bound

The Cramér–Rao inequality, named after the mathematicians Harald Cramér and Calyampudi Radhakrishna Rao, provides in mathematical statistics a lower bound for the variance of an estimator of an unknown parameter.

Definition

Let $\theta$ be an unknown parameter, let $X$ be a random variable whose density $f(x;\theta)$ depends on $\theta$, and let $T = T(X)$ be an estimator of $\theta$ with expectation $\psi(\theta) = \operatorname{E}_\theta[T]$. Let $I(\theta)$ denote the Fisher information. Then the Cramér–Rao inequality

$$\operatorname{Var}_\theta(T) \geq \frac{\psi'(\theta)^2}{I(\theta)}$$

holds, provided that the following regularity conditions are met:

  • The support of the probability densities does not depend on the unknown parameter.
  • The probability densities are continuously differentiable with respect to the parameter.
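
A minimal numerical sketch (the Bernoulli model and all variable names here are illustrative assumptions, not from the article): for $n$ i.i.d. Bernoulli($p$) observations the Fisher information of the sample is $I(p) = n/(p(1-p))$, and the Monte Carlo variance of the unbiased sample-mean estimator can be compared against the bound $1/I(p)$.

```python
import numpy as np

# Illustrative check of the Cramér-Rao bound (assumed Bernoulli example).
# For n i.i.d. Bernoulli(p) draws, I(p) = n / (p * (1 - p)); the sample
# mean is unbiased for p, so psi'(p) = 1 and the bound is 1 / I(p).
rng = np.random.default_rng(0)
p, n, trials = 0.3, 100, 20_000

# Each trial: sample mean of n Bernoulli(p) draws (a Binomial(n, p) / n).
estimates = rng.binomial(n, p, size=trials) / n

print(f"empirical variance of estimator: {estimates.var():.6f}")
print(f"Cramér-Rao bound 1/I(p):         {p * (1 - p) / n:.6f}")
```

Here the sample mean in fact attains the bound, which anticipates the notion of efficiency below.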

Efficiency and optimality

If the equality

$$\operatorname{Var}_\theta(T) = \frac{\psi'(\theta)^2}{I(\theta)}$$

holds for an estimator $T$, it is called efficient. If it is also unbiased, it is optimal with respect to the mean squared error. For an unbiased estimator, the lower bound simplifies to the inverse Fisher information:

$$\operatorname{Var}_\theta(T) \geq \frac{1}{I(\theta)}.$$
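
As a concrete illustration (an assumed example, not taken from the article): for $X_1, \ldots, X_n$ i.i.d. $\mathcal{N}(\mu, \sigma^2)$ with known $\sigma^2$, the Fisher information is $I(\mu) = n/\sigma^2$, and the sample mean is unbiased with variance exactly $\sigma^2/n = 1/I(\mu)$, hence efficient. A short simulation confirms this:

```python
import numpy as np

# Sketch (assumed normal-location example): the sample mean of n draws from
# N(mu, sigma^2) with known sigma is unbiased, and its variance sigma^2 / n
# equals the inverse Fisher information 1 / I(mu) exactly, so it is efficient.
rng = np.random.default_rng(1)
mu, sigma, n, trials = 2.0, 1.5, 50, 20_000

means = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

print(f"empirical variance of the sample mean: {means.var():.6f}")
print(f"inverse Fisher information sigma^2/n:  {sigma**2 / n:.6f}")
```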

Regularity and proof idea

The proof of the Cramér–Rao inequality rests mainly on the Cauchy–Schwarz inequality and on two model assumptions governing the interchangeability of differentiation and integration.

On the one hand,

$$\frac{\partial}{\partial\theta} \int f(x;\theta)\,\mathrm{d}x = \int \frac{\partial}{\partial\theta} f(x;\theta)\,\mathrm{d}x$$

is required to hold, and on the other hand we assume

$$\frac{\partial}{\partial\theta} \int T(x)\, f(x;\theta)\,\mathrm{d}x = \int T(x)\, \frac{\partial}{\partial\theta} f(x;\theta)\,\mathrm{d}x.$$

The first identity implies that the score $U = \frac{\partial}{\partial\theta} \log f(X;\theta)$ has expectation zero, and the second that $\operatorname{Cov}_\theta(T, U) = \psi'(\theta)$; since $\operatorname{Var}_\theta(U) = I(\theta)$, direct insertion into the Cauchy–Schwarz inequality then yields the assertion.
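
These three ingredients can be checked numerically; the normal-location model below is an illustrative assumption, not part of the article:

```python
import numpy as np

# Sketch of the proof mechanics for an assumed normal-location model with
# known sigma: the score U has mean 0, Cov(T, U) = psi'(mu) (= 1 for the
# unbiased sample mean), and Var(U) = I(mu) = n / sigma^2. Cauchy-Schwarz
# gives Cov(T, U)^2 <= Var(T) * Var(U), i.e. the Cramér-Rao bound.
rng = np.random.default_rng(2)
mu, sigma, n, trials = 0.5, 2.0, 30, 200_000

x = rng.normal(mu, sigma, size=(trials, n))
T = x.mean(axis=1)                       # estimator of mu
U = (x - mu).sum(axis=1) / sigma**2      # score of the full sample

print(f"E[U]       ~ {U.mean():+.4f}   (theory: 0)")
print(f"Cov(T, U)  ~ {np.cov(T, U)[0, 1]:.4f}   (theory: psi'(mu) = 1)")
print(f"Var(U)     ~ {U.var():.4f}   (theory: I(mu) = {n / sigma**2:.4f})")
```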

Multidimensional formulation

Under similar regularity conditions, the Cramér–Rao inequality can also be formulated for a multi-dimensional parameter. The statement then concerns the covariance matrix of the multivariate estimator and yields a relation in the sense of the Loewner order on matrices.

Let $\theta = (\theta_1, \ldots, \theta_k)^\top$ be the vector of unknown parameters and $X$ a multivariate random variable with associated probability density $f(x;\theta)$.

The estimator

$$T(X) = (T_1(X), \ldots, T_k(X))^\top$$

for the parameter vector has the covariance matrix

$$\operatorname{Cov}_\theta(T) = \operatorname{E}_\theta\!\left[(T - \operatorname{E}_\theta[T])(T - \operatorname{E}_\theta[T])^\top\right].$$

For an unbiased estimator, the Cramér–Rao inequality in this case is

$$\operatorname{Cov}_\theta(T(X)) \succeq I(\theta)^{-1}$$

in the sense of the Loewner order, where

$$I(\theta)_{ij} = \operatorname{E}_\theta\!\left[\frac{\partial}{\partial\theta_i} \log f(X;\theta)\, \frac{\partial}{\partial\theta_j} \log f(X;\theta)\right]$$

is the Fisher information matrix.
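
A small simulation sketch (the two-parameter normal model here is an illustrative assumption): for $\theta = (\mu, \sigma^2)$ the Fisher information matrix of $n$ i.i.d. normal observations is $\operatorname{diag}(n/\sigma^2,\; n/(2\sigma^4))$, and one can check empirically that $\operatorname{Cov}(T) - I(\theta)^{-1}$ is positive semidefinite for the unbiased estimator $T = (\bar X, s^2)$:

```python
import numpy as np

# Sketch (assumed two-parameter normal model): for theta = (mu, sigma^2) the
# Fisher information matrix of n i.i.d. N(mu, sigma^2) samples is
# diag(n / sigma^2, n / (2 * sigma^4)). The Loewner-order statement says
# Cov(T) - I(theta)^{-1} is positive semidefinite for any unbiased T.
rng = np.random.default_rng(3)
mu, sigma2, n, trials = 1.0, 2.0, 40, 100_000

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
# Unbiased estimator T = (sample mean, unbiased sample variance).
T = np.stack([x.mean(axis=1), x.var(axis=1, ddof=1)], axis=1)

cov_T = np.cov(T, rowvar=False)
I_inv = np.diag([sigma2 / n, 2 * sigma2**2 / n])

# All eigenvalues of Cov(T) - I^{-1} should be >= 0 (up to sampling noise);
# the mean component attains the bound, while the sample variance does not
# (its variance is 2*sigma^4/(n-1) > 2*sigma^4/n).
print(np.linalg.eigvalsh(cov_T - I_inv))
```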

Applications

The Cramér–Rao inequality can be used to assess the dynamic permeability of membranes, an approach that finds frequent application mainly in biotechnology and nanotechnology.
