Fisher information

The Fisher information (named after the statistician Ronald Fisher) is a key quantity in mathematical statistics. It can be defined for a family of probability densities and provides statements about the best possible quality of parameter estimates in this model.

Definition

The underlying statistical model consists of a family of probability densities f(x;\theta) with an unknown parameter \theta \in \Theta, where \Theta is an open subset of the real numbers. The Fisher information of a random variable X with density f(x;\theta) is then defined as

    I(\theta) = \operatorname{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta} \ln f(X;\theta)\right)^{2}\right],

where \operatorname{E}_\theta denotes the expected value with respect to the density f(x;\theta).
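As a standard illustration (added here, not part of the original text), consider a Bernoulli distribution with success probability \theta \in (0,1). The density, the score and the resulting Fisher information are

    f(x;\theta) = \theta^{x}(1-\theta)^{1-x}, \quad x \in \{0,1\},

    \frac{\partial}{\partial\theta} \ln f(X;\theta) = \frac{X}{\theta} - \frac{1-X}{1-\theta} = \frac{X-\theta}{\theta(1-\theta)},

    I(\theta) = \frac{\operatorname{Var}_\theta(X)}{\theta^{2}(1-\theta)^{2}} = \frac{1}{\theta(1-\theta)},

which grows without bound as \theta approaches 0 or 1 and is smallest at \theta = 1/2.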

If the random variables X_1, \dots, X_n are independent and identically distributed (i.i.d.) with density f(x;\theta), the joint density is

    f(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} f(x_i;\theta),

and the Fisher information of the sample is therefore

    I_n(\theta) = n \, I(\theta).
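As a concrete instance (an added illustration, not taken from the text above): for n i.i.d. observations from a normal distribution N(\mu, \sigma^{2}) with known variance \sigma^{2}, the score of a single observation is (X-\mu)/\sigma^{2}, so I(\mu) = 1/\sigma^{2} and the whole sample carries Fisher information

    I_n(\mu) = \frac{n}{\sigma^{2}},

which grows linearly in the sample size.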

Properties and Applications

Under the regularity condition

    \operatorname{E}_\theta\!\left[\frac{\partial}{\partial\theta} \ln f(X;\theta)\right] = 0,

the Fisher information is additive: for independent random variables X and Y,

    I_{(X,Y)}(\theta) = I_X(\theta) + I_Y(\theta)

holds; in the i.i.d. case this yields the relation I_n(\theta) = n \, I(\theta) stated above. This property is a simple application of the fact that the variances of independent random variables add.
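For example (an illustration added here), if X \sim N(\theta, \sigma_X^{2}) and Y \sim N(\theta, \sigma_Y^{2}) are independent measurements of the same parameter \theta with known variances, then I_X(\theta) = 1/\sigma_X^{2} and I_Y(\theta) = 1/\sigma_Y^{2}, so the pair carries

    I_{(X,Y)}(\theta) = \frac{1}{\sigma_X^{2}} + \frac{1}{\sigma_Y^{2}},

i.e. the precisions of the two measurements add.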

Moreover, for a sufficient statistic T(X) one can show that the Fisher information computed from the density f(x;\theta) is the same as the Fisher information computed from the density g(t;\theta) of T, where the factorization f(x;\theta) = h(x)\, g(T(x);\theta) holds.
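A standard example (added for illustration): for an i.i.d. Bernoulli sample X_1, \dots, X_n with parameter \theta, the statistic T = X_1 + \dots + X_n is sufficient and follows a binomial distribution \operatorname{Bin}(n, \theta). Its score is (T - n\theta)/(\theta(1-\theta)), and since \operatorname{Var}_\theta(T) = n\theta(1-\theta),

    I_T(\theta) = \frac{n}{\theta(1-\theta)} = n \, I(\theta),

which is exactly the Fisher information of the full sample; no information is lost by reducing the data to T.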

The Fisher information is used in particular in the Cramér-Rao inequality, where, if the regularity condition mentioned above holds, it supplies a lower bound for the variance of an unbiased estimator of \theta.
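The following is a minimal numerical sketch (not part of the original text), assuming a normal model N(\mu, \sigma^{2}) with known variance, for which I(\mu) = 1/\sigma^{2} and the Cramér-Rao bound for n observations is \sigma^{2}/n; the sample mean attains it.

    import numpy as np

    # Monte Carlo sketch of the Cramer-Rao bound for the mean of a normal
    # model N(mu, sigma^2) with known sigma^2 (illustrative assumptions).
    # Here I(mu) = 1/sigma^2, so the bound for an unbiased estimator based
    # on n i.i.d. observations is 1/(n * I(mu)) = sigma^2 / n.
    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 2.0, 3.0, 50, 20_000

    samples = rng.normal(mu, sigma, size=(reps, n))
    estimates = samples.mean(axis=1)       # sample mean as estimator of mu

    empirical_var = estimates.var()        # observed variance of the estimator
    cramer_rao_bound = sigma**2 / n        # 1/(n * I(mu))

    print(f"empirical variance of the sample mean: {empirical_var:.4f}")
    print(f"Cramer-Rao lower bound:                {cramer_rao_bound:.4f}")
    # Both numbers agree up to Monte Carlo error: the sample mean attains
    # the bound and is therefore an efficient estimator in this model.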

Extensions to higher dimensions

If the model depends on several parameters \theta = (\theta_1, \dots, \theta_k), where \Theta is now an open subset of \mathbb{R}^{k}, the Fisher information can be defined as the symmetric matrix whose entries are

    \bigl(I(\theta)\bigr)_{ij} = \operatorname{E}_\theta\!\left[\frac{\partial}{\partial\theta_i} \ln f(X;\theta) \cdot \frac{\partial}{\partial\theta_j} \ln f(X;\theta)\right].

The properties described above are essentially retained.
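A standard two-parameter example (added here for illustration): for a normal distribution with \theta = (\mu, \sigma^{2}) and both parameters unknown, the matrix evaluates to

    I(\mu, \sigma^{2}) = \begin{pmatrix} \frac{1}{\sigma^{2}} & 0 \\ 0 & \frac{1}{2\sigma^{4}} \end{pmatrix};

the vanishing off-diagonal entries mean that the scores for the mean and the variance are uncorrelated.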
