Mutual information

Mutual information is a quantity from information theory that measures the strength of the statistical relationship between two random variables; it is also known as synentropy. In contrast to the first-order synentropy of a Markov source, which expresses the redundancy of a source and should therefore be minimal, the synentropy of a channel gives the average information content that passes from the transmitter to the receiver and should therefore be maximal.

Sometimes the term relative entropy is used as well, but that properly denotes the Kullback-Leibler divergence.

Mutual information is closely related to entropy and conditional entropy, and it can be calculated in several equivalent ways:

Definition via the difference of source entropy and equivocation, or of receive entropy and misinformation:

I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)

Definition via probabilities:

I(X;Y) = Σ_x Σ_y p(x,y) · log( p(x,y) / ( p(x) · p(y) ) )

Definition via the Kullback-Leibler divergence:

I(X;Y) = D( p(x,y) ‖ p(x) · p(y) )

Definition via the expected value:

I(X;Y) = E[ log( p(X,Y) / ( p(X) · p(Y) ) ) ]
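As an illustration, the equivalence of these definitions can be checked numerically on a small example (a sketch in Python; the joint distribution is an arbitrary example, not taken from the article):

```python
import math

# Example joint distribution p(x, y) of two binary random variables
# (values chosen arbitrarily for illustration; rows: x, columns: y).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

# Marginal distributions p(x) and p(y).
px = [sum(row) for row in joint]
py = [sum(col) for col in zip(*joint)]

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def cond_entropy_x_given_y(joint, py):
    """Conditional entropy H(X|Y) = -sum p(x,y) * log2 p(x|y)."""
    h = 0.0
    for x in range(2):
        for y in range(2):
            p = joint[x][y]
            if p > 0:
                h -= p * math.log2(p / py[y])
    return h

# Definition via probabilities:
# I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) p(y)) )
mi_prob = sum(joint[x][y] * math.log2(joint[x][y] / (px[x] * py[y]))
              for x in range(2) for y in range(2) if joint[x][y] > 0)

# Definition via entropy difference: I(X;Y) = H(X) - H(X|Y)
mi_entropy = entropy(px) - cond_entropy_x_given_y(joint, py)

print(round(mi_prob, 6), round(mi_entropy, 6))  # both values agree
```

The sum over probabilities is literally the Kullback-Leibler divergence between the joint distribution and the product of the marginals, so the KL form needs no separate computation.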

If the mutual information vanishes, the two random variables are said to be statistically independent. The mutual information is maximal when one random variable can be computed entirely from the other.
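These two extreme cases can be shown numerically (a sketch; the two joint distributions are arbitrary illustrative examples):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as a nested list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * math.log2(p / (px[x] * py[y]))
               for x, row in enumerate(joint)
               for y, p in enumerate(row) if p > 0)

# Independent variables: p(x,y) = p(x) * p(y)  ->  I(X;Y) = 0
independent = [[0.25, 0.25],
               [0.25, 0.25]]

# Fully dependent variables: Y = X  ->  I(X;Y) = H(X) = 1 bit
deterministic = [[0.5, 0.0],
                 [0.0, 0.5]]

print(mutual_information(independent))    # 0.0
print(mutual_information(deterministic))  # 1.0
```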

Mutual information is based on Claude Shannon's definition of information in terms of entropy (uncertainty, average information content). A positive mutual information means that the uncertainty about one random variable is reduced under the assumption that the other is known; at maximal mutual information the uncertainty vanishes entirely. As the formal definition shows, knowledge of one random variable reduces the uncertainty about the other, and this reduction is exactly the mutual information.

Mutual information plays a role in data transmission, for example: it can be used to determine the channel capacity of a transmission line.

Correspondingly, the joint entropy H(X,Y) is composed of two mutually dependent entropies, and the mutual information follows from it as:

I(X;Y) = H(X) + H(Y) − H(X,Y)
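The relation I(X;Y) = H(X) + H(Y) − H(X,Y) between the joint entropy and the mutual information can likewise be checked numerically (a sketch; the joint distribution is an arbitrary example):

```python
import math

def entropy_from_probs(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Arbitrary example joint distribution p(x, y).
joint = [[0.3, 0.2],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal p(x)
py = [sum(col) for col in zip(*joint)]  # marginal p(y)
flat = [p for row in joint for p in row]

h_x = entropy_from_probs(px)
h_y = entropy_from_probs(py)
h_xy = entropy_from_probs(flat)         # joint entropy H(X,Y)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = h_x + h_y - h_xy
print(round(mi, 6))
```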

Various terms are used in the literature: the equivocation is also called "loss entropy" (German: Verlustentropie) and the misinformation "irrelevance". Mutual information is also referred to as "transinformation" or "average mutual information content".
