Rényi entropy

In information theory, the Rényi entropy, named after Alfréd Rényi, is a generalization of the Shannon entropy. The Rényi entropy belongs to a family of functions used to quantify the diversity, uncertainty, or randomness of a system.

The Rényi entropy of order α, where α > 0 and α ≠ 1, is defined as

$$H_\alpha(X) = \frac{1}{1-\alpha} \log_2 \left( \sum_{i=1}^{n} p_i^{\alpha} \right),$$

where X is a random variable with range {x_1, x_2, …, x_n} and p_i is the probability that X = x_i. If the probabilities p_i are all equal, then H_α(X) = log_2 n, independently of α. Otherwise, the entropies are monotonically non-increasing as a function of α.
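The definition translates directly into code. Below is a minimal sketch in Python (not part of the original article; the function name renyi_entropy is an illustrative choice), which treats the orders α = 1 and α = ∞ as the limits discussed in the special cases below:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, in bits, of a discrete distribution p.

    Illustrative sketch: alpha = 1 (Shannon entropy) and alpha = inf
    (min-entropy) are handled separately, since the general formula is
    defined only for alpha > 0, alpha != 1, and reaches them as limits.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # outcomes with p_i = 0 contribute nothing
    if alpha == 1:                     # limit alpha -> 1 (L'Hopital)
        return -np.sum(p * np.log2(p))
    if np.isinf(alpha):                # limit alpha -> infinity
        return -np.log2(p.max())
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)
```

For a uniform distribution the order does not matter: renyi_entropy([0.25] * 4, alpha) returns 2.0 bits for every alpha, matching the statement above.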

Here are some important special cases:

For α = 0,

$$H_0(X) = \log_2 n = \log_2 |X|,$$

which is the logarithm of the cardinality of X; this is sometimes also called the "Hartley entropy" of X.

The limit as α approaches 1 (by L'Hôpital's rule) gives

$$H_1(X) = -\sum_{i=1}^{n} p_i \log_2 p_i,$$

which corresponds to the Shannon entropy (information entropy).

Furthermore, for α = 2,

$$H_2(X) = -\log_2 \sum_{i=1}^{n} p_i^2,$$

which corresponds to the collision entropy ("Korrelationsentropie"). The limit as α → ∞ is

$$H_\infty(X) = -\log_2 \max_i p_i$$

and is called the min-entropy, since it is the smallest value among all the H_α.
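As a quick numerical check of these special cases and of the monotonicity claim, using the renyi_entropy sketch from above (the distribution is an arbitrary illustrative choice):

```python
p = [0.5, 0.25, 0.125, 0.125]
for a in (0, 1, 2, float("inf")):
    print(a, renyi_entropy(p, a))
# 0   -> 2.0   (Hartley entropy: log2 of the support size)
# 1   -> 1.75  (Shannon entropy)
# 2   -> ~1.54 (collision entropy)
# inf -> 1.0   (min-entropy: -log2 of the largest p_i)
```

As expected for a non-uniform distribution, the values decrease as α grows.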

The Rényi entropies are important in ecology and statistics as indices of diversity. They also lead to a range of indices of fractal dimension.
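To illustrate the diversity-index connection (a standard construction in ecology, not spelled out in this article): exponentiating the Rényi entropy yields the Hill number, or "effective number of species", of order α; with entropies in bits the exponential is 2^H. A hypothetical helper building on the sketch above:

```python
def hill_number(p, alpha):
    # Effective number of species of order alpha: the size of a uniform
    # community that has the same Renyi entropy as p (illustrative helper).
    return 2 ** renyi_entropy(p, alpha)

print(hill_number([0.25] * 4, 2))            # four equally common species -> 4.0
print(hill_number([0.7, 0.1, 0.1, 0.1], 2))  # dominance lowers it to about 1.9
```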
