Negentropy

Negentropy is short for negative entropy; it is the exact opposite of entropy and a special case of syntropy. In general, negentropy is defined as entropy with a negative sign. It can be interpreted as a measure of how far a random variable deviates from the uniform distribution. Since the entropy of a uniformly distributed random sequence is maximal, the negentropy of such a sequence is minimal. In the information-theoretic interpretation of entropy, the negentropy of a string is therefore large if the string contains much information, and small if the string is random.
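As a worked illustration of the statement that the uniform distribution maximizes entropy (the four-outcome alphabet and the skewed example distribution below are chosen purely for illustration):

    % Entropy of the uniform distribution on n = 4 outcomes (the maximum):
    \[ H_{\text{uniform}} = -\sum_{i=1}^{4} \tfrac{1}{4} \log_2 \tfrac{1}{4} = \log_2 4 = 2 \text{ bits} \]
    % Entropy of a skewed example distribution (0.7, 0.1, 0.1, 0.1):
    \[ H_{\text{skewed}} = -\left( 0.7 \log_2 0.7 + 3 \cdot 0.1 \log_2 0.1 \right) \approx 1.36 \text{ bits} \]
    % Negentropy read simply as "entropy with a negative sign":
    \[ -H_{\text{uniform}} = -2 \; < \; -H_{\text{skewed}} \approx -1.36 \]

The uniform distribution thus has the largest entropy and, in this simple sense, the smallest negentropy.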

Definition

Negentropy is defined in different ways. The simplest definition (see above) is: negentropy is negative entropy.

Another definition introduces negentropy for discrete random variables in such a way that it corresponds to the redundancy: by a suitable normalization it can be arranged that the negentropy J of a uniformly distributed variable is 0, which leads to the following definition:

J(y) = H(y_uniform) - H(y)

where y_uniform is a uniformly distributed random variable with the same correlation (and covariance) matrix as y, and H is the entropy (the first term in this definition is only a normalization; what matters is that H enters with a negative sign).
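The following Python sketch illustrates this definition in the simplest one-dimensional discrete case, where matching the correlation and covariance matrix reduces to taking the uniform distribution over the same set of outcomes; the function names are chosen for illustration only:

    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete probability distribution."""
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    def negentropy(probs):
        """J(y) = H(y_uniform) - H(y), with y_uniform the uniform
        (maximum-entropy) distribution over the same number of outcomes."""
        return math.log2(len(probs)) - entropy(probs)

    print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0: a uniformly distributed variable has negentropy 0
    print(negentropy([0.7, 0.1, 0.1, 0.1]))      # roughly 0.64: positive for any deviation from uniformity

In this discrete form J is directly related to the redundancy: dividing it by the maximum entropy log2(n) gives 1 - H/H_max, the usual definition of redundancy.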

Interpretation and Miscellaneous

The concept of negative entropy was coined by Erwin Schrödinger in his book What is Life?. He defines life as something that takes up negative entropy and stores it. This means that life exports entropy and keeps its own entropy low: negentropy import is entropy export.

By negative entropy, however, Schrödinger actually meant free energy, as he wrote in a footnote. The objection often raised against this idea, that it contradicts the Second Law of Thermodynamics, does not take into account that the process takes place with a supply of energy (such as sunlight in the case of plants).

Léon Brillouin later shortened the name to negentropy in order to express the matter in a more "positive" way: a living system takes in negentropy and stores it. Organisms change negentropically through their use of energy. Because of this energy requirement, organisms are open systems.

With regard to its use in other fields, the concept of negentropy does not appear to be clearly defined.

In the Lexikon der Biologie, negentropy is defined as the average information content of the individual characters in a given string, which establishes a link to information theory. This also corresponds to the above example of the uniform distribution, since a "uniformly distributed" variable contains no additional information compared with a "uniform distribution".
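The "average information content of the individual characters in a given string" can be estimated from the character frequencies, as in the following Python sketch (an illustration only, not taken from the Lexikon der Biologie):

    import math
    from collections import Counter

    def bits_per_character(text):
        """Average information content per character (in bits), estimated
        from the empirical character frequencies of the string."""
        counts = Counter(text)
        n = len(text)
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    print(bits_per_character("aaaaaaaa"))  # 0.0 bits: a completely regular string
    print(bits_per_character("abcdabcd"))  # 2.0 bits: four characters used equally often

A string whose characters all occur equally often attains the maximum value, log2 of the alphabet size, while a completely regular string yields 0.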

Sociological systems theorists define the term somewhat differently, namely as a "negation of entropy" or as "increased complexity". Negentropy is here synonymous with order or information and is thus an indicator of the creation or definition of systems. Another (free) rendering would be "the absence of (relatively complete) entropy" or, correspondingly, "the absence of chaos".

The opposite of negative entropy is entropy, i.e. the measure of the disorder or randomness of a system.

Negentropy and order

The term order, which is usually equated with negentropy, is subject to similar contradictions as the term negentropy itself.
