Information theory

Information theory is a mathematical theory in the field of probability theory and statistics that goes back to the American mathematician Claude Shannon.

It deals with concepts such as information, entropy, information transmission, data compression, coding, and related topics.

Beyond mathematics, computer science, and communications engineering, the information-theoretic treatment of communication is also used to describe communication systems in other fields, such as media in journalism, the nervous system in neurology, DNA and protein sequences in molecular biology, and knowledge in information science and documentation.

Shannon's theory uses the concept of entropy to characterize the information density (information content) of messages. The less uniformly a message is structured, the higher its entropy. Alongside the concept of entropy, the Shannon–Hartley law, named after Claude Elwood Shannon and Ralph Hartley, is fundamental to information theory. It describes the theoretical upper limit of the channel capacity, that is, the maximum data rate a transmission channel can achieve without transmission errors, as a function of its bandwidth and signal-to-noise ratio.
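In standard modern notation (the usual formulation, not quoted from the article's sources), the entropy H of a message source with symbol probabilities p(x) and the Shannon–Hartley channel capacity C are written as

    H(X) = -\sum_x p(x) \log_2 p(x)

    C = B \log_2 (1 + S/N)

where B is the channel bandwidth in hertz and S/N the signal-to-noise power ratio; C is then the maximum error-free data rate in bits per second.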

History

Claude Elwood Shannon made significant contributions to the theory of data transmission and to probability theory, especially in the 1940s and 1950s.

He asked how lossless data transmission can be ensured over electronic (and today also optical) channels. In particular, this involves separating the data signals from the background noise. One also tries to detect errors that occur during transmission and to correct them. For this, it is necessary to send along additional redundant data (i.e., data carrying no additional information) that allow the receiver to verify or correct the received data.
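As a minimal illustration of such redundancy (a textbook device added here, not a scheme attributed to Shannon in this article), a single parity bit appended to a block of bits allows the receiver to detect any single-bit transmission error:

    def add_parity(bits):
        """Append an even-parity bit so the total number of 1s is even."""
        return bits + [sum(bits) % 2]

    def check_parity(bits_with_parity):
        """Return True if no single-bit error is detected."""
        return sum(bits_with_parity) % 2 == 0

    word = [1, 0, 1, 1]
    sent = add_parity(word)          # [1, 0, 1, 1, 1]
    assert check_parity(sent)        # transmitted without error

    sent[2] ^= 1                     # flip one bit to simulate channel noise
    assert not check_parity(sent)    # the single-bit error is detected

A single parity bit can only detect (not locate or correct) an odd number of flipped bits; practical error-correcting codes add more structured redundancy for the same purpose.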

It is doubtful, and was not claimed by Shannon himself, that his study A Mathematical Theory of Communication ("information theory"), published in 1948, is of substantial importance for questions outside communications technology. The concept of entropy he used, which is associated with thermodynamics, is a formal analogy for a mathematical expression. In general, information theory can be defined as an engineering theory at a high level of abstraction. It reflects the trend toward the scientification of technology, which led to the formation of the engineering sciences. The point of reference of Shannon's theory is the accelerated development of electrical communications technology, in the forms of telegraphy, telephony, radio, and television, in the first half of the 20th century. Before and alongside Shannon, Harry Nyquist, R. V. L. Hartley, and Karl Küpfmüller also made important contributions to the theory of communications engineering. Mathematical clarifications relevant to information theory were provided by Norbert Wiener, who also brought it considerable publicity in the context of his reflections on cybernetics.

An overarching question for communications engineers was how economically efficient and interference-free message transmission can be achieved. In this context, the advantages of modulation were recognized, i.e., of changing the form of a message by technical means. In a technical context, two basic forms of messages can be distinguished: continuous and discrete. The common forms of representing information/messages can be assigned to these: writing (discrete), speech (continuous), and images (continuous).

In the late 1930s there was a technical breakthrough, when pulse code modulation made it possible to represent a message present as a continuum in a satisfactory discrete approximation. With this method it became possible to telegraph speech. Shannon, who worked for the Bell Telephone Laboratories, was familiar with this technical development. The significance of his theory for engineering lies in the fact that he defined information as a "physical quantity" with a unit of measure or counting, the bit. This made it possible to compare quantitatively and exactly the cost of the technical transmission of information in its various forms (sounds, characters, images), and to determine the efficiency of codes and the capacity of information storage media and transmission channels.
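As a sketch of the principle (a modern illustration, not the Bell Labs implementation), pulse code modulation samples a continuous signal at regular intervals and quantizes each sample to one of a fixed number of levels, yielding a discrete stream of bits:

    import math

    def pcm_encode(signal, sample_rate, duration, levels):
        """Sample a continuous signal and quantize each sample to `levels` steps."""
        n_samples = int(sample_rate * duration)
        codes = []
        for i in range(n_samples):
            t = i / sample_rate
            x = signal(t)                              # sample value in [-1.0, 1.0]
            q = round((x + 1.0) / 2.0 * (levels - 1))  # map to integer 0..levels-1
            codes.append(q)
        return codes

    # A 1 kHz sine tone, sampled at 8 kHz with 256 levels (8 bits per sample).
    tone = lambda t: math.sin(2 * math.pi * 1000 * t)
    codes = pcm_encode(tone, sample_rate=8000, duration=0.001, levels=256)
    bits = "".join(format(c, "08b") for c in codes)    # the discrete bit stream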

The definition of the bit is a theoretical expression of the new technical possibility of transforming different forms of representation of messages (information) into a common representation advantageous for technical purposes: a sequence of electrical impulses that can be expressed by a binary code. This is ultimately the foundation for information technology on a digital basis, as well as for multimedia. In principle, this was known with information theory. In practice, however, the digital transformation of information technology only became possible later, coupled with the rapid development of microelectronics in the second half of the 20th century.

Shannon himself described his work as a "mathematical theory of communication". It explicitly excludes semantic and pragmatic aspects of information, that is, statements about the "content" of transmitted messages and their significance for the receiver. This means that a "meaningful" message is transmitted just as conscientiously as a random sequence of letters. Although Shannon's theory is commonly referred to as "information theory", it therefore makes no direct statement about the information content of transmitted messages.

More recently, there have been increasing attempts to determine the complexity of a message not only through statistical analysis of the data, but rather by considering the algorithms that can generate the data. Such approaches include, in particular, Kolmogorov complexity and algorithmic depth, as well as the algorithmic information theory of Gregory Chaitin. Classical concepts of information partially fail in quantum mechanical systems; this leads to the concept of quantum information.

Information theory provides mathematical methods for measuring certain properties of data. The concept of information in information theory has no direct relation to semantics, meaning, and knowledge, since these properties cannot be measured with information-theoretic methods.
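One such measurable property is the empirical entropy of a message, estimated here from symbol frequencies (an illustrative estimator; the article itself prescribes no particular method):

    import math
    from collections import Counter

    def empirical_entropy(message):
        """Shannon entropy in bits per symbol, estimated from symbol frequencies."""
        counts = Counter(message)
        n = len(message)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    # Equally likely symbols give maximal entropy; a repetitive message gives low entropy.
    print(empirical_entropy("abcdabcd"))  # 2.0 bits/symbol (4 equally likely symbols)
    print(empirical_entropy("aaaaaaab"))  # ~0.54 bits/symbol

Note that this measure depends only on symbol statistics: a meaningful sentence and a random string with the same letter frequencies receive the same value, which is exactly the limitation described above.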
