Channel capacity

The channel capacity is part of the information-theoretic description of a transmission channel. It specifies the highest bit rate at which information can still be transmitted over a channel without error.

Claude Shannon and Ralph Hartley showed, in the form of the Shannon–Hartley theorem, that the theoretical channel capacity can be approached arbitrarily closely by suitable coding.

Definition

The channel capacity C of a discrete memoryless channel is the supremum of the rates at which symbols can be transmitted correctly in a time interval. A distinction is made between the information content of the transmitted symbols X and that of the received symbols Y; the difference arises from the disturbances occurring on the transmission channel.

Mathematically, the mutual information I(X;Y) between transmitted and received symbols can be expressed for these two cases in terms of the entropy function H as:

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

The channel capacity C can then be defined as the maximum of the mutual information over all input distributions:

C = \max_{p(x)} I(X;Y)
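
As an illustration, the following is a minimal sketch (not part of the original text) of the Blahut–Arimoto algorithm, a standard numerical method for maximizing I(X;Y) over the input distribution of a discrete memoryless channel; the function names and parameters are chosen freely here.

```python
import numpy as np

def _divergence(W, p):
    """D(x) = sum_y W(y|x) * log2( W(y|x) / q(y) ) with output distribution q = W @ p."""
    q = W @ p
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(W > 0, np.log2(W / q[:, None]), 0.0)  # 0 * log 0 := 0
    return np.sum(W * log_ratio, axis=0)

def channel_capacity(W, iterations=200):
    """Capacity in bits of a discrete memoryless channel with
    transition matrix W[y, x] = P(Y = y | X = x)."""
    p = np.full(W.shape[1], 1.0 / W.shape[1])  # start from the uniform input distribution
    for _ in range(iterations):
        p = p * np.exp2(_divergence(W, p))     # multiplicative Blahut-Arimoto update
        p /= p.sum()
    return float(p @ _divergence(W, p))        # at convergence, C = sum_x p(x) D(x)

# Example: binary symmetric channel with error probability 0.1
W_bsc = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
print(channel_capacity(W_bsc))                 # approx. 0.531 = 1 - H(0.1)
```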

Types of transmission channels

The channel capacity is dependent on the type of transmission channel. In the following, the channel capacity is shown for important models of transmission channels.

Binary symmetric channel

The binary symmetric channel is a discrete-valued transmission channel; its maximum channel capacity is therefore limited to one bit. The channel can accept only two symbols (states), designated 0 and 1. In this channel model the channel capacity can thus only lie in the interval from 0 to 1.

The channel capacity of this binary symmetric channel, in English also known as the Binary Symmetric Channel (BSC), is:

C = 1 - H(p)

where H(p) is the binary entropy function

H(p) = -p \log_2 p - (1-p) \log_2 (1-p)

and p is the error probability.

For the extreme values of the error probability, p = 1 and p = 0, the capacity is one and thus maximal. In the first case the transmission channel acts as an inverter; in the case p = 0 the binary channel transmits without error. For p = 0.5 the channel capacity is C = 0, i.e. minimal. In this case the BSC becomes an ideal noise source and no information can be transmitted.
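
As a small illustration (not part of the original text), the formula C = 1 - H(p) can be evaluated directly at the values discussed above:

```python
import numpy as np

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel, in bits."""
    if p in (0.0, 1.0):
        return 1.0                     # H(0) = H(1) = 0 by convention
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)  # binary entropy H(p)
    return 1.0 - h

for p in (0.0, 0.1, 0.5, 1.0):
    print(p, bsc_capacity(p))          # 1.0, ~0.531, 0.0, 1.0
```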

Binary erasure channel

The binary erasure channel is likewise a discrete-valued transmission channel; its channel capacity is:

C = 1 - p

with erasure probability p. The channel capacity is maximal for p = 0, where no erasure occurs on this channel. It is minimal for p = 1, where all information is erased and no information transmission is possible.
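
The channel_capacity() sketch given above (together with its numpy import) can also reproduce this result; the transition matrix below is an illustrative assumption, with outputs 0, erasure, and 1, modeling a binary erasure channel with p = 0.25:

```python
W_bec = np.array([[0.75, 0.00],   # P(y = 0 | x)
                  [0.25, 0.25],   # P(y = erasure | x)
                  [0.00, 0.75]])  # P(y = 1 | x)
print(channel_capacity(W_bec))    # approx. 0.75 = 1 - p
```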

AWGN channel

The AWGN channel is a continuous-valued transmission channel whose disturbance is additive white Gaussian noise (AWGN). Because of the continuous-valued channel property, infinitely many symbols can occur, but as a matter of principle the disturbance prevents them from being reliably distinguished from one another. The resulting channel capacity C of the AWGN channel, in bits per channel use, is:

C = \frac{1}{2} \log_2 \left( 1 + \frac{S}{N} \right)

Here the expression S/N is the signal-to-noise ratio (SNR) between the information signal to be transmitted and the additive white Gaussian noise occurring on the channel. For maximum channel capacity the SNR must be maximized. In the limiting case of a channel with S/N → ∞, an infinite number of distinguishable symbols and thus an infinite channel capacity could be achieved; in practice, however, this limiting case cannot be reached, since sources of disturbance are always present.
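
As a brief sketch (the dB conversion and the example value are illustrative, not from the original text), the capacity formula can be evaluated for a given SNR:

```python
import numpy as np

def awgn_capacity(snr_linear):
    """Capacity in bit per channel use of the real-valued AWGN channel."""
    return 0.5 * np.log2(1.0 + snr_linear)

snr_db = 20.0                  # SNR is often given in dB; 20 dB means S/N = 100
snr = 10 ** (snr_db / 10)
print(awgn_capacity(snr))      # approx. 3.33 bit per channel use
```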

Demarcation

To arrive at a transmission rate, different symbols are transmitted over the channel in temporal succession. This temporal sequence of symbols results in a symbol rate which, spectrally, depends on the frequency and occupies a certain bandwidth. The transmission rate R is then obtained from the symbol rate and the number of bits of information conveyed per symbol. How many bits are transmitted per symbol depends directly on the channel capacity and thus on the particular channel.
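
A hypothetical worked example (the numbers are chosen purely for illustration): with a symbol rate f_S = 10^6 symbols per second and 3.33 bit of information per symbol, as in the AWGN example above, the transmission rate is

R = f_S \cdot 3.33\,\text{bit/symbol} = 3.33 \cdot 10^6\,\text{bit/s}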
