Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem establishes that however contaminated with noise a communication channel may be, it is possible to communicate digital data (information) nearly error-free up to a given maximum rate through the channel. This surprising result, sometimes called the fundamental theorem of information theory, or just Shannon's theorem, was first presented by Claude Shannon in 1948.

The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel, for a particular noise level.
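
For a binary symmetric channel, which flips each transmitted bit independently with probability p, this limit works out to C = 1 − H2(p) bits per channel use, where H2 is the binary entropy function given in the mathematical statement below. A minimal Python sketch of this standard computation (the function names are illustrative, not from any particular library):

    import math

    def binary_entropy(p: float) -> float:
        # H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p: float) -> float:
        # Shannon capacity of a binary symmetric channel, in bits per channel use
        return 1.0 - binary_entropy(p)

    print(bsc_capacity(0.1))  # ~0.531 bits/use: the Shannon limit at 10% bit-flip noise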

Summary

Proved by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The theorem does not describe how to construct such an error-correcting method; it only tells us how good the best possible method can be. Shannon's theorem has wide-ranging applications in both communications and data storage, and is of foundational importance to the modern field of information theory.

The Shannon theorem states that, given a noisy channel with channel capacity C and information transmitted at a rate R, then if

    R < C

there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information nearly error-free at any rate below the limiting rate C.
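
The theorem guarantees that good codes exist but does not construct them. The trade-off it improves on can be seen in the naive repetition code, which drives the error probability down only by letting the rate 1/n fall toward zero, whereas Shannon's result promises codes whose rate stays fixed below C. A small Monte Carlo sketch in Python, assuming a binary symmetric channel with bit-flip probability p (names illustrative):

    import random

    def repetition_block_error(n, p, trials=100_000):
        # Each bit is sent n times over a BSC(p); the receiver majority-votes.
        # Returns the estimated probability the vote is wrong (n should be odd).
        errors = 0
        for _ in range(trials):
            flips = sum(random.random() < p for _ in range(n))
            if flips > n // 2:  # a majority of the n copies were corrupted
                errors += 1
        return errors / trials

    for n in (1, 3, 5, 7, 9):
        print(f"n={n}  rate={1/n:.3f}  block error ~ {repetition_block_error(n, 0.1):.4f}")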

The converse is also important. If

    R > C

an arbitrarily small probability of error is not achievable: every code has a probability of error at the receiver that is bounded away from zero, and this bound grows as the transmission rate is increased. No useful information can be transmitted beyond the channel capacity.
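
As a worked illustration, using the rate R(p_b) defined in the mathematical statement below: for a binary symmetric channel with bit-flip probability 0.1,

    C = 1 - H_2(0.1) \approx 0.531,

so attempting to transmit at R = 0.6 > C forces a residual bit-error probability p_b satisfying

    H_2(p_b) \ge 1 - \frac{C}{R} \approx 0.115, \qquad \text{i.e.}\; p_b \gtrsim 0.015.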

Mathematical statement

Theorem (Shannon, 1948):

1. For every discrete memoryless channel, the channel capacity

    C = \max_{p_X} I(X;Y)

has the following property. For any ε > 0 and R < C, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε.
2. If a probability of bit error p_b is acceptable, rates up to R(p_b) are achievable, where

    R(p_b) = \frac{C}{1 - H_2(p_b)}

and H_2(p_b) is the binary entropy function

    H_2(p_b) = -p_b \log_2 p_b - (1 - p_b) \log_2 (1 - p_b).
3. For any p_b, rates greater than R(p_b) are not achievable.

(MacKay (2003), p. 162; cf. Gallager (1968), ch. 5; Cover and Thomas (1991), p. 198)
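
The maximization in part 1 can be carried out numerically for any discrete memoryless channel using the classical Blahut–Arimoto alternating-maximization algorithm. The Python sketch below (an illustration, not taken from the cited texts) estimates C from a channel transition matrix and also evaluates R(p_b) from part 2:

    import numpy as np

    def blahut_arimoto(W, iters=500):
        # Capacity (bits/use) of a DMC with W[x, y] = P(y | x),
        # by alternating maximization over the input distribution r(x).
        m = W.shape[0]
        r = np.full(m, 1.0 / m)                      # start from the uniform input
        for _ in range(iters):
            q = r[:, None] * W                       # proportional to P(x, y)
            q /= q.sum(axis=0, keepdims=True)        # posterior q(x | y)
            with np.errstate(divide="ignore"):
                log_q = np.where(W > 0, np.log(q), 0.0)  # 0*log 0 treated as 0
            r = np.exp((W * log_q).sum(axis=1))      # r(x) ∝ exp(Σ_y W(x,y) log q(x|y))
            r /= r.sum()
        p_y = r @ W                                  # output distribution
        ratio = np.where(W > 0, W / p_y, 1.0)
        return float((r[:, None] * W * np.log2(ratio)).sum())  # I(X;Y) in bits

    def achievable_rate(C, pb):
        # R(pb) = C / (1 - H2(pb)) from part 2 of the theorem, for 0 < pb < 1/2
        h2 = -pb * np.log2(pb) - (1 - pb) * np.log2(1 - pb)
        return C / (1 - h2)

    W = np.array([[0.9, 0.1],
                  [0.1, 0.9]])                       # binary symmetric channel, p = 0.1
    C = blahut_arimoto(W)
    print(C)                                         # ~0.531, matching 1 - H2(0.1)
    print(achievable_rate(C, 0.01))                  # rate achievable at 1% bit error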

References

Cover, Thomas M.; Thomas, Joy A. (1991). Elements of Information Theory. New York: Wiley. ISBN 0-471-06259-6.
Gallager, Robert G. (1968). Information Theory and Reliable Communication. New York: Wiley.
MacKay, David J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press. ISBN 0-521-64298-1.

See also

Shannon–Hartley theorem