Source coding theorem

In information theory, the source coding theorem states that N i.i.d. random variables, each with entropy H(X), can be compressed into slightly more than N H(X) bits with negligible risk of information loss as N tends to infinity; conversely, if they are compressed into fewer than N H(X) bits, it is virtually certain that information will be lost (MacKay 2003; Shannon 1948).
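
A sketch of the standard formal statement behind this summary, written in the usual textbook notation (the symbols p, ε, and the typical-set argument below are assumptions, not part of this stub): for i.i.d. X_1, ..., X_N drawn from p(x) with entropy H(X), and for any ε > 0,

\[
\Pr\!\left[\,\left| -\tfrac{1}{N}\log_2 p(X_1,\dots,X_N) - H(X) \right| > \varepsilon \,\right] \to 0 \quad \text{as } N \to \infty .
\]

Consequently, with high probability the outcome falls in a "typical set" of roughly 2^{N H(X)} sequences, which can be indexed using about N H(X) bits, while any code using fewer than N(H(X) - ε) bits must lose information with probability approaching 1.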