Source coding theorem
In information theory, the source coding theorem (Shannon 1948) informally states that:
- "N i.i.d. random variables each with entropy H(X) can be compressed into more than NH(X) bits with negligible risk of information loss, as N tends to infinity; but conversely, if they are compressed into fewer than NH(X) bits it is virtually certain that information will be lost." (MacKay 2003).
Devising coding strategies that successfully achieve this compression is the basis of the field of entropy coding.
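A minimal numerical illustration of the bound: the sketch below (an assumption for illustration, not part of the theorem's statement) draws N i.i.d. Bernoulli(0.2) symbols, computes their entropy H(X), and uses zlib as a stand-in for a practical entropy coder. The theorem says a lossless code needs more than N·H(X) bits on average, so the achieved rate should sit between H(X) and the raw 8 bits per symbol.

```python
import math
import random
import zlib

def entropy(p):
    """Shannon entropy H(X) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

random.seed(0)
p = 0.2                      # P(X = 1)
N = 100_000                  # number of i.i.d. samples
H = entropy(p)               # about 0.7219 bits per symbol

# Draw N i.i.d. Bernoulli(p) symbols, stored one byte each (8 bits raw).
samples = bytes(1 if random.random() < p else 0 for _ in range(N))

# A general-purpose lossless compressor (zlib here, standing in for an
# ideal entropy coder) cannot beat the N*H(X)-bit lower bound on average.
compressed_bits = 8 * len(zlib.compress(samples, 9))
bits_per_symbol = compressed_bits / N

print(f"H(X)          = {H:.4f} bits/symbol")
print(f"zlib achieves   {bits_per_symbol:.4f} bits/symbol")
```

In a typical run the zlib rate lands somewhat above H(X) but far below the 8 bits per symbol of the raw encoding; a purpose-built entropy coder (Huffman or arithmetic coding) would approach H(X) more closely as N grows.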