
Conditional entropy


The conditional entropy is an entropy measure used in information theory. The conditional entropy measures how much entropy a random variable $Y$ has remaining if we have already completely learned the value of a second random variable $X$. It is referred to as the entropy of $Y$ conditional on $X$, and is written $H(Y|X)$. Like other entropies, the conditional entropy is measured in bits.

Given random variables $X$ and $Y$ with entropies $H(X)$ and $H(Y)$, and with a joint entropy $H(X,Y)$, the conditional entropy of $Y$ given $X$ is defined as $H(Y|X) = H(X,Y) - H(X)$. Intuitively, the combined system contains $H(X,Y)$ bits of information. If we learn the value of $X$, we have gained $H(X)$ bits of information, and the system has $H(X,Y) - H(X)$ bits remaining.
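
As an illustration of this definition (not part of the original article), the following is a minimal Python sketch that computes $H(Y|X) = H(X,Y) - H(X)$ from a finite joint distribution; the function name `conditional_entropy` and the dictionary representation of the joint distribution are assumptions of the example.

```python
import math
from collections import defaultdict

def conditional_entropy(joint):
    """Compute H(Y|X) = H(X,Y) - H(X), in bits, from a joint
    distribution given as a dict mapping (x, y) pairs to probabilities.
    (Illustrative sketch; name and representation are assumptions.)"""
    # Joint entropy H(X,Y)
    h_xy = -sum(p * math.log2(p) for p in joint.values() if p > 0)
    # Marginal distribution of X, obtained by summing over y
    p_x = defaultdict(float)
    for (x, _), p in joint.items():
        p_x[x] += p
    # Marginal entropy H(X)
    h_x = -sum(p * math.log2(p) for p in p_x.values() if p > 0)
    return h_xy - h_x

# Example joint distribution over (X, Y):
# H(X,Y) = 1.5 bits, H(X) ~ 0.811 bits, so H(Y|X) ~ 0.689 bits.
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}
print(conditional_entropy(joint))  # about 0.689
```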

$H(Y|X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $H(Y|X) = H(Y)$ if and only if $Y$ and $X$ are independent random variables.
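
These two extreme cases can be checked numerically with the hypothetical `conditional_entropy` sketch above:

```python
# Y completely determined by X (here Y = X): H(Y|X) = 0
determined = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(determined))  # 0.0

# X and Y independent fair coins: H(Y|X) = H(Y) = 1 bit
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(independent))  # 1.0
```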

In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.