Conditional entropy
The conditional entropy is an entropy measure used in information theory. The conditional entropy measures how much entropy a random variable Y has remaining if we have already learned the value of a second random variable X. It is referred to as the entropy of Y conditional on X, and is written H(Y|X). Like other entropies, the conditional entropy is measured in bits.
Given random variables X and Y with entropies H(X) and H(Y), and with a joint entropy H(X,Y), the conditional entropy of Y given X is defined as H(Y|X) = H(X,Y) − H(X). Intuitively, the combined system contains H(X,Y) bits of information. If we learn the value of X, we have gained H(X) bits of information, and the system has H(Y|X) bits remaining.
H(Y|X) = 0 if and only if the value of Y is completely determined by the value of X. Conversely, H(Y|X) = H(Y) if and only if Y and X are independent random variables.
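The definition and both extreme cases above can be checked numerically. The following is a minimal sketch (the function names and the dict-based representation of a joint distribution are illustrative choices, not from the original text) that computes H(Y|X) = H(X,Y) − H(X) and demonstrates the fully-determined and independent cases:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X), with the joint distribution given as a
    dict mapping (x, y) pairs to probabilities."""
    h_xy = entropy(joint.values())
    # Marginal distribution of X: sum the joint probabilities over y.
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return h_xy - entropy(px.values())

# Y is completely determined by X (here Y = X): H(Y|X) = 0.
determined = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(determined))   # 0.0

# X and Y are independent fair coin flips: H(Y|X) = H(Y) = 1 bit.
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(conditional_entropy(independent))  # 1.0
```

For the independent coins, learning X tells us nothing about Y, so the full bit of entropy in Y remains; in the determined case, learning X removes all uncertainty about Y.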
In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.