Talk:Information theory and measure theory
Possible misstatement
I am uncomfortable with the phrase 'we find that Shannon's "measure" of information content satisfies all the postulates and basic properties of a formal measure over sets.' This may not be quite correct, since what is actually constructed is a signed measure, as explained later in the article. How should it be worded better? --130.94.162.64 21:18, 20 June 2006 (UTC)
- A signed measure is still a measure, so if that's the only objection, it should be OK. On the other hand, the section title suggests that entropy is a measure, which doesn't seem right. Mct mht 02:49, 21 June 2006 (UTC)
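- For concreteness, a sketch of the intended correspondence (assuming the sets Ã and B̃ the article associates with X and Y, with μ the set function in question):

\mu(\tilde A) = H(X), \qquad \mu(\tilde B) = H(Y), \qquad \mu(\tilde A \cup \tilde B) = H(X,Y),
\mu(\tilde A \cap \tilde B) = I(X;Y), \qquad \mu(\tilde A \setminus \tilde B) = H(X \mid Y).

With three variables, inclusion–exclusion forces \mu(\tilde A \cap \tilde B \cap \tilde C) = I(X;Y) - I(X;Y \mid Z), which is -1 bit when X and Y are independent fair bits and Z = X XOR Y; so in general μ can only be a signed measure, and entropy itself is just the value of μ on one set, not a measure.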
- In the same vein, I think the article confuses "measure" in the sense of information theory with "measure" in the sense of real analysis in a few places. Mct mht 03:02, 21 June 2006 (UTC)
Some language in the section is not clear:
- If we associate the existence of sets Ã and B̃ with arbitrary discrete random variables X and Y, somehow representing the information borne by X and Y, respectively, such that: μ(Ã ∩ B̃) = 0 whenever X and Y are independent, and...
Associate sets with random variables how? Are they the supports of the random variables? What's the σ-algebra? What's meant by two random variables being independent? Mct mht 03:14, 21 June 2006 (UTC)
- Just pretend that those sets exist. They are not the supports of the random variables. The σ-algebra is the one generated by the operations of set union and intersection on those sets. See statistical independence. --130.94.162.64 03:34, 21 June 2006 (UTC)
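- To spell out the independence point under that identification (a sketch using the standard information-measure correspondence, not anything stated in the article itself): the intersection carries the mutual information,

\mu(\tilde A \cap \tilde B) = I(X;Y) = H(X) + H(Y) - H(X,Y),

and for discrete random variables I(X;Y) = 0 exactly when X and Y are statistically independent, so requiring μ(Ã ∩ B̃) = 0 for independent X and Y is just the measure-theoretic restatement of that fact.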
Kullback–Leibler divergence
Also, the Kullback–Leibler divergence should be explained here in a proper measure-theoretic framework. --130.94.162.64 21:27, 20 June 2006 (UTC)
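- For reference, the standard measure-theoretic formulation (a sketch of what such an explanation might state): for probability measures P and Q on a common measurable space, with P absolutely continuous with respect to Q,

D_{\mathrm{KL}}(P \,\|\, Q) = \int \log \frac{\mathrm{d}P}{\mathrm{d}Q} \, \mathrm{d}P,

where dP/dQ is the Radon–Nikodym derivative, and the divergence is taken to be +∞ when P is not absolutely continuous with respect to Q. The familiar discrete and continuous formulas then follow by expressing dP/dQ as a ratio of densities with respect to counting measure or Lebesgue measure, respectively.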