Talk:Information theory and measure theory

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Mct mht (talk | contribs) at 02:49, 21 June 2006 (Possible misstatement). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Possible misstatement

I am uncomfortable with the phrase 'we find that Shannon's "measure" of information content satisfies all the postulates and basic properties of a formal measure over sets.' This may not be quite correct, as it is a signed measure, as explained below in the article. How should it be better worded? --130.94.162.64 21:18, 20 June 2006 (UTC)[reply]

A signed measure is still a measure, so if that's the only objection, it should be OK. On the other hand, the section title suggests that entropy itself is a measure, and that doesn't seem right. Mct mht 02:49, 21 June 2006 (UTC)[reply]
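The reason the word "signed" matters here can be shown concretely. In the set-theoretic analogy, the triple mutual information I(X;Y;Z) plays the role of the measure of a triple intersection, and it can be negative. A minimal sketch (the XOR construction is a standard illustration, not taken from the article under discussion):

```python
from itertools import product
from math import log2

# Joint distribution of (X, Y, Z) with X, Y independent fair bits
# and Z = X XOR Y. This is a standard example in which the triple
# mutual information I(X;Y;Z) is negative, which is why Shannon's
# set-theoretic "measure" must be a *signed* measure.
p = {}
for x, y in product([0, 1], repeat=2):
    p[(x, y, x ^ y)] = 0.25

def H(indices):
    """Joint entropy (in bits) of the coordinates listed in `indices`."""
    marg = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in indices)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

# Inclusion-exclusion form of the triple mutual information:
# I(X;Y;Z) = H(X) + H(Y) + H(Z)
#          - H(X,Y) - H(X,Z) - H(Y,Z)
#          + H(X,Y,Z)
I3 = (H([0]) + H([1]) + H([2])
      - H([0, 1]) - H([0, 2]) - H([1, 2])
      + H([0, 1, 2]))
print(I3)  # -1.0 bit: the "intersection" has negative measure
```

Here each single entropy is 1 bit and each pairwise and triple joint entropy is 2 bits, so the inclusion-exclusion sum comes out to 3 - 6 + 2 = -1 bit.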

Kullback–Leibler divergence

Also, the Kullback–Leibler divergence should be explained here in a proper measure-theoretic framework. --130.94.162.64 21:27, 20 June 2006 (UTC)[reply]
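For reference, the usual measure-theoretic formulation (a standard definition, sketched here as a starting point rather than proposed article text) expresses the divergence through the Radon–Nikodym derivative:

```latex
% For probability measures P and Q on a measurable space (\Omega, \mathcal{F})
% with P absolutely continuous with respect to Q:
D_{\mathrm{KL}}(P \,\|\, Q)
  = \int_{\Omega} \log \frac{dP}{dQ} \, dP
  = \int_{\Omega} \frac{dP}{dQ} \log \frac{dP}{dQ} \, dQ ,
% where dP/dQ is the Radon--Nikodym derivative of P with respect to Q.
% If P is not absolutely continuous with respect to Q, the divergence
% is conventionally taken to be +\infty.
```

This subsumes both the discrete case (sums over atoms) and the continuous case (densities with respect to Lebesgue measure) as instances of one definition.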