Normalizing constant

The normalizing constant is a concept from Bayesian probability. In Bayes' theorem we have:

P(H_0 | D) = \frac{P(D | H_0)\, P(H_0)}{P(D)}

where P(H_0) is the prior probability that the hypothesis is true; P(D|H_0) is the likelihood of the data given that the hypothesis is true; and P(H_0|D) is the posterior probability that the hypothesis is true given the data. P(D) is the probability of producing the data, but on its own it is difficult to calculate, so an alternative way to describe this relationship is as one of proportionality:

P(H_0 | D) \propto P(D | H_0)\, P(H_0).
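
As a purely illustrative example (the numbers here are hypothetical and not from the original article), suppose P(H_0) = 0.3 and P(D | H_0) = 0.5. Then the unnormalized posterior weight is

P(D | H_0)\, P(H_0) = 0.5 \times 0.3 = 0.15,

which measures how plausible H_0 is after seeing the data but is not yet a probability; it still has to be divided by the normalizing constant derived below.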

Since P(H_i|D) is a probability for each hypothesis H_i, and the hypotheses are mutually exclusive and exhaustive, the posterior probabilities must sum to 1 over all of them, leading to the conclusion that

P(H_0 | D) = \frac{P(D | H_0)\, P(H_0)}{\sum_i P(D | H_i)\, P(H_i)}.

In this case, the value

P(D) = \sum_i P(D | H_i)\, P(H_i)

is the normalizing constant. The idea extends from countably many hypotheses to uncountably many by replacing the sum with an integral.
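
A minimal sketch of this normalization over a finite set of hypotheses, written in Python; the priors and likelihoods below reuse the hypothetical numbers from the example above (with two further hypotheses added) and are not part of the original article:

```python
# Hypothetical priors P(H_i) and likelihoods P(D | H_i) for three
# mutually exclusive, exhaustive hypotheses (illustrative values only).
priors = [0.3, 0.5, 0.2]
likelihoods = [0.5, 0.1, 0.2]

# Unnormalized posterior weights P(D | H_i) * P(H_i).
unnormalized = [l * p for l, p in zip(likelihoods, priors)]

# The normalizing constant P(D) = sum_i P(D | H_i) * P(H_i).
normalizing_constant = sum(unnormalized)

# Posterior probabilities P(H_i | D); these now sum to 1.
posteriors = [w / normalizing_constant for w in unnormalized]

print(normalizing_constant)  # ≈ 0.24
print(posteriors)            # ≈ [0.625, 0.208, 0.167]
```

For uncountably many hypotheses, the sum computed in normalizing_constant would be replaced by an integral of the likelihood against a prior density.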