
Generalized inverse Gaussian distribution

Generalized inverse Gaussian
Probability density function
[Figure: probability density plots of GIG distributions for several parameter choices]
Parameters: $a > 0$, $b > 0$, $p$ real
Support: $x > 0$
PDF: $f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{p-1} e^{-(ax + b/x)/2}$
Mean: $\frac{\sqrt{b}\, K_{p+1}(\sqrt{ab})}{\sqrt{a}\, K_p(\sqrt{ab})}$
Mode: $\frac{(p-1) + \sqrt{(p-1)^2 + ab}}{a}$
Variance: $\frac{b}{a}\left[\frac{K_{p+2}(\sqrt{ab})}{K_p(\sqrt{ab})} - \left(\frac{K_{p+1}(\sqrt{ab})}{K_p(\sqrt{ab})}\right)^2\right]$
MGF: $\left(\frac{a}{a-2t}\right)^{p/2} \frac{K_p(\sqrt{b(a-2t)})}{K_p(\sqrt{ab})}$
CF: $\left(\frac{a}{a-2it}\right)^{p/2} \frac{K_p(\sqrt{b(a-2it)})}{K_p(\sqrt{ab})}$

In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

$$f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})}\, x^{p-1} e^{-(ax + b/x)/2}, \qquad x > 0,$$

where $K_p$ is a modified Bessel function of the second kind, $a > 0$, $b > 0$, and $p$ is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, etc. This distribution was first proposed by Étienne Halphen.[1][2][3] It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. It is also known as the Sichel distribution, after Herbert Sichel.[4] Its statistical properties are discussed in Bent Jørgensen's lecture notes.[5]
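For concreteness, here is a minimal Python sketch (not part of the original article) that evaluates this density directly via scipy.special.kv and checks it against SciPy's built-in geninvgauss distribution (available in SciPy >= 1.4), whose (p, b) parameterization relates to the (a, b, p) form above through b_scipy = √(ab) and scale = √(b/a):

```python
import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

def gig_pdf(x, a, b, p):
    """GIG density: (a/b)^(p/2) / (2 K_p(sqrt(ab))) * x^(p-1) * exp(-(a*x + b/x)/2)."""
    omega = np.sqrt(a * b)
    return (a / b) ** (p / 2) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2) / (2 * kv(p, omega))

a, b, p = 2.0, 3.0, 0.5
x = np.linspace(0.1, 4.0, 100)

# SciPy's geninvgauss uses pdf(y; p, b) proportional to y^(p-1) exp(-b*(y + 1/y)/2);
# rescaling by sqrt(b/a) recovers the (a, b, p) parameterization used here.
assert np.allclose(gig_pdf(x, a, b, p),
                   geninvgauss.pdf(x, p, np.sqrt(a * b), scale=np.sqrt(b / a)))
```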

Properties

Summation

Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.[6]

Entropy

The entropy of the generalized inverse Gaussian distribution is given as[citation needed]

$$H = \frac{1}{2} \log\left(\frac{b}{a}\right) + \log\left(2 K_p(\sqrt{ab})\right) - (p-1)\, \frac{\left[\frac{d}{d\nu} K_\nu(\sqrt{ab})\right]_{\nu=p}}{K_p(\sqrt{ab})} + \frac{\sqrt{ab}}{2 K_p(\sqrt{ab})} \left(K_{p+1}(\sqrt{ab}) + K_{p-1}(\sqrt{ab})\right)$$

where $\left[\frac{d}{d\nu} K_\nu(\sqrt{ab})\right]_{\nu=p}$ is the derivative of the modified Bessel function of the second kind with respect to the order $\nu$, evaluated at $\nu = p$.
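As an illustration (not from the article), the formula can be cross-checked numerically in Python: the order derivative of $K_\nu$ is approximated by a central finite difference, and the result is compared with direct numerical integration of $-f \log f$:

```python
import numpy as np
from scipy.special import kv
from scipy.integrate import quad

a, b, p = 2.0, 3.0, 0.7
omega = np.sqrt(a * b)

# Central finite difference for [d/dnu K_nu(omega)] evaluated at nu = p.
h = 1e-6
dK_dnu = (kv(p + h, omega) - kv(p - h, omega)) / (2 * h)

H_formula = (0.5 * np.log(b / a)
             + np.log(2 * kv(p, omega))
             - (p - 1) * dK_dnu / kv(p, omega)
             + omega * (kv(p + 1, omega) + kv(p - 1, omega)) / (2 * kv(p, omega)))

def neg_f_log_f(x):
    # Work with the log-density so the integrand stays stable in the tails.
    log_f = ((p / 2) * np.log(a / b) - np.log(2 * kv(p, omega))
             + (p - 1) * np.log(x) - (a * x + b / x) / 2)
    return -np.exp(log_f) * log_f

H_numeric, _ = quad(neg_f_log_f, 0, np.inf)
assert abs(H_formula - H_numeric) < 1e-5
```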

Differential equation

The pdf of the generalized inverse Gaussian distribution is a solution to the following differential equation:

$$2x^2 f'(x) + f(x)\left(a x^2 - 2(p-1)x - b\right) = 0$$
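A quick symbolic check (illustrative only) with SymPy confirms that the unnormalized density $x^{p-1} e^{-(ax + b/x)/2}$ satisfies this equation; the normalizing constant cancels and so does not affect the ODE:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
a, b, p = sp.symbols('a b p', positive=True)

f = x**(p - 1) * sp.exp(-(a * x + b / x) / 2)  # unnormalized GIG density

# Substitute f into the ODE and check that the residual vanishes identically.
residual = 2 * x**2 * sp.diff(f, x) + f * (a * x**2 - 2 * (p - 1) * x - b)
assert sp.simplify(residual) == 0
```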

Special cases

The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for p = -1/2 and b = 0, respectively.[7] Specifically, an inverse Gaussian distribution of the form

$$f(x; \mu, \lambda) = \left(\frac{\lambda}{2\pi x^3}\right)^{1/2} \exp\!\left(-\frac{\lambda (x-\mu)^2}{2\mu^2 x}\right)$$

is a GIG with $a = \lambda/\mu^2$, $b = \lambda$, and $p = -1/2$. A Gamma distribution of the form

$$g(x; \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}$$

is a GIG with $a = 2\beta$, $b = 0$, and $p = \alpha$.

Other special cases include the inverse-gamma distribution, for a=0, and the hyperbolic distribution, for p=0.[7]
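The following numerical sketch (the helper gig_pdf and the example values are illustrative, not from the article) verifies both special cases against SciPy's parameterizations, treating the gamma case as the b → 0 limit:

```python
import numpy as np
from scipy.special import kv
from scipy.stats import invgauss, gamma

def gig_pdf(x, a, b, p):
    omega = np.sqrt(a * b)
    return (a / b) ** (p / 2) * x ** (p - 1) * np.exp(-(a * x + b / x) / 2) / (2 * kv(p, omega))

x = np.linspace(0.05, 5.0, 200)

# Inverse Gaussian IG(mu, lam) corresponds to GIG(a = lam/mu^2, b = lam, p = -1/2);
# SciPy's invgauss(mu/lam, scale=lam) has mean mu and shape lam.
mu, lam = 1.3, 2.0
assert np.allclose(invgauss.pdf(x, mu / lam, scale=lam),
                   gig_pdf(x, lam / mu**2, lam, -0.5))

# Gamma(alpha, rate beta) is the b -> 0 limit: GIG(a = 2*beta, b = 0, p = alpha).
alpha, beta = 2.5, 1.7
assert np.allclose(gamma.pdf(x, alpha, scale=1 / beta),
                   gig_pdf(x, 2 * beta, 1e-12, alpha))
```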

Conjugate prior for Gaussian

The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.[8][9] Let the prior distribution for some hidden variable, say $z$, be GIG:

$$P(z \mid a, b, p) = \operatorname{GIG}(z \mid a, b, p)$$

and let there be $T$ observed data points, $X = x_1, \ldots, x_T$, with normal likelihood function, conditioned on $z$:

$$P(X \mid z, \alpha, \beta) = \prod_{i=1}^{T} N(x_i \mid \alpha + \beta z, z)$$

where $N(x \mid \mu, v)$ is the normal distribution, with mean $\mu$ and variance $v$. Then the posterior for $z$, given the data, is also GIG:

$$P(z \mid X, a, b, p, \alpha, \beta) = \operatorname{GIG}\!\left(z \mid a + T\beta^2,\; b + S,\; p - \frac{T}{2}\right)$$

where $S = \sum_{i=1}^{T} (x_i - \alpha)^2$.[note 1]
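A small sketch of this update in Python (the function name gig_posterior and the example values are illustrative, not from the article):

```python
import numpy as np

def gig_posterior(a, b, p, data, alpha, beta):
    """Map GIG prior parameters to GIG posterior parameters for the latent
    variable z in the normal variance-mean mixture N(alpha + beta*z, z)."""
    data = np.asarray(data)
    T = len(data)
    S = np.sum((data - alpha) ** 2)
    return a + T * beta**2, b + S, p - T / 2

# Example: prior GIG(a=1, b=1, p=1) updated with three observations.
a_post, b_post, p_post = gig_posterior(1.0, 1.0, 1.0,
                                       data=[0.3, -0.1, 0.5], alpha=0.0, beta=0.2)
```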

Notes

  1. ^ Due to the conjugacy, these details can be derived without solving integrals, by noting that
    $P(z \mid X, a, b, p, \alpha, \beta) \propto P(X \mid z, \alpha, \beta)\, P(z \mid a, b, p)$.
    Omitting all factors independent of $z$, the right-hand side can be simplified to give an unnormalized GIG distribution, from which the posterior parameters can be identified.


References

  1. ^ Seshadri, V. (1997). "Halphen's laws". In Kotz, S.; Read, C. B.; Banks, D. L. (eds.). Encyclopedia of Statistical Sciences, Update Volume 1. New York: Wiley. pp. 302–306.
  2. ^ Perreault, L.; Bobée, B.; Rasmussen, P. F. (1999). "Halphen Distribution System. I: Mathematical and Statistical Properties". Journal of Hydrologic Engineering. 4 (3): 189. doi:10.1061/(ASCE)1084-0699(1999)4:3(189).
  3. ^ Étienne Halphen was the uncle of the mathematician Georges Henri Halphen.
  4. ^ Sichel, H. S. (1973). "Statistical valuation of diamondiferous deposits". Journal of the South African Institute of Mining and Metallurgy.
  5. ^ Jørgensen, Bent (1982). Statistical Properties of the Generalized Inverse Gaussian Distribution. Lecture Notes in Statistics. Vol. 9. New York–Berlin: Springer-Verlag. ISBN 0-387-90665-7. MR 0648107.
  6. ^ Barndorff-Nielsen, O.; Halgreen, Christian (1977). "Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions". Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete.
  7. ^ a b Johnson, Norman L.; Kotz, Samuel; Balakrishnan, N. (1994), Continuous univariate distributions. Vol. 1, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics (2nd ed.), New York: John Wiley & Sons, pp. 284–285, ISBN 978-0-471-58495-7, MR 1299979
  8. ^ Karlis, Dimitris (2002). "An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution". Statistics & Probability Letters. 57: 43–52.
  9. ^ Barndorff-Nielsen, O. E. (1997). "Normal Inverse Gaussian Distributions and stochastic volatility modelling". Scandinavian Journal of Statistics. 24: 1–13.
