Information matrix test

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Bender235 (talk | contribs) at 18:53, 1 September 2017 (created a stub). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White,[1] who observed that in a correctly specified model and under standard regularity assumptions, the information matrix can be expressed in either of two ways: as the outer product of the gradient, or as a function of the Hessian matrix of the log-likelihood function.

Consider a linear model $y = X\beta + u$, where the errors $u$ are assumed to be distributed $N(0, \sigma^2 I)$. If the parameters $\beta$ and $\sigma^2$ are stacked in the vector $\theta = (\beta', \sigma^2)'$, the resulting log-likelihood function is

$$\ell(\theta) = -\frac{n}{2}\log\sigma^2 - \frac{1}{2\sigma^2}\left(y - X\beta\right)'\left(y - X\beta\right)$$
The information matrix can then be expressed as

$$I(\theta) = \operatorname{E}\left[\frac{\partial \ell(\theta)}{\partial \theta}\,\frac{\partial \ell(\theta)}{\partial \theta'}\right]$$
that is, the expected value of the outer product of the gradient, or score. Second, it can be written as the negative of the Hessian matrix of the log-likelihood function:

$$I(\theta) = -\operatorname{E}\left[\frac{\partial^2 \ell(\theta)}{\partial\theta\,\partial\theta'}\right]$$
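For the linear model above, the two forms can be checked directly. Writing $u = y - X\beta$, the score and Hessian blocks work out to (a worked expansion of the definitions above, not part of the original stub):

```latex
\frac{\partial \ell}{\partial \beta} = \frac{X'u}{\sigma^2},
\qquad
\frac{\partial \ell}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{u'u}{2\sigma^4}

\frac{\partial^2 \ell}{\partial\beta\,\partial\beta'} = -\frac{X'X}{\sigma^2},
\qquad
\frac{\partial^2 \ell}{\partial\beta\,\partial\sigma^2} = -\frac{X'u}{\sigma^4},
\qquad
\frac{\partial^2 \ell}{\partial(\sigma^2)^2} = \frac{n}{2\sigma^4} - \frac{u'u}{\sigma^6}
```

Under correct specification, $\operatorname{E}[X'u] = 0$ and $\operatorname{E}[u'u] = n\sigma^2$, so both forms yield the same block-diagonal information matrix with blocks $X'X/\sigma^2$ and $n/(2\sigma^4)$.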
If the model is correctly specified, both expressions should be equal. Combining the two equivalent forms yields

$$\Delta(\theta) = \operatorname{E}\left[\frac{\partial \ell(\theta)}{\partial \theta}\,\frac{\partial \ell(\theta)}{\partial \theta'}\right] + \operatorname{E}\left[\frac{\partial^2 \ell(\theta)}{\partial\theta\,\partial\theta'}\right]$$
where $\Delta(\theta)$ is an $r \times r$ random matrix, with $r$ the number of parameters. White showed that the elements of $\Delta(\hat\theta)$, where $\hat\theta$ is the maximum likelihood estimate, are asymptotically normally distributed with zero means when the model is correctly specified.[2]
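This asymptotic result can be illustrated numerically. The following sketch (not from the article; the simulated design and all variable names are illustrative) draws data from a correctly specified linear-normal model, evaluates the sample analogues of the two information-matrix forms at the MLE, and checks that their sum, the sample counterpart of $\Delta(\hat\theta)$, is close to zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5000, 2

# Correctly specified linear-normal model: y = X beta + u, u ~ N(0, 1)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# MLE for this model: OLS for beta, u'u/n for sigma^2
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta_hat
sigma2 = u @ u / n

# Per-observation scores s_i = d l_i / d theta, with theta = (beta, sigma^2)
s_beta = X * (u / sigma2)[:, None]                  # n x k block for beta
s_sig = -0.5 / sigma2 + u**2 / (2 * sigma2**2)      # length-n block for sigma^2
S = np.column_stack([s_beta, s_sig])                # n x (k+1) score matrix

# First form: sample average of the outer product of the score
OPG = S.T @ S / n

# Second form: sample average Hessian of the per-observation log-likelihood
H = np.zeros((k + 1, k + 1))
H[:k, :k] = -(X.T @ X) / (n * sigma2)
H[:k, k] = H[k, :k] = -(X.T @ u) / (n * sigma2**2)
H[k, k] = 0.5 / sigma2**2 - (u**2).mean() / sigma2**3

# Sample analogue of Delta(theta_hat): near zero under correct specification
Delta = OPG + H
print(np.round(Delta, 3))
```

Since the average Hessian estimates $\operatorname{E}[\partial^2\ell_i/\partial\theta\,\partial\theta']$ and the outer-product average estimates its negative, every element of `Delta` shrinks toward zero as $n$ grows; under misspecification (for example, heteroskedastic errors) some elements stay bounded away from zero.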

References

  1. ^ White, Halbert (1982). "Maximum Likelihood Estimation of Misspecified Models". Econometrica. 50 (1): 1–25. JSTOR 1912526.
  2. ^ Godfrey, L. G. (1988). Misspecification Tests in Econometrics. New York: Cambridge University Press. pp. 35–37. ISBN 0-521-26616-5.