Gauss–Markov theorem

From Wikipedia, the free encyclopedia

This article is not about Gauss–Markov processes.

In statistics, the Gauss–Markov theorem states that in a linear model in which the errors have expectation zero and are uncorrelated and homoscedastic, the best linear unbiased estimators of the coefficients are the least-squares estimators. The errors are not assumed to be normally distributed, nor independent (only uncorrelated, a weaker condition), nor identically distributed (only homoscedastic, a weaker condition).
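
An equivalent matrix formulation is often used; the following is an illustrative sketch of that standard statement rather than a quotation from this revision, and it assumes the design matrix X has full column rank:

\[
\mathbf{y} = X\boldsymbol{\beta} + \boldsymbol{\varepsilon},
\qquad \operatorname{E}[\boldsymbol{\varepsilon}] = \mathbf{0},
\qquad \operatorname{Var}(\boldsymbol{\varepsilon}) = \sigma^{2} I,
\qquad
\hat{\boldsymbol{\beta}} = (X^{\top} X)^{-1} X^{\top} \mathbf{y},
\]
\[
\operatorname{Var}(\tilde{\boldsymbol{\beta}}) - \operatorname{Var}(\hat{\boldsymbol{\beta}})
\ \text{is positive semi-definite for every linear unbiased estimator } \tilde{\boldsymbol{\beta}}.
\]

In words: every linear combination of the coefficients is estimated with variance at least as small by least squares as by any other linear unbiased estimator.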

More explicitly and concretely, suppose we have

Yi = β0 + β1 xi + εi

for i = 1, ..., n, where β0 and β1 are non-random but unobservable parameters, the xi are non-random and observable, the εi are random, and consequently the Yi are random. (Lower-case x is used because the xi are not random; capital Y is used because the Yi are random.) The random variables εi are called the "errors". The Gauss–Markov assumptions state that

E(εi) = 0 and Var(εi) = σ² < ∞

(i.e., all errors have expectation zero and the same variance; equality of variances is "homoscedasticity"), and

Cov(εi, εj) = 0

for i ≠ j; that is "uncorrelatedness".
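
The following short Python simulation is an illustrative sketch, not part of the original article. It generates data from the simple model above with uncorrelated, homoscedastic, and deliberately non-normal errors, then compares the least-squares slope estimator with another linear unbiased estimator of β1 (the two-point slope through the first and last observations). The design points, coefficients, and error scale are arbitrary choices for the demonstration.

import numpy as np

rng = np.random.default_rng(0)

# Fixed (non-random) design points x_i and illustrative true coefficients.
x = np.linspace(0.0, 10.0, 21)
beta0, beta1 = 2.0, 0.5
sigma = 1.5
n_rep = 100_000

# Errors: uncorrelated and homoscedastic, but deliberately NOT normal
# (centred uniform scaled so that Var(eps_i) = sigma**2), to stress that
# normality is not assumed by the theorem.
eps = rng.uniform(-1.0, 1.0, size=(n_rep, x.size)) * np.sqrt(3.0) * sigma
Y = beta0 + beta1 * x + eps            # one simulated sample per row

# Least-squares slope estimator, in closed form for the simple linear model:
# sum_i (x_i - xbar)(Y_i - Ybar) / sum_i (x_i - xbar)^2.
xc = x - x.mean()
beta1_ols = (Y - Y.mean(axis=1, keepdims=True)) @ xc / (xc @ xc)

# A competing linear unbiased estimator of the slope: the two-point slope
# through the first and last observations.
beta1_alt = (Y[:, -1] - Y[:, 0]) / (x[-1] - x[0])

print("mean of least-squares slope:", beta1_ols.mean())
print("mean of two-point slope:    ", beta1_alt.mean())
print("var  of least-squares slope:", beta1_ols.var())
print("var  of two-point slope:    ", beta1_alt.var())

Both printed means should land near the true slope 0.5 (both estimators are linear and unbiased), while the least-squares variance should come out several times smaller than the two-point variance; that smaller variance is what "best" means in the theorem.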