
Delta method

From Wikipedia, the free encyclopedia

The delta method is a technique for approximating the variance of a function of a statistical estimator.

A consistent estimator converges in probability to its true value: if B is an estimator for β using n observations, then

\operatorname{plim}_{n \to \infty} B = \beta.

Using the first two terms of the Taylor series (using vector notation for the gradient), we can estimate h(B), for a differentiable function h, as

h(B) \approx h(\beta) + \nabla h(\beta)^\top (B - \beta).

Therefore,

h(B) - h(\beta) \approx \nabla h(\beta)^\top (B - \beta),

and the variance is

\operatorname{Var}\bigl(h(B)\bigr) \approx \operatorname{E}\Bigl[\bigl(h(B) - h(\beta)\bigr)^{2}\Bigr] = \nabla h(\beta)^\top\, \operatorname{E}\bigl[(B - \beta)(B - \beta)^\top\bigr]\, \nabla h(\beta).

Therefore, since B − β converges to 0 (so the mean squared error of B approaches its variance),

\operatorname{Var}\bigl(h(B)\bigr) \approx \nabla h(\beta)^\top\, \operatorname{Var}(B)\, \nabla h(\beta),

or in univariate terms,

\operatorname{Var}\bigl(h(B)\bigr) \approx \bigl(h'(\beta)\bigr)^{2}\, \operatorname{Var}(B).
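
For instance, if B is the sample mean \bar{X} of n independent observations with mean \mu and variance \sigma^{2} (illustrative notation, not from the text above) and h(x) = x^{2}, then h'(\mu) = 2\mu, \operatorname{Var}(\bar{X}) = \sigma^{2}/n, and the univariate formula gives

\operatorname{Var}\bigl(\bar{X}^{2}\bigr) \approx (2\mu)^{2}\,\frac{\sigma^{2}}{n} = \frac{4\mu^{2}\sigma^{2}}{n}.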

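This approximation is easy to check by simulation. The following Python sketch (a minimal illustration; the choice h(x) = x², the Gaussian data, and all parameter values are assumptions of this example, not part of the article) compares the Monte Carlo variance of h(B) with the delta-method value:

    import numpy as np

    # Minimal sketch: B is the sample mean of n draws from N(mu, sigma^2),
    # h(x) = x**2, so h'(mu) = 2*mu and the delta-method variance of h(B)
    # is (2*mu)**2 * sigma**2 / n. All names and values are illustrative.
    rng = np.random.default_rng(0)
    mu, sigma, n, reps = 3.0, 2.0, 500, 20_000

    samples = rng.normal(mu, sigma, size=(reps, n))  # reps independent samples
    B = samples.mean(axis=1)                         # reps draws of the estimator

    empirical = np.var(B ** 2)                 # Monte Carlo Var(h(B))
    delta = (2 * mu) ** 2 * sigma ** 2 / n     # delta-method approximation

    print(f"empirical Var(h(B)) = {empirical:.4f}")
    print(f"delta method        = {delta:.4f}")

For these assumed values both numbers come out near 0.29, agreeing to within Monte Carlo error.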