The delta method is a method for approximating the variance (and the asymptotic distribution) of a function of a statistical estimator.
A consistent estimator converges in probability to its true value, and many such estimators are also asymptotically normal: if B is such an estimator for β based on n observations, then
\sqrt{n}\left(B-\beta\right) \rightarrow N\left(0,\operatorname{Var}(B)\right)
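As a concrete illustration (an example added here, not part of the source derivation): if B is the sample mean of n independent observations with mean β and variance σ², the central limit theorem gives exactly this form, with σ² playing the role of the limiting variance Var(B):

\sqrt{n}\left(\bar{X}_{n}-\beta\right) \rightarrow N\left(0,\sigma^{2}\right)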
Using the first two terms of the Taylor series of h about β (with vector notation for the gradient), we can approximate h(B) as
h(B) \approx h(\beta) + \nabla h(\beta)^{T}\cdot (B-\beta)
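For instance (a univariate illustration, with h chosen here purely for concreteness), taking h(x) = x² the first-order expansion about β reads:

h(B) \approx \beta^{2} + 2\beta\left(B-\beta\right)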
Therefore,
h(B) - h(\beta) \approx \nabla h(\beta)^{T}\cdot (B-\beta)
and the variance is
\operatorname{Var}\left(h(B)-h(\beta)\right) \approx \operatorname{Var}\left(\nabla h(\beta)^{T}\cdot (B-\beta)\right)
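Because ∇h(β) is a fixed (non-random) vector, the right-hand side can be written out using the standard identity for the variance of a linear combination; this intermediate step, spelled out here for clarity, is what produces the quadratic form in the limit below:

\operatorname{Var}\left(\nabla h(\beta)^{T}\cdot (B-\beta)\right) = \nabla h(\beta)^{T}\cdot \operatorname{Var}(B)\cdot \nabla h(\beta)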
Therefore, since B − β converges in probability to 0 (so the remainder of the Taylor expansion vanishes asymptotically and ∇h(B) converges to ∇h(β)),
\sqrt{n}\left(h(B)-h(\beta)\right) \rightarrow N\left(0,\nabla h(B)^{T}\cdot \operatorname{Var}(B)\cdot \nabla h(B)\right)
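The following numerical sketch (not from the article; the function h, the distributions, and all names below are assumptions made here for illustration) applies this multivariate result to the ratio of two sample means, h(m1, m2) = m1/m2, and compares the plug-in delta-method variance with a Monte Carlo estimate:

# Hypothetical sketch of the multivariate delta method applied to the ratio of
# two sample means, h(m1, m2) = m1 / m2.  The function h, the distributions and
# all names below are illustrative assumptions, not taken from the article.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000        # observations per replication
reps = 2_000     # Monte Carlo replications

mu = np.array([2.0, 5.0])        # true means (beta in the article's notation)
cov = np.array([[1.0, 0.3],
                [0.3, 2.0]])     # true covariance of a single observation

def h(m):
    return m[0] / m[1]

def grad_h(m):
    # gradient of m1 / m2 evaluated at m
    return np.array([1.0 / m[1], -m[0] / m[1] ** 2])

# Plug-in delta-method variance of h(B), where B is the vector of sample means:
# grad_h(B)^T  Cov(B)  grad_h(B), with Cov(B) estimated by (sample cov) / n.
x = rng.multivariate_normal(mu, cov, size=n)
B = x.mean(axis=0)
g = grad_h(B)
delta_var = g @ (np.cov(x, rowvar=False) / n) @ g

# Monte Carlo check: empirical variance of h(B) over many replications.
hs = [h(rng.multivariate_normal(mu, cov, size=n).mean(axis=0)) for _ in range(reps)]
print("delta-method variance:", delta_var)
print("Monte Carlo variance: ", np.var(hs))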
or in univariate terms,
\sqrt{n}\left(h(B)-h(\beta)\right) \rightarrow N\left(0,\operatorname{Var}(B)\cdot \left(h'(B)\right)^{2}\right)
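A similar hedged sketch for the univariate statement, with h(B) = log(B) and B the sample mean of exponential data (again an assumed example, not taken from the article); rescaling the limit by n gives Var(h(B)) ≈ Var(B)·h′(B)², with the finite-sample Var(B) estimated by s²/n:

# Hypothetical univariate check: h(B) = log(B), with B the sample mean of
# exponential data of true mean beta = 2.  Names and distributions are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)
beta, n, reps = 2.0, 5_000, 2_000

# Delta-method variance of h(B): Var(B) * h'(B)^2, with the finite-sample
# Var(B) estimated by s^2 / n and h'(B) = 1 / B for h = log.
x = rng.exponential(scale=beta, size=n)
B = x.mean()
delta_var = (x.var(ddof=1) / n) * (1.0 / B) ** 2

# Monte Carlo variance of h(B) for comparison.
hs = [np.log(rng.exponential(scale=beta, size=n).mean()) for _ in range(reps)]
print("delta-method variance:", delta_var)
print("Monte Carlo variance: ", np.var(hs))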