In mathematics, the Milstein method is a technique for the approximate numerical solution of a stochastic differential equation. It is named after Grigori N. Milstein, who first published the method in 1974.[1][2]
Consider the Itō stochastic differential equation
\mathrm{d}X_{t} = a(X_{t})\,\mathrm{d}t + b(X_{t})\,\mathrm{d}W_{t},
with initial condition X_0 = x_0, where W_t stands for the Wiener process, and suppose that we wish to solve this SDE on some interval of time [0, T]. Then the Milstein approximation to the true solution X is the Markov chain Y defined as follows:
partition the interval [0, T] into N equal subintervals of width \Delta t > 0:
0 = \tau_{0} < \tau_{1} < \dots < \tau_{N} = T \quad \text{with } \tau_{n} := n\,\Delta t \text{ and } \Delta t = \frac{T}{N};
set Y_{0} = x_{0};
recursively define Y_{n+1} for 0 \leq n \leq N - 1 by
Y_{n+1} = Y_{n} + a(Y_{n})\,\Delta t + b(Y_{n})\,\Delta W_{n} + \frac{1}{2} b(Y_{n}) b'(Y_{n}) \left((\Delta W_{n})^{2} - \Delta t\right),
where \Delta W_{n} = W_{\tau_{n+1}} - W_{\tau_{n}} and b' denotes the derivative of b(x) with respect to x. Note that the random variables \Delta W_{n} are independent and identically distributed normal random variables with expected value zero and variance \Delta t.
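The scheme translates directly into code. Below is a minimal Python sketch of the recursion above for a scalar SDE; the function name milstein and the geometric Brownian motion coefficients in the example are illustrative choices, not part of the article.

```python
import numpy as np

def milstein(a, b, db, x0, T, N, rng=None):
    """Simulate one path of dX_t = a(X_t) dt + b(X_t) dW_t on [0, T]
    with the Milstein scheme on N equal steps; db is b'(x)."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / N
    # Brownian increments: i.i.d. normal with mean 0 and variance dt
    dW = rng.normal(0.0, np.sqrt(dt), size=N)
    Y = np.empty(N + 1)
    Y[0] = x0
    for n in range(N):
        Y[n + 1] = (Y[n]
                    + a(Y[n]) * dt
                    + b(Y[n]) * dW[n]
                    + 0.5 * b(Y[n]) * db(Y[n]) * (dW[n] ** 2 - dt))
    return Y

# Example: geometric Brownian motion dX_t = mu*X_t dt + sigma*X_t dW_t
mu, sigma = 0.1, 0.4
path = milstein(a=lambda x: mu * x,
                b=lambda x: sigma * x,
                db=lambda x: sigma,   # b'(x) = sigma
                x0=1.0, T=1.0, N=1000)
```

When b(x) has no convenient closed-form derivative, b'(Y_n) is often replaced by a finite-difference estimate, which is the idea behind derivative-free variants of the scheme.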
Intuitive derivation
Using Itō's formula, we can rewrite the geometric Brownian motion SDE
\mathrm{d}X_{t} = \mu X_{t}\,\mathrm{d}t + \sigma X_{t}\,\mathrm{d}W_{t}
as
\mathrm{d}\ln X_{t} = \left(\mu(X_{t}) - \frac{1}{2}\sigma(X_{t})^{2}\right)\mathrm{d}t + \sigma(X_{t})\,\mathrm{d}W_{t}.
Thus, the solution to the GBM SDE is
\begin{aligned}
X_{t+\Delta t} &= X_{t}\exp\left\{\int_{t}^{t+\Delta t}\left(\mu(X_{u}) - \frac{1}{2}\sigma(X_{u})^{2}\right)\mathrm{d}u + \int_{t}^{t+\Delta t}\sigma(X_{u})\,\mathrm{d}W_{u}\right\} \\
&\approx X_{t}\left(1 + \mu(X_{t})\,\Delta t - \frac{1}{2}\sigma(X_{t})^{2}\,\Delta t + \sigma(X_{t})\,\Delta W_{t} + \frac{1}{2}\sigma(X_{t})^{2}\,(\Delta W_{t})^{2}\right) \\
&= X_{t} + a(X_{t})\,\Delta t + b(X_{t})\,\Delta W_{t} + \frac{1}{2}\,\frac{b(X_{t})^{2}}{X_{t}}\left((\Delta W_{t})^{2} - \Delta t\right) \\
&\approx X_{t} + a(X_{t})\,\Delta t + b(X_{t})\,\Delta W_{t} + \frac{1}{2}\,b(X_{t})\,b'(X_{t})\left((\Delta W_{t})^{2} - \Delta t\right),
\end{aligned}
where a(X_{t}) = \mu X_{t} and b(X_{t}) = \sigma X_{t}, so that b(X_{t})\,b'(X_{t}) = \sigma^{2} X_{t} = b(X_{t})^{2}/X_{t}, which recovers the Milstein scheme.
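As a quick sanity check on this derivation, the short script below (an illustrative sketch, not part of the article) advances the Milstein recursion for GBM and compares the endpoint with the exact solution X_T = x_0 \exp((\mu - \tfrac{1}{2}\sigma^{2})T + \sigma W_T) driven by the same Brownian increments.

```python
import numpy as np

# Milstein approximation of GBM, dX_t = mu*X_t dt + sigma*X_t dW_t,
# compared with the exact solution along the same Brownian path.
mu, sigma, x0, T, N = 0.1, 0.4, 1.0, 1.0, 1000
dt = T / N
rng = np.random.default_rng(0)
dW = rng.normal(0.0, np.sqrt(dt), size=N)

Y = x0
for n in range(N):
    a, b, db = mu * Y, sigma * Y, sigma   # drift, diffusion, and b'(Y) = sigma
    Y += a * dt + b * dW[n] + 0.5 * b * db * (dW[n] ** 2 - dt)

X_exact = x0 * np.exp((mu - 0.5 * sigma ** 2) * T + sigma * dW.sum())
print(f"Milstein: {Y:.6f}  exact: {X_exact:.6f}  abs. error: {abs(Y - X_exact):.2e}")
```

Halving \Delta t should roughly halve this error, reflecting the strong convergence of order 1 of the Milstein scheme, compared with order 0.5 for the Euler–Maruyama method.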
References
Kloeden, P. E.; Platen, E. (1999). Numerical Solution of Stochastic Differential Equations. Berlin: Springer. ISBN 3-540-54062-8.