From Wikipedia, the free encyclopedia
The biconjugate gradient method is an algorithm to solve systems of linear equations Ax = b. Unlike the conjugate gradient method, it does not require the matrix A to be self-adjoint, but it does require multiplication by the conjugate transpose A^*.
Biconjugate gradient algorithm
The regular (invertible) preconditioner M may be chosen arbitrarily; M^{-1} = 1, i.e. no preconditioning, is frequently used.
Choose initial guesses x_0, y_0 and let

    r_0 ← b − A x_0,   s_0 ← c − A^* y_0;
    d_0 ← M^{-1} r_0,   f_0 ← M^{-*} s_0;
for k = 0, 1, … do

    α_k ← (s_k^* M^{-1} r_k) / (f_k^* A d_k);
    x_{k+1} ← x_k + α_k d_k,   ( y_{k+1} ← y_k + α̅_k f_k );
    r_{k+1} ← r_k − α_k A d_k,   s_{k+1} ← s_k − α̅_k A^* f_k;
    β_k ← (s_{k+1}^* M^{-1} r_{k+1}) / (s_k^* M^{-1} r_k);
    d_{k+1} ← M^{-1} r_{k+1} + β_k d_k,   f_{k+1} ← M^{-*} s_{k+1} + β̅_k f_k.
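As a concrete illustration, the iteration above can be sketched in Python with NumPy for the unpreconditioned real case M = I (so M^{-1} and M^{-*} are identities and the complex conjugates on α_k and β_k are trivial). The function name `bicg`, the tolerances, and the test system are illustrative choices, and the sketch includes no safeguard against breakdown (a vanishing denominator in α_k or β_k), which can occur for general matrices:

```python
import numpy as np

def bicg(A, b, c, x0, y0, tol=1e-10, max_iter=100):
    """Sketch of the biconjugate gradient iteration with M = I (real A).

    Approximately solves A x = b and the dual problem A^T y = c.
    For complex A, the updates of y, s and f would use the complex
    conjugates of alpha and beta, as in the formulas above.
    """
    x = np.asarray(x0, dtype=float).copy()
    y = np.asarray(y0, dtype=float).copy()
    r = b - A @ x              # r_0 = b - A x_0
    s = c - A.T @ y            # s_0 = c - A^* y_0
    d, f = r.copy(), s.copy()  # d_0 = r_0, f_0 = s_0 since M = I
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol or np.linalg.norm(s) < tol:
            break
        rho = s @ r                   # s_k^* r_k
        alpha = rho / (f @ (A @ d))   # alpha_k
        x = x + alpha * d             # x_{k+1}
        y = y + alpha * f             # y_{k+1}
        r = r - alpha * (A @ d)       # r_{k+1}
        s = s - alpha * (A.T @ f)     # s_{k+1}
        beta = (s @ r) / rho          # beta_k
        d = r + beta * d              # d_{k+1}
        f = s + beta * f              # f_{k+1}
    return x, y

# Illustrative usage: a small nonsymmetric, diagonally dominant system.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)
b, c = rng.standard_normal(5), rng.standard_normal(5)
x, y = bicg(A, b, c, np.zeros(5), np.zeros(5))
```

Note that each iteration costs one product with A and one with A^T, which is the price paid for dropping the symmetry requirement of the conjugate gradient method.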
The algorithm is stopped once r_n = 0 or s_n = 0; then x_n is the required solution of the system Ax = b (and y_n solves the dual problem A^* y = c). In finite-dimensional spaces the algorithm always terminates, since one can prove that n is at most the dimension of the space.
The sequences produced by the algorithm are biorthogonal:

    f_i^* A d_j = 0   and   s_i^* M^{-1} r_j = 0   for i ≠ j.
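These relations can be checked numerically. The following self-contained sketch (again M = I, a real illustrative test matrix) runs the iteration while storing the direction vectors d_k, f_k, then forms the Gram-like matrix of products f_i^T A d_j, whose off-diagonal entries should vanish up to rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # nonsymmetric test matrix
b, c = rng.standard_normal(n), rng.standard_normal(n)

# BiCG iteration with M = I, storing the direction vectors d_k, f_k.
x, y = np.zeros(n), np.zeros(n)
r, s = b - A @ x, c - A.T @ y
d, f = r.copy(), s.copy()
ds, fs = [d], [f]
for _ in range(n - 1):
    rho = s @ r
    alpha = rho / (f @ (A @ d))
    x = x + alpha * d
    y = y + alpha * f
    r = r - alpha * (A @ d)
    s = s - alpha * (A.T @ f)
    beta = (s @ r) / rho
    d = r + beta * d
    f = s + beta * f
    ds.append(d)
    fs.append(f)

# G[i, j] = f_i^* A d_j; only the diagonal should be significant.
G = np.array([[fi @ (A @ dj) for dj in ds] for fi in fs])
scale = np.abs(np.diag(G)).max()
offdiag = G - np.diag(np.diag(G))
```

In exact arithmetic the off-diagonal entries are exactly zero; in floating point they are merely tiny relative to the diagonal, and this loss of biorthogonality grows with the iteration count.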
If A = A^* is self-adjoint, y_0 = x_0, and c = b, then r_k = s_k, d_k = f_k, and x_k = y_k, so the sequences are the same as those produced by the conjugate gradient method.
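This reduction can be observed numerically. In the sketch below (M = I, an illustrative symmetric positive definite test matrix, dual right-hand side c = b, and y_0 = x_0 = 0), the primal and dual quantities are updated by literally identical formulas, so the two sequences coincide and the iteration performs exactly the conjugate gradient updates:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n))
A = B + B.T + 2 * n * np.eye(n)   # symmetric positive definite
b = rng.standard_normal(n)

# BiCG with M = I, c = b and y_0 = x_0 = 0: since A^T = A, the dual
# sequence (y, s, f) mirrors the primal one (x, r, d) at every step.
x, y = np.zeros(n), np.zeros(n)
r, s = b - A @ x, b - A.T @ y
d, f = r.copy(), s.copy()
for _ in range(n):
    rho = s @ r
    alpha = rho / (f @ (A @ d))   # alpha is real here
    x = x + alpha * d
    y = y + alpha * f
    r = r - alpha * (A @ d)
    s = s - alpha * (A.T @ f)
    beta = (s @ r) / rho
    d = r + beta * d
    f = s + beta * f
```

After the loop, x and y (and likewise r and s, d and f) agree, and n conjugate-gradient steps have essentially solved the n-dimensional system.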