In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The auto-covariance matrix of a random vector $\mathbf{X}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{X}}$ or $\Sigma_{\mathbf{X}}$.
Definition
For random vectors $\mathbf{X}$ and $\mathbf{Y}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: p.336

$$\operatorname{K}_{\mathbf{X}\mathbf{Y}}=\operatorname{cov}(\mathbf{X},\mathbf{Y})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{X}-\mu_{\mathbf{X}})(\mathbf{Y}-\mu_{\mathbf{Y}})^{\rm T}]\qquad\text{(Eq.1)}$$
where $\mu_{\mathbf{X}}=\operatorname{E}[\mathbf{X}]$ and $\mu_{\mathbf{Y}}=\operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value. Any element of the cross-covariance matrix is itself a "cross-covariance".
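Replacing the expectation in Eq.1 with a sample average gives a straightforward estimator. The following NumPy sketch is a minimal illustration; the sample size, distributions, and variable names are assumptions made for the example, not part of the definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: X is 3-dimensional, Y is 2-dimensional and
# correlated with X; any joint distribution with finite variances works.
N = 100_000
X = rng.normal(size=(N, 3))
Y = X[:, :2] + rng.normal(size=(N, 2))

# Sample version of Eq.1: K_XY = E[(X - mu_X)(Y - mu_Y)^T].
Xc = X - X.mean(axis=0)          # center X
Yc = Y - Y.mean(axis=0)          # center Y
K_XY = Xc.T @ Yc / (N - 1)       # 3 x 2 cross-covariance estimate

print(K_XY.shape)                # (3, 2)
```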
Example
For example, if $\mathbf{X}=\left(X_{1},X_{2},X_{3}\right)^{\rm T}$ and $\mathbf{Y}=\left(Y_{1},Y_{2}\right)^{\rm T}$ are random vectors, then $\operatorname{cov}(\mathbf{X},\mathbf{Y})$ is a $3\times 2$ matrix whose $(i,j)$-th entry is $\operatorname{cov}(X_{i},Y_{j})$.
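One way to see this concretely: if $\mathbf{X}$ and $\mathbf{Y}$ are stacked into a single 5-dimensional vector, the sample cross-covariance is an off-diagonal block of the joint covariance matrix. A short NumPy sketch, with an arbitrary illustrative data-generating model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: rows are components, columns are observations.
N = 50_000
X = rng.normal(size=(3, N))                                   # X = (X1, X2, X3)^T
Y = np.vstack([X[0] + X[1], X[2]]) + rng.normal(size=(2, N))  # Y = (Y1, Y2)^T

# np.cov of the stacked 5-dimensional vector returns a 5x5 matrix;
# its upper-right 3x2 block is the sample cov(X, Y).
joint = np.cov(np.vstack([X, Y]))
K_XY = joint[:3, 3:]

# The (i, j) entry matches the scalar covariance cov(X_i, Y_j).
assert np.isclose(K_XY[2, 0], np.cov(X[2], Y[0])[0, 1])
print(K_XY)   # a 3 x 2 matrix
```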
Properties
For the cross-covariance matrix, the following basic properties apply:[2]
- $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{E}[\mathbf{X}\mathbf{Y}^{\rm T}]-\mu_{\mathbf{X}}\mu_{\mathbf{Y}}^{\rm T}$
- $\operatorname{cov}(\mathbf{X},\mathbf{Y})=\operatorname{cov}(\mathbf{Y},\mathbf{X})^{\rm T}$
- $\operatorname{cov}(\mathbf{X_{1}}+\mathbf{X_{2}},\mathbf{Y})=\operatorname{cov}(\mathbf{X_{1}},\mathbf{Y})+\operatorname{cov}(\mathbf{X_{2}},\mathbf{Y})$
- $\operatorname{cov}(A\mathbf{X}+\mathbf{a},B^{\rm T}\mathbf{Y}+\mathbf{b})=A\,\operatorname{cov}(\mathbf{X},\mathbf{Y})\,B$
- If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or somewhat less restrictedly, if every random variable in $\mathbf{X}$ is uncorrelated with every random variable in $\mathbf{Y}$), then $\operatorname{cov}(\mathbf{X},\mathbf{Y})=0_{p\times q}$

where $\mathbf{X}$, $\mathbf{X_{1}}$ and $\mathbf{X_{2}}$ are random $p\times 1$ vectors, $\mathbf{Y}$ is a random $q\times 1$ vector, $\mathbf{a}$ is a $q\times 1$ vector, $\mathbf{b}$ is a $p\times 1$ vector, $A$ and $B$ are $q\times p$ matrices of constants, and $0_{p\times q}$ is a $p\times q$ matrix of zeroes.
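The affine-transformation property holds exactly even for sample cross-covariance matrices, since centering removes the constant vectors $\mathbf{a}$ and $\mathbf{b}$ and the linear maps factor out of the sums. The NumPy sketch below checks this; the helper `cross_cov` and the randomly chosen constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

p, q, N = 3, 2, 10_000
X = rng.normal(size=(N, p))                                # rows are samples of X
Y = X @ rng.normal(size=(p, q)) + rng.normal(size=(N, q))  # correlated with X

def cross_cov(U, V):
    """Sample cross-covariance: rows of U and V are paired observations."""
    Uc, Vc = U - U.mean(axis=0), V - V.mean(axis=0)
    return Uc.T @ Vc / (len(U) - 1)

A = rng.normal(size=(q, p))   # A and B are q x p matrices of constants
B = rng.normal(size=(q, p))
a = rng.normal(size=q)        # a is q x 1
b = rng.normal(size=p)        # b is p x 1

# In this row-per-sample layout, AX + a becomes X @ A.T + a
# and B^T Y + b becomes Y @ B + b.
lhs = cross_cov(X @ A.T + a, Y @ B + b)
rhs = A @ cross_cov(X, Y) @ B
assert np.allclose(lhs, rhs)  # cov(AX + a, B^T Y + b) = A cov(X, Y) B
```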
Definition for complex random vectors
If $\mathbf{Z}$ and $\mathbf{W}$ are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},\mathbf{W})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mu_{\mathbf{Z}})(\mathbf{W}-\mu_{\mathbf{W}})^{\rm H}]$$
For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:
$$\operatorname{J}_{\mathbf{Z}\mathbf{W}}=\operatorname{cov}(\mathbf{Z},{\overline{\mathbf{W}}})\ {\stackrel{\mathrm{def}}{=}}\ \operatorname{E}[(\mathbf{Z}-\mu_{\mathbf{Z}})(\mathbf{W}-\mu_{\mathbf{W}})^{\rm T}]$$
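A short NumPy sketch computes sample versions of both matrices; the circularly symmetric Gaussian data are an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative assumption: Z is circularly symmetric complex Gaussian,
# W is a noisy copy of Z.
N = 100_000
Z = rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2))
W = Z + rng.normal(size=(N, 2)) + 1j * rng.normal(size=(N, 2))

Zc = Z - Z.mean(axis=0)
Wc = W - W.mean(axis=0)

# K_ZW conjugates the second factor (Hermitian transpose) ...
K_ZW = Zc.T @ Wc.conj() / (N - 1)
# ... while the pseudo-cross-covariance J_ZW uses the plain transpose.
J_ZW = Zc.T @ Wc / (N - 1)

print(np.round(K_ZW, 2))   # close to 2I for this example
print(np.round(J_ZW, 2))   # close to 0: circularly symmetric data
```

For circularly symmetric data like this the pseudo-cross-covariance vanishes, which is why $\operatorname{K}_{\mathbf{Z}\mathbf{W}}$ and $\operatorname{J}_{\mathbf{Z}\mathbf{W}}$ are both needed to capture the second-order behavior of complex vectors in general.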
Two random vectors $\mathbf{X}$ and $\mathbf{Y}$ are called uncorrelated if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero.[1]: p.337
Complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if $\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{J}_{\mathbf{Z}\mathbf{W}}=0$.
References