Covariance and correlation


The mathematical descriptions of covariance and correlation are very similar. Both describe the degree to which two random variables, or sets of random variables, vary together. There is a certain amount of disagreement as to the naming conventions used; the conventions in this article are those of Oppenheim & Schafer (1975). For two sets of random variates <math>x_i</math> and <math>y_j</math> we have:

<math>R_{xy}(i,j) = E(x_i\, y_j)</math>   correlation matrix
<math>C_{xy}(i,j) = E[(x_i - \mu_{x_i})(y_j - \mu_{y_j})]</math>   covariance matrix
<math>R_{xx}(i,j) = E(x_i\, x_j)</math>   autocorrelation matrix
<math>C_{xx}(i,j) = E[(x_i - \mu_{x_i})(x_j - \mu_{x_j})]</math>   autocovariance matrix
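As a numerical illustration (not part of the original article; the variable names, sample sizes and test data below are chosen purely for the example), the following Python/NumPy sketch estimates these four matrices from N independent samples of two random vectors:

  import numpy as np

  rng = np.random.default_rng(0)
  N, n, m = 10000, 3, 2                  # number of samples, sizes of the two sets of variates
  x = rng.normal(size=(N, n))            # N samples of the variates x_1 .. x_n
  y = rng.normal(loc=1.0, size=(N, m))   # N samples of the variates y_1 .. y_m

  # correlation matrix  R_xy(i,j) = E(x_i y_j), estimated by a sample average
  R_xy = x.T @ y / N

  # covariance matrix   C_xy(i,j) = E[(x_i - mu_x_i)(y_j - mu_y_j)]
  C_xy = (x - x.mean(axis=0)).T @ (y - y.mean(axis=0)) / N

  # autocorrelation and autocovariance matrices pair x with itself
  R_xx = x.T @ x / N
  C_xx = (x - x.mean(axis=0)).T @ (x - x.mean(axis=0)) / N

  print(np.round(C_xy, 3))   # near zero here, since x and y were drawn independently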

In the case of stationarity, the means are constant and the covariance and correlation are functions only of the difference between the indices:

<math>R_{xy}(k) = E(x_i\, y_{i+k})</math>   cross correlation
<math>C_{xy}(k) = E[(x_i - \mu_x)(y_{i+k} - \mu_y)]</math>   cross covariance
<math>R_{xx}(k) = E(x_i\, x_{i+k})</math>   autocorrelation
<math>C_{xx}(k) = E[(x_i - \mu_x)(x_{i+k} - \mu_x)]</math>   autocovariance
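In the same spirit, a short sketch (again with assumed test signals, not from the article) estimates these stationary quantities at a single lag k from one realization of two jointly stationary sequences:

  import numpy as np

  rng = np.random.default_rng(1)
  N, k = 100000, 5
  x = rng.normal(size=N)
  y = np.roll(x, k) + 0.1 * rng.normal(size=N)   # built so that y_{i+k} is roughly x_i

  def cross_correlation(x, y, k):
      # sample estimate of R_xy(k) = E(x_i y_{i+k})
      return np.mean(x[:-k] * y[k:]) if k > 0 else np.mean(x * y)

  def cross_covariance(x, y, k):
      # sample estimate of C_xy(k) = E[(x_i - mu_x)(y_{i+k} - mu_y)]
      return cross_correlation(x - x.mean(), y - y.mean(), k)

  print(cross_correlation(x, y, k))   # close to 1 by construction
  print(cross_covariance(x, x, k))    # autocovariance of white noise at lag 5, close to 0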

Each of these statistics may be normalized by dividing by the respective standard deviations. For example, the normalized cross correlation is written:

<math>\rho_{xy}(k) = \frac{C_{xy}(k)}{\sigma_x \sigma_y}</math>

where <math>\sigma_x</math> and <math>\sigma_y</math> are the standard deviations of the <math>x_i</math> and <math>y_i</math> respectively.
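A minimal sketch of this normalization step, assuming the same sample-average estimators as above (the helper function and test data are illustrative, not from the article):

  import numpy as np

  def normalized_cross_correlation(x, y, k):
      # sample estimate of rho_xy(k) = C_xy(k) / (sigma_x sigma_y); lies between -1 and 1
      x = np.asarray(x, dtype=float) - np.mean(x)
      y = np.asarray(y, dtype=float) - np.mean(y)
      c_xy = np.mean(x[:-k] * y[k:]) if k > 0 else np.mean(x * y)
      return c_xy / (np.std(x) * np.std(y))

  rng = np.random.default_rng(2)
  a = rng.normal(size=50000)
  b = np.roll(a, 3) + rng.normal(size=50000)       # b is a delayed copy of a plus independent noise
  print(normalized_cross_correlation(a, b, 3))     # roughly 1/sqrt(2), about 0.71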

These definitions are easily extended to the case of continuous random variables.
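For instance, for continuous random variables the expectations above become integrals over the joint probability density; one standard formulation (not spelled out in this revision) of the correlation of two continuous variates <math>x</math> and <math>y</math> with joint density <math>f(x,y)</math> is

<math>R_{xy} = E(x\,y) = \int\!\!\int x\, y\, f(x,y)\, dx\, dy ,</math>

with the covariance obtained in the same way after subtracting the means <math>\mu_x</math> and <math>\mu_y</math>.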

References

  • Oppenheim, Alan V.; Schafer, Ronald W. (1975). Digital Signal Processing. Prentice-Hall. ISBN 0132146355.