Kolmogorov's two-series theorem

From Wikipedia, the free encyclopedia
In probability theory, Kolmogorov's two-series theorem is a result about the convergence of random series. It follows from Kolmogorov's inequality and is used in one proof of the strong law of large numbers.

Statement of the theorem

Let $(X_n)_{n \in \mathbb{N}}$ be independent random variables with expected values $\mathbb{E}[X_n] = \mu_n$ and variances $\operatorname{Var}(X_n) = \sigma_n^2$, such that $\sum_{n=1}^{\infty} \mu_n$ converges in $\mathbb{R}$ and $\sum_{n=1}^{\infty} \sigma_n^2$ converges in $\mathbb{R}$. Then $\sum_{n=1}^{\infty} X_n$ converges in $\mathbb{R}$ almost surely.
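As a hypothetical numerical illustration of the theorem (not part of the article), consider $X_n = \varepsilon_n / n$ with independent fair $\pm 1$ signs $\varepsilon_n$: the means are all zero and $\sum \operatorname{Var}(X_n) = \sum 1/n^2 < \infty$, so the theorem predicts almost-sure convergence of the series. A minimal sketch:

```python
import random

# Hypothetical example (not from the article): X_n = eps_n / n with
# independent fair +/-1 signs eps_n. Then E[X_n] = 0, so the series of
# means converges trivially, and sum Var(X_n) = sum 1/n^2 converges,
# so the two-series theorem predicts sum X_n converges almost surely.
def partial_sums(n_terms, seed=0):
    """Return the partial sums S_1, ..., S_{n_terms} of sum eps_n / n."""
    rng = random.Random(seed)
    s, sums = 0.0, []
    for n in range(1, n_terms + 1):
        s += rng.choice((-1.0, 1.0)) / n
        sums.append(s)
    return sums

sums = partial_sums(100_000)
# Cauchy-like behaviour: beyond N = 50,000 the partial sums barely
# move, consistent with almost-sure convergence of the series.
tail_spread = max(sums[50_000:]) - min(sums[50_000:])
print(f"spread of S_N over N > 50000: {tail_spread:.5f}")
```

For this example the tail sum $\sum_{n > 50000} X_n$ has variance roughly $1/50000$, so the observed spread of the late partial sums is small.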

Proof

Assume WLOG $\mu_n = 0$ (otherwise replace $X_n$ by $X_n - \mu_n$). Set $S_N = \sum_{n=1}^{N} X_n$, and we will see that $\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N = 0$ with probability 1.

For every $M \in \mathbb{N}$,

$$\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \leq 2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{M+i} \right|.$$
Thus, for every $M \in \mathbb{N}$ and $\delta > 0$,

$$\begin{aligned}
\mathbb{P}\left(\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \delta\right)
&\leq \mathbb{P}\left(2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{M+i} \right| \geq \delta\right) \\
&= \lim_{K \to \infty} \mathbb{P}\left(2 \max_{1 \leq k \leq K} \left| \sum_{i=1}^{k} X_{M+i} \right| \geq \delta\right) \\
&\leq \lim_{K \to \infty} \frac{4}{\delta^2} \sum_{i=M+1}^{M+K} \sigma_i^2
= \frac{4}{\delta^2} \sum_{i=M+1}^{\infty} \sigma_i^2,
\end{aligned}$$

where the second inequality is due to Kolmogorov's inequality.
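For reference, Kolmogorov's inequality, as invoked here, states that for independent random variables $Y_1, \dots, Y_K$ with $\mathbb{E}[Y_i] = 0$ and finite variances, and any $\lambda > 0$,

```latex
\mathbb{P}\left( \max_{1 \leq k \leq K} \left| \sum_{i=1}^{k} Y_i \right| \geq \lambda \right)
\leq \frac{1}{\lambda^2} \sum_{i=1}^{K} \operatorname{Var}(Y_i).
```

Applying it with $Y_i = X_{M+i}$ and $\lambda = \delta/2$ yields the factor $4/\delta^2$ above.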

By the assumption that $\sum_{n=1}^{\infty} \sigma_n^2$ converges, the last term tends to $0$ as $M \to \infty$, for every arbitrary $\delta > 0$. Hence $\mathbb{P}\left(\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \delta\right) = 0$ for every $\delta > 0$, so $\limsup_{N \to \infty} S_N = \liminf_{N \to \infty} S_N$ almost surely, and therefore $S_N$ converges almost surely.
