Continuity in probability

From Wikipedia, the free encyclopedia

In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its distributions converge whenever the values in the index set converge.[1][2]

Definition

Let $X = (X_t)_{t \in T}$ be a stochastic process in $\mathbb{R}$. The process $X$ is continuous in probability when $X_r$ converges in probability to $X_s$ whenever $r$ converges to $s$.[2]
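
Written out with the usual characterization of convergence in probability, the definition reads as follows; this is a standard restatement rather than a quotation from the cited source, with $|\cdot|$ denoting the absolute value on $\mathbb{R}$:

% Continuity in probability of X = (X_t)_{t \in T} at s \in T:
% for every tolerance \varepsilon > 0, the probability of a deviation
% of X_r from X_s larger than \varepsilon vanishes as r approaches s.
\[
  \forall \varepsilon > 0: \qquad
  \lim_{r \to s} \mathbb{P}\bigl( |X_r - X_s| > \varepsilon \bigr) = 0 .
\]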

Examples and applications

Feller processes are continuous in probability at $t = 0$. Continuity in probability is sometimes used as one of the defining properties of a Lévy process.[1] Any process that is continuous in probability and has independent increments has a version that is càdlàg.[2] As a result, some authors define a Lévy process directly as being càdlàg and having independent increments.[3]
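
As an illustration (a sketch not drawn from the cited references), consider a Poisson process $N = (N_t)_{t \ge 0}$ with rate $\lambda > 0$. Its sample paths jump, yet it is continuous in probability: for any $\varepsilon > 0$ and $h > 0$, the increment $N_{t+h} - N_t$ is a non-negative integer, so

% A deviation larger than \varepsilon > 0 forces at least one jump in (t, t+h],
% and the probability of that event vanishes as h \to 0.
\[
  \mathbb{P}\bigl( |N_{t+h} - N_t| > \varepsilon \bigr)
  \le \mathbb{P}\bigl( N_{t+h} - N_t \ge 1 \bigr)
  = 1 - e^{-\lambda h} \longrightarrow 0
  \quad \text{as } h \to 0.
\]

Hence $N_{t+h} \to N_t$ in probability for every fixed $t$, even though $N$ is not pathwise continuous; at any fixed time the probability of a jump is zero, which is exactly what stochastic continuity captures.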

References

  1. Applebaum, D. "Lectures on Lévy processes and Stochastic calculus, Braunschweig; Lecture 2: Lévy processes" (PDF). University of Sheffield. pp. 37–53.
  2. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 286.
  3. Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 290.