Continuity in probability

In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its values converge in probability whenever the corresponding points of the index set converge.

Definition

Let X = (X_t)_{t \in T} be a stochastic process in \R^n.

The process X is continuous in probability when X_r converges in probability to X_s whenever r converges to s.
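
Written out with the usual \varepsilon-criterion for convergence in probability (a routine restatement of the definition above, not an additional assumption), this means that for every s \in T and every \varepsilon > 0,

\lim_{r \to s} P\big( |X_r - X_s| > \varepsilon \big) = 0,

where |\cdot| denotes the Euclidean norm on \R^n.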

Examples and applications

Feller processes are continuous in probability at t=0. Continuity in probability is sometimes used as one of the defining properties of a Lévy process. Any process that is continuous in probability and has independent increments has a version that is càdlàg. As a result, some authors define a Lévy process directly as being càdlàg and having independent increments.
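
A standard concrete illustration (added here as an example; it is not drawn from the cited references) is the Poisson process N = (N_t)_{t \ge 0} with rate \lambda > 0. Its sample paths are step functions and hence not continuous, but for every t \ge 0 and every \varepsilon \in (0,1),

P\big( |N_{t+h} - N_t| > \varepsilon \big) = P\big( |N_{t+h} - N_t| \ge 1 \big) = 1 - e^{-\lambda |h|} \to 0 \quad \text{as } h \to 0,

so the process is continuous in probability even though it is nowhere almost surely continuous in the pathwise sense at its jump times.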

References

{{cite book |last1=Kallenberg |first1=Olav |author-link1=Olav Kallenberg |year=2002 |title=Foundations of Modern Probability |location=New York |publisher=Springer |edition=2nd |pages=286}}

{{cite web|title=Lectures on Lévy processes and Stochastic calculus, Braunschweig; Lecture 2: Lévy processes|url=http://www.applebaum.staff.shef.ac.uk/Brauns2notes.pdf|author=Applebaum, D.|pages=37–53|publisher=University of Sheffield}}

{{cite book |last1=Kallenberg |first1=Olav |author-link1=Olav Kallenberg |year=2002 |title=Foundations of Modern Probability |location=New York |publisher=Springer |edition=2nd |pages=290}}

Category:Stochastic processes