Markov process, stationary
A Markov process which is a stationary stochastic process. There is a stationary Markov process associated with a homogeneous Markov transition function if and only if there is a stationary initial distribution $ \mu (A) $ corresponding to this function, that is, a $ \mu (A) $ satisfying

$$ \mu (A) = \int_X P(x, t, A) \, \mu (dx) . $$
If the phase space $ X $ is finite, then a stationary initial distribution always exists, regardless of whether the process has discrete $ (t = 0, 1, \dots) $ or continuous time. For a process in discrete time with a countable set $ X $, a condition for the existence of a stationary distribution was found by A.N. Kolmogorov [1]: It is necessary and sufficient that there is a class of communicating states $ Y \subset X $ such that the mathematical expectation of the time of reaching $ y_2 \in Y $ from $ y_1 \in Y $ is finite for any $ y_1, y_2 \in Y $. This criterion has been generalized to strong Markov processes with an arbitrary phase space $ X $: For the existence of a stationary process it is sufficient that there is a compact set $ K \subset X $ such that the expectation of the time of reaching $ K $ from $ x $ is finite for all $ x \in X $. There is the following sufficient condition for the existence of a stationary Markov process in terms of Lyapunov stochastic functions (cf. Lyapunov stochastic function): If there is a function $ V(x) \geq 0 $ for which $ L V(x) \leq -1 $ for $ x \notin K $, then there is a stationary Markov process associated with the Markov transition function $ P(x, t, A) $. Here $ L $ is the infinitesimal generator of the process.
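For a finite phase space the stationarity condition reduces to the linear system $ \mu = \mu P $, where $ P $ is the one-step transition matrix. A minimal numerical sketch (the $ 3 \times 3 $ matrix below is hypothetical, chosen only for illustration) recovers $ \mu $ as the normalized left eigenvector of $ P $ for the eigenvalue $ 1 $:

```python
import numpy as np

# Hypothetical transition matrix of a 3-state chain (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# A stationary initial distribution solves mu = mu P, i.e. mu is a left
# eigenvector of P for the eigenvalue 1, normalized to total mass 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
mu = np.real(eigvecs[:, k])
mu = mu / mu.sum()

print(mu)       # the stationary distribution
print(mu @ P)   # coincides with mu: started from mu, the chain is stationary
```

Since this chain is irreducible, all expected hitting times between states are finite, so Kolmogorov's criterion guarantees that the stationary distribution exists and is strictly positive.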
When the stationary initial distribution $ \mu $ is unique, the corresponding stationary process is ergodic. In this case the Cesàro mean of the transition probabilities converges weakly to $ \mu $. Under certain additional conditions,

$$ \lim_{t \rightarrow \infty} P(x, t, A) = \mu (A) \quad (\textrm{weakly}) . $$
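For a finite chain that is irreducible and aperiodic, the stationary distribution is unique and this convergence can be observed directly: every row of the iterated transition matrix $ P^t $ approaches $ \mu $. A short sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical irreducible, aperiodic 3-state chain: the stationary
# distribution is unique, and P(x, t, .) converges to it as t grows.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

Pt = np.linalg.matrix_power(P, 50)   # t-step transition probabilities, t = 50
print(Pt)  # all three rows are (numerically) identical: each equals mu
```

The rows agree to machine precision because the second-largest eigenvalue modulus of this matrix is about $ 0.36 $, so the distance to the limit decays like $ 0.36^{t} $.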
A stationary initial distribution satisfies the Fokker–Planck(–Kolmogorov) equation $ L^{*} \mu = 0 $, where $ L^{*} $ is the adjoint of the infinitesimal operator of the process. For diffusion processes, for example, $ L^{*} $ is the adjoint of the generating differential operator of the process. In this case $ \mu $ has a density $ p $ with respect to Lebesgue measure which satisfies $ L^{*} p = 0 $. In the one-dimensional case this equation can be solved by quadrature.
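As an illustration, for the Ornstein–Uhlenbeck diffusion $ dX = -\theta X \, dt + \sigma \, dW $ (the parameter values below are chosen arbitrarily) the stationary density is Gaussian with variance $ \sigma^2 / 2\theta $, and one can check numerically that it annihilates the adjoint operator $ L^{*} p = (\theta x p)' + \tfrac{\sigma^2}{2} p'' $:

```python
import numpy as np

theta, sigma = 1.0, 1.0              # illustrative parameters
x = np.linspace(-4.0, 4.0, 8001)
dx = x[1] - x[0]

# Stationary density of dX = -theta*X dt + sigma dW:
# normal with mean 0 and variance sigma^2 / (2*theta).
var = sigma**2 / (2.0 * theta)
p = np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# Fokker-Planck residual L*p = d/dx(theta*x*p) + (sigma^2/2) * d^2 p/dx^2,
# evaluated by central finite differences on the grid.
drift = np.gradient(theta * x * p, dx)
diffusion = 0.5 * sigma**2 * np.gradient(np.gradient(p, dx), dx)
residual = drift + diffusion

print(np.abs(residual[200:-200]).max())  # close to 0: p solves L*p = 0
```

The interior of the grid is used for the check because `np.gradient` falls back to one-sided differences at the boundary points.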
References
[1] | A.N. Kolmogorov, "Markov chains with a countable number of states" , Moscow (1937) (In Russian) |
[2] | J.L. Doob, "Stochastic processes" , Wiley (1953) |
[3] | B.A. Sevast'yanov, "An ergodic theorem for Markov processes and its application to telephone systems with refusals" Theor. Probab. Appl. , 2 (1957) pp. 104–112; Teor. Veroyatnost. i Primenen. , 2 : 1 (1957) pp. 106–116 |
Markov process, stationary. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_process,_stationary&oldid=25956