Latest revision as of 16:46, 20 January 2024
A set $ K $
of states of a homogeneous Markov chain $ \xi ( t) $
with state space $ S $
such that the transition probabilities
$$ p _ {ij} ( t) = {\mathsf P} \{ \xi ( t) = j \mid \xi ( 0) = i \} $$
of $ \xi ( t) $ satisfy
$$ \sup _ { t } p _ {ij} ( t) > 0 \ \ \textrm{ for any } i , j \in K , $$
$ p _ {il} ( t) = 0 $ for any $ i \in K $, $ l \in S \setminus K $, $ t > 0 $, and
$$ {\mathsf E} \tau _ {ii} < \infty \ \textrm{ for any } i \in K , $$
where $ \tau _ {ii} $ is the return time to the state $ i $:
$$ \tau _ {ii} = \min \ \{ {t > 0 } : {\xi ( t) = i \mid \xi ( 0) = i } \} $$
for a discrete-time Markov chain, and
$$ \tau _ {ii} = \inf \ \{ {t > 0 } : {\xi ( t) = i \mid \xi ( 0) = i , \xi ( 0 + ) \neq i } \} $$
for a continuous-time Markov chain. When $ {\mathsf E} \tau _ {ii} = \infty $, $ K $ is called a zero class of states (class of zero states).
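The defining condition $ {\mathsf E} \tau _ {ii} < \infty $ can be checked numerically on a small example. The sketch below (a hypothetical two-state discrete-time chain, not taken from the article) estimates the mean return times by Monte Carlo; for this chain the stationary distribution is $ \pi = ( 5 / 6 , 1 / 6 ) $ and $ {\mathsf E} \tau _ {ii} = 1 / \pi _ {i} $, so both expectations are finite and the whole state space is a single positive class.

```python
import random

# Hypothetical two-state discrete-time Markov chain on S = {0, 1};
# both states form one positive class.  Transition matrix:
P = [[0.9, 0.1],
     [0.5, 0.5]]

def sample_return_time(i, rng):
    """One draw of tau_ii = min{ t > 0 : xi(t) = i | xi(0) = i }."""
    state, t = i, 0
    while True:
        state = 0 if rng.random() < P[state][0] else 1
        t += 1
        if state == i:
            return t

rng = random.Random(0)
n = 200_000
mean_tau = {i: sum(sample_return_time(i, rng) for _ in range(n)) / n
            for i in (0, 1)}

# Stationary distribution is pi = (5/6, 1/6) and E tau_ii = 1/pi_i,
# so the estimates should be near 1.2 and 6.0: both finite.
print(mean_tau)
```

The identity $ {\mathsf E} \tau _ {ii} = 1 / \pi _ {i} $ used for the check holds for any positive recurrent chain with stationary distribution $ \pi $.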
States in the same positive class $ K $ have a number of common properties. For example, in the case of discrete time, for any $ i , j \in K $ the limit relation
$$ \lim\limits _ {n \rightarrow \infty } \ \frac{1}{n} \sum_{t=1}^ { n } p _ {ij} ( t) = \ p _ {j} ^ {*} > 0 $$
holds; if
$$ d _ {i} = \max \ \{ {d } : { {\mathsf P} \{ \tau _ {ii} \ \textrm{ is divisible by } d \} = 1 } \} $$
is the period of state $ i $, then $ d _ {i} = d _ {j} $ for any $ i , j \in K $ and $ d $ is called the period of the class $ K $; for any $ i \in K $ the limit relation
$$ \lim\limits _ {t \rightarrow \infty } p _ {ii} ( t d ) = \ d p _ {i} ^ {*} > 0 $$
holds. A discrete-time Markov chain such that all its states form a single positive class of period 1 serves as an example of an ergodic Markov chain (cf. Markov chain, ergodic).
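The Cesàro limit and the role of the period can be seen on a deliberately periodic toy chain (a hypothetical example, not from the article): with $ P = \left( \begin{smallmatrix} 0 & 1 \\ 1 & 0 \end{smallmatrix} \right) $ the states alternate deterministically, the class has period $ d = 2 $, the Cesàro averages of $ p _ {ij} ( t) $ converge to $ p _ {j} ^ {*} = 1 / 2 $ even though $ p _ {ij} ( t) $ itself oscillates, and $ p _ {ii} ( t d ) = 1 = d p _ {i} ^ {*} $ for every $ t $.

```python
# Period-2 chain: the two states alternate deterministically.
P = [[0.0, 1.0],
     [1.0, 0.0]]

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

# Accumulate the Cesaro averages (1/n) * sum_{t=1}^{n} p_ij(t).
n = 1000
Pt = [[1.0, 0.0], [0.0, 1.0]]        # P^0 = identity
cesaro = [[0.0, 0.0], [0.0, 0.0]]
for t in range(1, n + 1):
    Pt = matmul(Pt, P)               # now Pt = P^t
    for i in range(2):
        for j in range(2):
            cesaro[i][j] += Pt[i][j] / n

# Each Cesaro average is 1/2 = p_j^*, although p_ij(t) is 0 or 1;
# and p_ii(2t) = 1 = d * p_i^* with period d = 2.
print(cesaro)
print(Pt[0][0])   # n = 1000 is even, so P^n is the identity
```

Note that the plain limit $ \lim _ {t} p _ {ij} ( t) $ does not exist here; only along multiples of the period, or in the Cesàro sense, do the transition probabilities settle down.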
References
[1] K.L. Chung, "Markov chains with stationary transition probabilities", Springer (1967)
[2] J.L. Doob, "Stochastic processes", Wiley (1953)
Comments
Cf. also Markov chain, class of zero states of a for additional references.
Markov chain, class of positive states of a. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_chain,_class_of_positive_states_of_a&oldid=14075