# Markov chain, class of positive states of a

A set $K$ of states of a homogeneous Markov chain $\xi ( t)$ with state space $S$ such that the transition probabilities

$$p _ {ij} ( t) = {\mathsf P} \{ \xi ( t) = j \mid \xi ( 0) = i \}$$

of $\xi ( t)$ satisfy

$$\sup _ { t } p _ {ij} ( t) > 0 \ \ \textrm{ for any } i , j \in K ,$$

$p _ {il} ( t) = 0$ for any $i \in K$, $l \in S \setminus K$, $t > 0$ (that is, $K$ is closed), and the mean return times satisfy

$${\mathsf E} \tau _ {ii} < \infty \ \textrm{ for any } i \in K ,$$

where $\tau _ {ii}$ is the return time to the state $i$:

$$\tau _ {ii} = \min \{ t > 0 : \xi ( t) = i \mid \xi ( 0) = i \}$$

for a discrete-time Markov chain, and

$$\tau _ {ii} = \inf \{ t > 0 : \xi ( t) = i \mid \xi ( 0) = i ,\ \xi ( 0 + ) \neq i \}$$

for a continuous-time Markov chain. If instead ${\mathsf E} \tau _ {ii} = \infty$ for the states $i \in K$, then $K$ is called a zero class of states (class of zero states).
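The defining conditions can be checked numerically on a small example. The sketch below uses a hypothetical three-state discrete-time chain (my choice, not from the article) in which $K = \{ 0, 1 \}$ should form a positive class; it also uses Kac's formula ${\mathsf E} \tau _ {ii} = 1 / \pi _ {i}$, a standard fact relating mean return times to the stationary distribution $\pi$ on $K$, which the article does not state explicitly:

```python
# Hypothetical 3-state discrete-time chain (illustrative, not from the article):
# K = {0, 1} should form a positive class; state 2 leads into K.
P = [[0.5, 0.5, 0.0],
     [0.3, 0.7, 0.0],
     [0.2, 0.3, 0.5]]
K = {0, 1}

# Condition 1: sup_t p_ij(t) > 0 for all i, j in K (here already at t = 1).
communicating = all(P[i][j] > 0 for i in K for j in K)

# Condition 2: p_il(t) = 0 for i in K, l outside K; checking t = 1 suffices,
# since closedness at one step propagates to every t by induction.
closed = all(P[i][l] == 0.0 for i in K for l in set(range(3)) - K)

# Condition 3: E[tau_ii] < infinity.  Restricted to K, the chain has a
# stationary distribution pi (found here by power iteration), and by
# Kac's formula E[tau_ii] = 1 / pi_i.
pi = [0.5, 0.5]
for _ in range(200):  # iterate pi <- pi P on the 2x2 block of P over K
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
mean_return = [1.0 / p for p in pi]

print(communicating, closed)  # True True
print(mean_return)            # ~ [2.667, 1.6]
```

Here $\pi = ( 0.375 , 0.625 )$ on $K$, so both mean return times are finite and $K$ is indeed a positive class of this chain.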

States in the same positive class $K$ have a number of common properties. For example, in the case of discrete time, for any $i , j \in K$ the limit relation

$$\lim\limits _ {n \rightarrow \infty } \ \frac{1}{n} \sum _ { t= 1} ^ { n } p _ {ij} ( t) = \ p _ {j} ^ {*} > 0$$

holds; if

$$d _ {i} = \max \{ d : {\mathsf P} \{ \tau _ {ii} \ \textrm{ is divisible by } d \} = 1 \}$$

is the period of state $i$, then $d _ {i} = d _ {j}$ for any $i , j \in K$; the common value $d$ is called the period of the class $K$. For any $i \in K$ the limit relation

$$\lim\limits _ {t \rightarrow \infty } p _ {ii} ( t d ) = \ d p _ {i} ^ {*} > 0$$

holds. A discrete-time Markov chain such that all its states form a single positive class of period 1 serves as an example of an ergodic Markov chain (cf. Markov chain, ergodic).
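Both limit relations can be illustrated with a deterministic two-cycle (my choice of chain, not from the article): its two states form a single positive class of period $d = 2$ with $p _ {j} ^ {*} = 1 / 2$, so the Cesàro averages tend to $1 / 2$ while $p _ {ii} ( t d ) = 1 = d p _ {i} ^ {*}$:

```python
# Hypothetical two-state cycle (illustrative): one positive class, period d = 2.
P = [[0.0, 1.0],
     [1.0, 0.0]]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 1000
Pt = [[1.0, 0.0], [0.0, 1.0]]   # P^0 = identity
avg = [[0.0, 0.0], [0.0, 0.0]]  # Cesaro averages (1/n) sum_{t=1..n} p_ij(t)
for _ in range(n):
    Pt = mat_mul(Pt, P)         # Pt = P^t
    for i in range(2):
        for j in range(2):
            avg[i][j] += Pt[i][j] / n

# (1/n) sum_{t=1}^{n} p_ij(t) -> p_j^* = 1/2 for all i, j.
print(avg)                 # each entry ~ 0.5
# Along multiples of the period (n is even, so Pt = P^n = I):
# p_ii(t d) = 1 = d p_i^*.
print(Pt[0][0], Pt[1][1])  # 1.0 1.0
```

Note that the plain limit $\lim _ {t \rightarrow \infty } p _ {ij} ( t)$ does not exist here (the entries of $P ^ {t}$ alternate between $0$ and $1$), which is exactly why the Cesàro average and the subsequence $t d$ are used above.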
