# Markov chain, generalized

2010 Mathematics Subject Classification: Primary: 60J10

A sequence of random variables $\xi_n$, $n = 0, 1, \dots$, with the following properties:

1) the set of values of each $\xi_n$ is finite or countable;

2) for any $n$ and any $i_0, \dots, i_n$,

$$\tag{*} {\mathsf P} \{ \xi_n = i_n \mid \xi_0 = i_0, \dots, \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \} = {\mathsf P} \{ \xi_n = i_n \mid \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \}.$$

A generalized Markov chain satisfying (*) is called $s$-generalized. For $s = 1$, (*) is the usual Markov property. The study of $s$-generalized Markov chains can be reduced to the study of ordinary Markov chains. Consider the sequence of random variables $\eta_n$ whose values are in one-to-one correspondence with the values of the vector

$$( \xi_{n-s+1}, \xi_{n-s+2}, \dots, \xi_n ).$$

The sequence $\eta_n$ forms an ordinary Markov chain.
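The reduction above can be sketched in code: given the order-$s$ transition probabilities of the chain $\xi_n$, one builds the one-step transition matrix of the ordinary chain $\eta_n$ whose states are $s$-tuples. The function name and the representation of the transition law as a dictionary are illustrative conventions, not from the article.

```python
from itertools import product


def lift_to_first_order(states, s, trans):
    """Lift an s-generalized Markov chain to an ordinary Markov chain.

    `states` is the (finite) state space of xi_n; `trans` maps each
    s-tuple (i_{n-s}, ..., i_{n-1}) to a dict {i_n: probability}.
    Returns the transition law of eta_n, whose states are s-tuples
    (xi_{n-s+1}, ..., xi_n): from the tuple t = (x_1, ..., x_s) the
    lifted chain can only move to tuples of the form (x_2, ..., x_s, j).
    """
    lifted = {}
    for t in product(states, repeat=s):
        row = {}
        for j, p in trans[t].items():
            # Shift the window: drop the oldest state, append the new one.
            row[t[1:] + (j,)] = p
        lifted[t] = row
    return lifted
```

For example, with $s = 2$ and states $\{0, 1\}$, the lifted chain has the four states $(0,0), (0,1), (1,0), (1,1)$, and each row of its transition matrix has at most two nonzero entries, since the first $s-1$ coordinates of the next tuple are forced.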
