# Markov chain, recurrent

2010 Mathematics Subject Classification: Primary: 60J10 [MSN][ZBL]

A Markov chain in which a random trajectory $\xi(t)$, starting at any state $\xi(0)=i$, returns to that state with probability 1. In terms of the transition probabilities $p_{ij}(t)$, recurrence of a discrete-time Markov chain is equivalent to the divergence, for every state $i$, of the series

$$\sum_{t=0}^\infty p_{ii}(t).$$
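The divergence criterion can be illustrated numerically. The sketch below (the two-state matrix is a hypothetical example, not from the source) computes the partial sums $\sum_{t=0}^{T} p_{ii}(t)$ via matrix powers; for a finite irreducible chain every state is recurrent, so the partial sums grow without bound (roughly linearly in $T$, at the rate of the stationary probability of $i$).

```python
import numpy as np

# Hypothetical two-state transition matrix; any finite irreducible
# chain is recurrent, so sum_t p_ii(t) diverges.
P = np.array([[0.5, 0.5],
              [0.3, 0.7]])

def partial_sum_pii(P, i, T):
    """Partial sum sum_{t=0}^{T} p_ii(t), with p_ii(0) = 1."""
    total = 0.0
    Pt = np.eye(P.shape[0])   # P^0 = identity
    for _ in range(T + 1):
        total += Pt[i, i]     # diagonal entry of P^t is p_ii(t)
        Pt = Pt @ P
    return total

for T in (10, 100, 1000):
    print(T, partial_sum_pii(P, 0, T))
```

For a transient state the same partial sums would converge to a finite limit; here they keep growing as $T$ increases.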

In a recurrent Markov chain a trajectory $\xi(t)$, $0\leq t<\infty$, $\xi(0)=i$, returns infinitely often to the state $i$ with probability 1. In a recurrent Markov chain there are no inessential states, and the essential states decompose into recurrent classes.

An example of a recurrent Markov chain is the symmetric random walk on the integer lattice on the line or in the plane. In the symmetric walk on the line a particle moves from position $x$ to $x\pm1$ with probabilities $1/2$; in the symmetric walk on the plane a particle moves from $(x,y)$ to one of the four points $(x\pm1,y)$, $(x,y\pm1)$ with probabilities $1/4$. In both examples a particle, starting the walk at an arbitrary point, returns to that point with probability 1.

The symmetric walk on the integer lattice in three-dimensional space, in which the probability of transition from $(x,y,z)$ to each of the six neighbouring points $(x\pm1,y,z)$, $(x,y\pm1,z)$, $(x,y,z\pm1)$ is equal to $1/6$, is not recurrent: the probability that the particle returns to its initial point is approximately $0.34$ (Pólya's constant).
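The contrast between the recurrent one-dimensional walk and the transient three-dimensional walk can be checked by simulation. The sketch below (function names and parameters are illustrative) estimates, by Monte Carlo, the probability that a symmetric walk on $\mathbb{Z}^d$ returns to the origin within a fixed number of steps; truncating at finitely many steps slightly underestimates the true return probability.

```python
import random

def returns_to_origin(dim, max_steps, rng):
    """Simulate one symmetric walk on Z^dim; True if it revisits the origin."""
    pos = [0] * dim
    for _ in range(max_steps):
        axis = rng.randrange(dim)          # pick a coordinate uniformly
        pos[axis] += rng.choice((-1, 1))   # step +1 or -1 along it
        if all(c == 0 for c in pos):
            return True
    return False

def estimate_return_prob(dim, max_steps=2000, trials=2000, seed=0):
    """Monte Carlo estimate of the probability of return within max_steps."""
    rng = random.Random(seed)
    hits = sum(returns_to_origin(dim, max_steps, rng) for _ in range(trials))
    return hits / trials

print("1D:", estimate_return_prob(1))  # close to 1 (recurrent)
print("3D:", estimate_return_prob(3))  # near Polya's constant, about 0.34
```

With these parameters the one-dimensional estimate is close to 1, while the three-dimensional estimate stays near $0.34$, consistent with transience.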

#### References

[F] W. Feller, "An introduction to probability theory and its applications", **1**, Wiley (1966)