
Stochastic matrix

From Encyclopedia of Mathematics

Latest revision as of 09:11, 1 October 2023

2020 Mathematics Subject Classification: Primary: 15B51 Secondary: 60J10

A stochastic matrix is a square (possibly infinite) matrix $P=[p_{ij}]$ with non-negative elements, for which
$$
\sum_j p_{ij} = 1 \quad \text{for all $i$.}
$$
The set of all stochastic matrices of order $n$ is the convex hull of the set of $n^n$ stochastic matrices consisting of zeros and ones. Any stochastic matrix $P$ can be considered as the matrix of transition probabilities of a discrete Markov chain $\xi^P(t)$.
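
The defining conditions are easy to verify numerically. A minimal NumPy sketch (the matrix `P` is a made-up 3-state example, not from the article):

```python
import numpy as np

# A made-up 3-state transition matrix: each row is a probability
# distribution over the next state.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Stochastic: non-negative entries and every row sums to 1.
is_stochastic = bool((P >= 0).all()) and np.allclose(P.sum(axis=1), 1.0)
```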

The absolute values of the eigenvalues of stochastic matrices do not exceed 1; 1 is an eigenvalue of any stochastic matrix. If a stochastic matrix $P$ is indecomposable (the Markov chain $\xi^P(t)$ has one class of positive states), then 1 is a simple eigenvalue of $P$ (i.e. it has multiplicity 1); in general, the multiplicity of the eigenvalue 1 coincides with the number of classes of positive states of the Markov chain $\xi^P(t)$. If a stochastic matrix is indecomposable and if the class of positive states of the Markov chain has period $d$, then the set of all eigenvalues of $P$, as a set of points in the complex plane, is mapped onto itself by rotation through an angle $2\pi/d$. When $d=1$, the stochastic matrix $P$ and the Markov chain $\xi^P(t)$ are called aperiodic.
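
These spectral facts can be checked directly. A sketch with NumPy, using made-up example matrices (`Q` below is a two-state chain of period $d=2$, so its spectrum is invariant under rotation through $2\pi/d = \pi$):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
eigvals = np.linalg.eigvals(P)

# All eigenvalues lie in the closed unit disc, and 1 is an eigenvalue.
max_abs = float(np.abs(eigvals).max())
has_one = bool(np.isclose(eigvals, 1.0).any())

# A periodic chain with d = 2: its spectrum is {1, -1}, mapped onto
# itself by rotation through pi.
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])
eigvals_Q = np.sort(np.real(np.linalg.eigvals(Q)))
```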

The left eigenvectors $\pi = \{\pi_j\}$ of $P$ of finite order, corresponding to the eigenvalue 1:
\begin{equation}
\label{eq1}
\pi_j = \sum_i \pi_i p_{ij} \quad \text{for all}\ j\,,
\end{equation}
and satisfying the conditions $\pi_j \geq 0$, $\sum_j\pi_j = 1$, define the stationary distributions of the Markov chain $\xi^P(t)$; in the case of an indecomposable matrix $P$, the stationary distribution is unique.
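
A stationary distribution can be computed as a left eigenvector of $P$ for the eigenvalue 1, i.e. an eigenvector of $P^{T}$, normalised to sum 1. A sketch (`P` is a made-up indecomposable example, so the distribution is unique):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Left eigenvector for eigenvalue 1 = eigenvector of P^T for eigenvalue 1.
w, v = np.linalg.eig(P.T)
k = int(np.argmin(np.abs(w - 1.0)))
pi = np.real(v[:, k])
pi = pi / pi.sum()              # normalise to a probability vector

is_stationary = np.allclose(pi @ P, pi) and bool((pi > 0).all())
```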

If $P$ is an indecomposable aperiodic stochastic matrix of finite order, then the following limit exists:
\begin{equation}
\label{eq2}
\lim_{n\rightarrow\infty} P^n = \Pi,
\end{equation}
where $\Pi$ is the matrix all rows of which coincide with the vector $\pi$ (see also Markov chain, ergodic; for infinite stochastic matrices $P$, the system of equations \ref{eq1} may have no non-zero non-negative solutions that satisfy the condition $\sum_j \pi_j < \infty$; in this case $\Pi$ is the zero matrix). The rate of convergence in \ref{eq2} can be estimated by a geometric progression with any exponent $\rho$ that has absolute value greater than the absolute values of all the eigenvalues of $P$ other than 1.
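
The convergence $P^n \to \Pi$ can be observed numerically; already a moderate power of the made-up matrix from the earlier sketches agrees with the rank-one limit whose rows all equal $\pi$:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Stationary distribution pi: left eigenvector for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, int(np.argmin(np.abs(w - 1.0)))])
pi = pi / pi.sum()

# P^n approaches the matrix Pi all of whose rows equal pi; the error
# decays geometrically in the second-largest eigenvalue modulus.
Pn = np.linalg.matrix_power(P, 50)
Pi = np.tile(pi, (3, 1))
converged = np.allclose(Pn, Pi)
```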

If $P = [p_{ij}]$ is a stochastic matrix of order $n$, then any of its eigenvalues $\lambda$ satisfies the inequality (see [MM]):
$$
\left| \lambda - \omega \right| \leq 1-\omega, \quad \text{where $\omega = \min_{1 \leq i \leq n} p_{ii}.$}
$$
The union $M_n$ of the sets of eigenvalues of all stochastic matrices of order $n$ has been described (see [Ka]).
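
The inequality says that every eigenvalue lies in the disc of radius $1-\omega$ centred at $\omega$. A quick numerical check (same made-up matrix; here $\omega = 0.2$):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

omega = float(P.diagonal().min())      # omega = min_i p_ii
eigvals = np.linalg.eigvals(P)

# |lambda - omega| <= 1 - omega for every eigenvalue lambda.
in_disc = bool((np.abs(eigvals - omega) <= 1 - omega + 1e-12).all())
```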

A stochastic matrix $P=[p_{ij}]$ that satisfies the extra condition
$$
\sum_i p_{ij} = 1 \quad \text{for all $j$}
$$
is called a doubly-stochastic matrix. The set of doubly-stochastic matrices of order $n$ is the convex hull of the set of $n!$ permutation matrices of order $n$ (i.e. doubly-stochastic matrices consisting of zeros and ones). A finite Markov chain $\xi^P(t)$ with a doubly-stochastic matrix $P$ has the uniform stationary distribution.
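
Double stochasticity and the uniform stationary distribution are equally easy to check (the matrix below is a made-up symmetric, hence doubly-stochastic, example):

```python
import numpy as np

# Made-up doubly-stochastic matrix: rows AND columns sum to 1.
P = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.1, 0.4],
              [0.3, 0.4, 0.3]])

doubly = np.allclose(P.sum(axis=0), 1.0) and np.allclose(P.sum(axis=1), 1.0)

# The uniform vector (1/n, ..., 1/n) is stationary: u P = u.
u = np.full(3, 1.0 / 3.0)
uniform_stationary = np.allclose(u @ P, u)
```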

References

[G] F.R. Gantmacher, "The theory of matrices" , 1 , Chelsea, reprint (1977) (Translated from Russian) MR1657129 MR0107649 MR0107648 Zbl 0927.15002 Zbl 0927.15001 Zbl 0085.01001
[B] R. Bellman, "Introduction to matrix analysis" , McGraw-Hill (1960) MR0122820 Zbl 0124.01001
[MM] M. Marcus, H. Minc, "A survey of matrix theory and matrix inequalities" , Allyn & Bacon (1964) MR0162808 Zbl 0126.02404
[Ka] F.I. Karpelevich, "On the characteristic roots of matrices with non-negative entries" Izv. Akad. Nauk SSSR Ser. Mat. , 15 (1951) pp. 361–383 (In Russian)

Comments

Given a real $n\times n$ matrix $A$ with non-negative entries, the question arises whether there are invertible positive diagonal matrices $D_1$ and $D_2$ such that $D_1AD_2$ is a doubly-stochastic matrix, and to what extent the $D_1$ and $D_2$ are unique. Such theorems are known as $DAD$-theorems. They are of interest in telecommunications and statistics, [C], [F], [Kr].
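
For a strictly positive $A$ such scalings can be found by the Sinkhorn–Knopp iteration of [SK], which alternately rescales rows and columns. A minimal sketch (the matrix `A` and the fixed iteration count are made-up choices, not from the cited papers):

```python
import numpy as np

# Made-up positive matrix to be scaled to doubly-stochastic form.
A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 4.0]])

r = np.ones(3)
c = np.ones(3)
for _ in range(500):
    r = 1.0 / (A @ c)     # make rows of diag(r) @ A @ diag(c) sum to 1
    c = 1.0 / (A.T @ r)   # make columns sum to 1

S = np.diag(r) @ A @ np.diag(c)   # S = D1 A D2
doubly = np.allclose(S.sum(axis=0), 1.0) and np.allclose(S.sum(axis=1), 1.0)
```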

A matrix $A$ is fully indecomposable if there do not exist permutation matrices $P$ and $Q$ such that
$$
PAQ =
\left[
\begin{array}{cc}
A_1 & 0 \\
B & A_2
\end{array}
\right].
$$
A $1 \times 1$ matrix is fully indecomposable if it is non-zero.

Then for a non-negative square matrix $A$ there exist positive diagonal matrices $D_1$ and $D_2$ such that $D_1AD_2$ is doubly stochastic if and only if there exist permutation matrices $P$ and $Q$ such that $PAQ$ is a direct sum of fully indecomposable matrices [SK], [BPS].

References

[SK] R. Sinkhorn, P. Knopp, "Concerning nonnegative matrices and doubly stochastic matrices" Pacific J. Math. , 21 (1967) pp. 343–348 MR0210731 Zbl 0152.01403
[BPS] R.A. Brualdi, S.V. Parter, H. Schneider, "The diagonal equivalence of a nonnegative matrix to a stochastic matrix" J. Math. Anal. Appl. , 16 (1966) pp. 31–50 MR0206019 Zbl 0231.15017
[F] S. Fienberg, "An iterative procedure for estimation in contingency tables" Ann. Math. Stat. , 41 (1970) pp. 907–917 MR0266394 Zbl 0198.23401
[Kr] R.S. Krupp, "Properties of Kruithof's projection method" Bell Systems Techn. J. , 58 (1979) pp. 517–538
[C] I. Csiszár, "I-divergence geometry of probability distributions and minimization problems" Ann. Probab. , 3 (1975) pp. 146–158 MR0365798 Zbl 0318.60013
[Nu] R.D. Nussbaum, "Iterated nonlinear maps and Hilbert's projective method II" Memoirs Amer. Math. Soc. , 401 (1989)
[Ne] M.F. Neuts, "Structured stochastic matrices of M/G/1 type and their applications" , M. Dekker (1989) MR1010040 Zbl 0695.60088
[S] E. Seneta, "Non-negative matrices and Markov chains" , Springer (1981) MR2209438 Zbl 0471.60001
How to Cite This Entry:
Stochastic matrix. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Stochastic_matrix&oldid=15634
This article was adapted from an original article by A.M. Zubkov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article