Information, rate of generation of

From Encyclopedia of Mathematics
Latest revision as of 22:12, 5 June 2020


A quantity characterizing the amount of information (cf. Information, amount of) emanating from an information source per unit time. The rate of generation of information of an information source $ u $ with discrete time, generating the message $ \xi = ( \dots, \xi_{-1}, \xi_{0}, \xi_{1}, \dots ) $ formed by a sequence of random variables $ \{ \xi_{k} : k = \dots, -1, 0, 1, \dots \} $ taking values in some discrete set $ X $, is defined by the equation

$$ \tag{*} \overline{H}(u) = \lim_{n - k \rightarrow \infty} \frac{1}{n - k} H( \xi_{k}^{n} ), $$

if this limit exists. Here $ H( \xi_{k}^{n} ) $ is the entropy of the random variable $ \xi_{k}^{n} = ( \xi_{k}, \dots, \xi_{n} ) $. The quantity $ \overline{H}(u) $ defined by (*) is also called the entropy per symbol of the information source.
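For a small source the block entropies $ H( \xi_{1}^{n} ) $ can be computed by direct enumeration, making the limit in (*) concrete. The sketch below (not from the article; the two-state transition matrix is an arbitrary illustrative choice) shows $ (1/n) H( \xi_{1}^{n} ) $ decreasing toward the entropy rate as $ n $ grows:

```python
import itertools
import math

# Illustrative two-state Markov source: transition matrix P, started
# from its stationary distribution pi (closed-form for two states).
P = [[0.9, 0.1],
     [0.4, 0.6]]
pi = [P[1][0] / (P[0][1] + P[1][0]),   # pi solves pi = pi P
      P[0][1] / (P[0][1] + P[1][0])]

def block_entropy(n):
    """Entropy H(xi_1^n) in bits, by enumerating all 2^n sequences."""
    H = 0.0
    for seq in itertools.product(range(2), repeat=n):
        p = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a][b]
        H -= p * math.log2(p)
    return H

for n in (1, 5, 10):
    # (1/n) H(xi_1^n) decreases toward the entropy rate (about 0.569 bits here)
    print(n, block_entropy(n) / n)
```

For a stationary Markov chain the chain rule gives $ H( \xi_{1}^{n} ) = H(\pi) + (n-1) \overline{H}(u) $ exactly, so the printed values approach the limit at rate $ 1/n $.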

In certain cases one can prove the existence of the limit in (*) and calculate it explicitly, e.g. for stationary sources. Explicit formulas for $ \overline{H}(u) $ have been obtained for stationary Markov sources and Gaussian sources. The concept of the rate of generation of information is closely related to that of the redundancy of an information source.
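For a stationary Markov source the explicit formula is $ \overline{H}(u) = \sum_{i} \pi_{i} \sum_{j} ( - p_{ij} \log p_{ij} ) $, where $ \pi $ is the stationary distribution of the transition matrix $ ( p_{ij} ) $. A minimal sketch (the matrix is an arbitrary example; the stationary distribution is found by power iteration, assuming ergodicity):

```python
import math

def stationary(P, iters=1000):
    """Stationary distribution pi = pi P, by power iteration (assumes ergodicity)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def markov_entropy_rate(P):
    """Entropy rate in bits per symbol: sum_i pi_i * H(row i of P)."""
    pi = stationary(P)
    n = len(P)
    return sum(-pi[i] * P[i][j] * math.log2(P[i][j])
               for i in range(n) for j in range(n) if P[i][j] > 0)

P = [[0.9, 0.1], [0.4, 0.6]]
print(markov_entropy_rate(P))  # about 0.569 bits per symbol for this example
```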

If $ u $ is a stationary ergodic information source with a finite number of states, then the following property of asymptotic uniform distribution, or asymptotic equipartition (the McMillan theorem, [1]), holds. Let $ P( x^{L} ) = {\mathsf P} \{ \xi^{L} = x^{L} \} $, where $ x^{L} = ( x_{1}, \dots, x_{L} ) $ are the values of $ \xi^{L} = ( \xi_{1}, \dots, \xi_{L} ) $ on an interval of length $ L $. For any $ \epsilon , \delta > 0 $ there is an $ L_{0}( \epsilon , \delta ) $ such that for all $ L \geq L_{0}( \epsilon , \delta ) $,

$$ {\mathsf P} \left\{ \left| \frac{- \log P( x^{L} ) }{L} - \overline{H}(u) \right| > \delta \right\} < \epsilon . $$
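This concentration can be checked numerically. The sketch below uses an i.i.d. Bernoulli($p$) source (with $ p = 0.3 $ an arbitrary illustrative choice), for which the entropy rate is just the single-letter entropy $ h(p) $: the normalized quantity $ - \log_2 P( x^{L} ) / L $ of a sampled block clusters ever more tightly around $ \overline{H}(u) $ as $ L $ grows.

```python
import math
import random

random.seed(0)
p = 0.3  # illustrative Bernoulli parameter
# For an i.i.d. source the entropy rate is the binary entropy h(p).
Hbar = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # about 0.8813 bits

def normalized_neglog_prob(L):
    """Sample one L-block and return -log2 P(x^L) / L."""
    x = [1 if random.random() < p else 0 for _ in range(L)]
    logP = sum(math.log2(p) if b else math.log2(1 - p) for b in x)
    return -logP / L

for L in (10, 100, 10000):
    # values approach Hbar as L grows (McMillan's theorem)
    print(L, normalized_neglog_prob(L))
```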

References

[1] J. Wolfowitz, "Coding theorems of information theory", Springer (1961)
[2] R.G. Gallager, "Information theory and reliable communication", Wiley (1968)
[3] A. Feinstein, "Foundations of information theory", McGraw-Hill (1958)

How to Cite This Entry:
Information, rate of generation of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Information,_rate_of_generation_of&oldid=47351
This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.