
Information, rate of generation of



A quantity characterizing the amount of information (cf. Information, amount of) emanating from an information source per unit time. The rate of generation of information of an information source $ u $ with discrete time, generating the message $ \xi = ( \dots, \xi_{-1}, \xi_0, \xi_1, \dots ) $ formed by a sequence of random variables $ \{ \xi_k : k = \dots, -1, 0, 1, \dots \} $ taking values in some discrete set $ X $, is defined by the equation

$$ \tag{*} \overline{H}(u) = \lim_{n-k \rightarrow \infty} \frac{1}{n-k}\, H( \xi_k^n ), $$

if this limit exists. Here $ H( \xi_k^n ) $ is the entropy of the random variable $ \xi_k^n = ( \xi_k, \dots, \xi_n ) $. The quantity $ \overline{H}(u) $ defined by (*) is also called the entropy per symbol of the information source.
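In the simplest case of a memoryless source, where the $ \xi_k $ are independent and identically distributed, $ H( \xi_k^n ) = (n-k+1) H( \xi_0 ) $, so the limit in (*) exists and equals $ H( \xi_0 ) $. The following minimal numerical sketch (a Bernoulli source with illustrative parameter $ p = 0.3 $ and logarithms to base 2; not part of the original article) checks this convergence:

```python
import numpy as np

def bernoulli_entropy(p):
    """Entropy, in bits, of a single Bernoulli(p) symbol."""
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# For an i.i.d. source, H(xi_k^n) = (n - k + 1) * H(xi_0), so the
# normalized joint entropy H(xi_k^n) / (n - k) tends to H(xi_0).
p = 0.3
for span in (1, 10, 100, 1000):                  # span = n - k
    joint = (span + 1) * bernoulli_entropy(p)    # entropy of span + 1 symbols
    print(span, joint / span)                    # -> H(xi_0) ~ 0.8813 bits
```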

In certain cases one can prove that the limit in (*) exists and calculate it explicitly, e.g. for stationary sources. Explicit formulas for $ \overline{H}(u) $ have been obtained for stationary Markov sources and for Gaussian sources. The concept of the rate of generation of information is closely related to that of the redundancy of an information source.
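For a stationary ergodic Markov source the explicit formula is $ \overline{H}(u) = - \sum_i \pi_i \sum_j p_{ij} \log p_{ij} $, where $ (p_{ij}) $ is the transition matrix and $ \pi $ its stationary distribution. A minimal sketch of this computation (the two-state transition matrix is an illustrative choice, not from the article):

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate, in bits per symbol, of a stationary ergodic Markov
    chain with transition matrix P: -sum_i pi_i sum_j P[i,j] log2 P[i,j]."""
    # Stationary distribution pi: the left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()
    logs = np.zeros_like(P)
    logs[P > 0] = np.log2(P[P > 0])              # convention: 0 * log 0 = 0
    return float(-(pi[:, None] * P * logs).sum())

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])                       # illustrative two-state source
print(markov_entropy_rate(P))                    # ~ 0.569 bits per symbol
```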

If $ u $ is a stationary ergodic information source with a finite number of states, then the following asymptotic equipartition property (the McMillan theorem, [1]) holds. Let $ P( x^L ) = {\mathsf P} \{ \xi^L = x^L \} $, where $ x^L = ( x_1, \dots, x_L ) $ is a value of $ \xi^L = ( \xi_1, \dots, \xi_L ) $, the output of the source over an interval of length $ L $. For any $ \epsilon, \delta > 0 $ there is an $ L_0( \epsilon, \delta ) $ such that for all $ L \geq L_0( \epsilon, \delta ) $,

$$ {\mathsf P} \left\{ \left| \frac{- \log P( \xi^L )}{L} - \overline{H}(u) \right| > \delta \right\} < \epsilon . $$
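Equivalently, for large $ L $ almost all sequences the source actually emits have probability close to $ 2^{-L \overline{H}(u)} $ when logarithms are taken to base 2. A minimal simulation sketch for a memoryless Bernoulli source (an illustrative special case, in which the theorem reduces to the law of large numbers; the parameters are not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                           # illustrative Bernoulli source
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # entropy rate, i.i.d. case
delta = 0.05

for L in (10, 100, 10_000):
    x = rng.random((2000, L)) < p                 # 2000 sample strings of length L
    ones = x.sum(axis=1)
    # -log2 P(x^L) = -(#ones * log2 p + #zeros * log2(1 - p))
    neg_log_P = -(ones * np.log2(p) + (L - ones) * np.log2(1 - p))
    frac_bad = np.mean(np.abs(neg_log_P / L - H) > delta)
    print(L, frac_bad)                            # fraction outside delta -> 0
```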

References

[1] J. Wolfowitz, "Coding theorems of information theory", Springer (1961)
[2] R.G. Gallager, "Information theory and reliable communication", Wiley (1968)
[3] A.A. Feinstein, "Foundations of information theory", McGraw-Hill (1958)

How to Cite This Entry:
Information, rate of generation of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Information,_rate_of_generation_of&oldid=47351
This article was adapted from an original article by R.L. Dobrushin, V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article