Information, rate of generation of
A quantity characterizing the amount of information (cf. Information, amount of) emanating from an information source per unit time. The rate of generation of information of a discrete-time information source $ u $ generating the message $ \xi = ( \dots, \xi_{-1}, \xi_{0}, \xi_{1}, \dots ) $, formed by a sequence of random variables $ \{ \xi_{k} : k = \dots, -1, 0, 1, \dots \} $ taking values in some discrete set $ X $, is defined by the equation
$$ \tag{* }
\overline{H}(u) = \lim_{n - k \rightarrow \infty} \frac{1}{n - k} H ( \xi_{k}^{n} ),
$$
if this limit exists. Here $ H ( \xi_{k}^{n} ) $ is the entropy of the random variable $ \xi_{k}^{n} = ( \xi_{k}, \dots, \xi_{n} ) $. The quantity $ \overline{H}(u) $, defined by (*), is also called the entropy (per symbol) of the information source.
In certain cases the existence of the limit in (*) can be proved and the limit computed explicitly, e.g. for stationary sources. Explicit formulas for $ \overline{H}(u) $ have been obtained for stationary Markov sources and for Gaussian sources. The concept of the rate of generation of information is closely related to that of the redundancy of an information source.
If $ u $ is a stationary ergodic information source with a finite number of states, then the following asymptotic equipartition property (the McMillan theorem, [1]) holds. Let $ P ( x^{L} ) = {\mathsf P} \{ \xi^{L} = x^{L} \} $, where $ x^{L} = ( x_{1}, \dots, x_{L} ) $ is a value of the segment $ \xi^{L} = ( \xi_{1}, \dots, \xi_{L} ) $ of the message over an interval of length $ L $. For any $ \epsilon, \delta > 0 $ there is an $ L_{0} ( \epsilon, \delta ) $ such that for all $ L \geq L_{0} ( \epsilon, \delta ) $,
$$
{\mathsf P} \left \{ \left | \frac{- \log P ( \xi^{L} ) }{L} - \overline{H}(u) \right | > \delta \right \} < \epsilon .
$$
References
[1] | J. Wolfowitz, "Coding theorems of information theory" , Springer (1961) |
[2] | R.G. Gallager, "Information theory and reliable communication" , Wiley (1968) |
[3] | A.A. Feinstein, "Foundations of information theory" , McGraw-Hill (1958) |
Information, rate of generation of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Information,_rate_of_generation_of&oldid=47351