Information, exactness of reproducibility of
A measure of the quality of information transmission from an information source (cf. Information, source of) to a receiver (addressee) over a communication channel. The criteria relevant to the exactness of reproducibility of information in the theory of information transmission are usually treated statistically, by isolating the class $ W $
of admissible joint distributions for pairs $ ( \xi , \widetilde \xi ) $
in the set of all probability measures on the product $ ( \mathfrak X \times \widetilde{\mathfrak X} , S _ {\mathfrak X } \times S _ {\widetilde{\mathfrak X} } ) $,
where $ ( \mathfrak X , S _ {\mathfrak X } ) $
is the measurable space of values of a communication $ \xi $
generated by the source, and $ ( \widetilde{\mathfrak X} , S _ {\widetilde{\mathfrak X} } ) $
is the measurable space of values of the communication $ \widetilde \xi $
received. Exactness of reproducibility of information is often defined in terms of a distortion measure $ \rho ( x, \widetilde{x} ) $,
$ x \in \mathfrak X $,
$ \widetilde{x} \in \widetilde{\mathfrak X} $,
which is a non-negative measurable function of $ x $
and $ \widetilde{x} $.
The set $ W $
of admissible joint distributions is then specified by the condition
$$ \tag{1 } {\mathsf E} \rho ( \xi , \widetilde \xi ) \leq \epsilon , $$
for a given $ \epsilon > 0 $.
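For finite alphabets, condition (1) can be checked directly: represent the joint distribution of $ ( \xi , \widetilde \xi ) $ and the distortion measure as matrices and compare the expected distortion with $ \epsilon $. The following is a minimal numerical sketch of this check; the alphabet size, the matrices and the threshold are illustrative assumptions, not data from the article.

```python
import numpy as np

# Illustrative distortion matrix rho[x, x~] on a 3-letter alphabet and a
# joint distribution W of the pair (xi, xi~); all values are assumptions.
rho = np.array([[0.0, 1.0, 2.0],
                [1.0, 0.0, 1.0],
                [2.0, 1.0, 0.0]])
W = np.array([[0.30, 0.03, 0.02],
              [0.04, 0.25, 0.06],
              [0.02, 0.05, 0.23]])   # entries sum to 1

def expected_distortion(W, rho):
    # E rho(xi, xi~) = sum over (x, x~) of W(x, x~) * rho(x, x~)
    return float(np.sum(W * rho))

eps = 0.3
d = expected_distortion(W, rho)
print(d, d <= eps)   # W is admissible in the sense of (1) iff E rho <= eps
```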
In particular, when $ ( \mathfrak X , S _ {\mathfrak X } ) = ( X ^ {n} , S _ {X ^ {n} } ) $ and $ ( \widetilde{\mathfrak X} , S _ {\widetilde{\mathfrak X} } ) = ( \widetilde{X} {} ^ {n} , S _ {\widetilde{X} {} ^ {n} } ) $, one often uses a componentwise condition for the exactness of reproducibility of information, namely
$$ \rho ( x ^ {n} , \widetilde{x} {} ^ {n} ) = \ { \frac{1}{n} } \sum _ {k = 1 } ^ { n } \rho _ {0} ( x _ {k} , \widetilde{x} _ {k} ), $$
where $ x ^ {n} = ( x _ {1} \dots x _ {n} ) \in X ^ {n} $, $ \widetilde{x} {} ^ {n} = ( \widetilde{x} _ {1} \dots \widetilde{x} _ {n} ) \in \widetilde{X} {} ^ {n} $, $ x _ {k} \in X $, $ \widetilde{x} _ {k} \in \widetilde{X} $, $ k = 1 \dots n $, and where $ \rho _ {0} ( x, \widetilde{x} ) $, $ x \in X $, $ \widetilde{x} \in \widetilde{X} $, is again a non-negative measurable function. In this case, instead of condition (1) one sometimes uses the following condition:
$$ \tag{2 } {\mathsf E} \rho _ {0} ( \xi _ {k} , \widetilde \xi _ {k} ) \leq \epsilon \ \ \textrm{ for } \textrm{ all } \ k = 1 \dots n. $$
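For concrete sequences the componentwise criterion is simply an average of per-letter distortions. The sketch below evaluates $ \rho ( x ^ {n} , \widetilde{x} {} ^ {n} ) $ for an illustrative squared-error choice of $ \rho _ {0} $; the particular $ \rho _ {0} $ and the sample sequences are assumptions made for the example only.

```python
def per_letter_distortion(x, x_tilde, rho0):
    # rho(x^n, x~^n) = (1/n) * sum_{k=1}^{n} rho0(x_k, x~_k)
    assert len(x) == len(x_tilde)
    return sum(rho0(a, b) for a, b in zip(x, x_tilde)) / len(x)

# Illustrative choice of rho0 (squared error); the article fixes no
# particular rho0 at this point.
rho0 = lambda a, b: (a - b) ** 2

x = (0.0, 1.0, 2.0, 3.0)
x_tilde = (0.1, 0.9, 2.2, 2.8)
print(per_letter_distortion(x, x_tilde, rho0))   # approx. 0.025
```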
In the case when $ X = \widetilde{X} $ and
$$ \rho _ {0} ( x, \widetilde{x} ) = \ \left \{ \begin{array}{ll} 0 & \textrm{ if } x = \widetilde{x} , \\ 1 & \textrm{ if } x \neq \widetilde{x} , \\ \end{array} \right .$$
the conditions (1) and (2) become restrictions on, respectively, the mean and the maximal probability of erroneous decoding (cf. Erroneous decoding, probability of) of the separate components of the communication. In the case of sources with continuous spaces (such as a Gaussian source), it is often assumed that $ \rho _ {0} ( x, \widetilde{x} ) = ( x - \widetilde{x} ) ^ {2} $.
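For the 0/1 distortion above one has $ {\mathsf E} \rho _ {0} ( \xi _ {k} , \widetilde \xi _ {k} ) = {\mathsf P} \{ \xi _ {k} \neq \widetilde \xi _ {k} \} $, so (2) bounds the error probability of every component. The following Monte Carlo sketch estimates these componentwise error probabilities under an assumed toy model (each binary letter reproduced incorrectly with independent probability $ p $; the model and all parameters are illustrative, not from the article) and checks condition (2).

```python
import random

random.seed(0)

# rho0(x, x~) = 0 if x == x~, 1 otherwise; its expectation is exactly
# the probability of erroneous reproduction P{xi_k != xi~_k}.
def hamming(a, b):
    return 0 if a == b else 1

# Assumed toy model: each binary letter is flipped independently with
# probability p (an illustrative assumption).
p, n, trials = 0.05, 8, 100_000

err_count = [0] * n
for _ in range(trials):
    x = [random.randint(0, 1) for _ in range(n)]
    x_tilde = [b ^ (random.random() < p) for b in x]
    for k in range(n):
        err_count[k] += hamming(x[k], x_tilde[k])

err_rate = [c / trials for c in err_count]
eps = 0.06
# Condition (2): every component's error probability is at most eps.
print(err_rate)
print(all(e <= eps for e in err_rate))
```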
References
[1] R.G. Gallager, "Information theory and reliable communication", Wiley (1968)
[2] T. Berger, "Rate distortion theory", Prentice-Hall (1971)
Comments
References
[a1] I. Csiszár, J. Körner, "Information theory. Coding theorems for discrete memoryless systems", Akad. Kiadó (1981)