Information, amount of



An information-theoretical measure of the quantity of information contained in one random variable relative to another random variable. Let $ \xi $ and $ \eta $ be random variables defined on a probability space $ ( \Omega , \mathfrak A , {\mathsf P} ) $ and taking values in measurable spaces (cf. Measurable space) $ ( \mathfrak X , S _ {\mathfrak X } ) $ and $ ( \mathfrak Y , S _ {\mathfrak Y } ) $, respectively. Let $ p _ {\xi \eta } ( C) $, $ C \in S _ {\mathfrak X } \times S _ {\mathfrak Y } $, and $ p _ \xi ( A) $, $ A \in S _ {\mathfrak X } $, $ p _ \eta ( B) $, $ B \in S _ {\mathfrak Y } $, be their joint and marginal probability distributions. If $ p _ {\xi \eta } ( \cdot ) $ is absolutely continuous with respect to the direct product of measures $ p _ \xi \times p _ \eta ( \cdot ) $, if $ a _ {\xi \eta } ( \cdot ) $ is the (Radon–Nikodým) density of $ p _ {\xi \eta } ( \cdot ) $ with respect to $ p _ \xi \times p _ \eta ( \cdot ) $, and if $ i _ {\xi \eta } ( \cdot ) = \mathop{\rm log} a _ {\xi \eta } ( \cdot ) $ is the information density (the logarithms are usually taken to base 2 or $ e $), then, by definition, the amount of information is given by

$$ I ( \xi , \eta ) = \ \int\limits _ {\mathfrak X \times \mathfrak Y } i _ {\xi \eta } ( x , y ) p _ {\xi \eta } ( d x , d y ) = $$

$$ = \ \int\limits _ {\mathfrak X \times \mathfrak Y } a _ {\xi \eta } ( x , y ) \mathop{\rm log} \ a _ {\xi \eta } ( x , y ) p _ \xi ( d x ) p _ \eta ( d y ) . $$

If $ p _ {\xi \eta } ( \cdot ) $ is not absolutely continuous with respect to $ p _ \xi \times p _ \eta ( \cdot ) $, then $ I ( \xi , \eta ) = + \infty $, by definition.
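
For example, if $ \xi $ and $ \eta $ are independent, then $ p _ {\xi \eta } = p _ \xi \times p _ \eta $, so $ a _ {\xi \eta } \equiv 1 $, $ i _ {\xi \eta } \equiv 0 $ and $ I ( \xi , \eta ) = 0 $.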

In case the random variables $ \xi $ and $ \eta $ take only a finite number of values, the expression for $ I ( \xi , \eta ) $ takes the form

$$ I ( \xi , \eta ) = \ \sum_{i=1}^{n} \sum_{j=1}^{m} p _ {ij} \mathop{\rm log} \ \frac{p _ {ij} }{p _ {i} q _ {j} } , $$

where

$$ \{ p _ {i} \} _ {i=1}^{n} ,\ \ \{ q _ {j} \} _ {j=1}^{m} ,\ \ \{ {p _ {ij} } : {i = 1 \dots n ;\ j = 1 \dots m } \} $$

are the probability functions of $ \xi $, $ \eta $ and the pair $ ( \xi , \eta ) $, respectively. (In particular,

$$ I ( \xi , \xi ) = - \sum_{i=1}^{n} p _ {i} \mathop{\rm log} p _ {i} = H ( \xi ) $$

is the entropy of $ \xi $.) In case $ \xi $ and $ \eta $ are random vectors and the densities $ p _ \xi ( x) $, $ p _ \eta ( y) $ and $ p _ {\xi \eta } ( x , y ) $ of $ \xi $, $ \eta $ and the pair $ ( \xi , \eta ) $, respectively, exist, one has

$$ I ( \xi , \eta ) = \ \int\limits p _ {\xi \eta } ( x , y ) \mathop{\rm log} \frac{p _ {\xi \eta } ( x , y ) }{p _ \xi ( x) p _ \eta ( y) } \ d x d y . $$
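
As a numerical illustration of the finite-valued formulas above (a minimal sketch, not part of the article; the joint probability table and all names are hypothetical), the following computes $ I ( \xi , \eta ) $ and $ H ( \xi ) $ using natural logarithms:

import numpy as np

# Hypothetical joint probability table p_ij for a pair (xi, eta) taking
# n = 2 and m = 3 values; the entries are illustrative and sum to 1.
p = np.array([[0.10, 0.20, 0.10],
              [0.30, 0.10, 0.20]])

p_i = p.sum(axis=1)   # marginal distribution {p_i} of xi
q_j = p.sum(axis=0)   # marginal distribution {q_j} of eta

# I(xi, eta) = sum_ij p_ij log(p_ij / (p_i q_j)); terms with p_ij = 0
# are omitted, since they contribute nothing to the sum.
mask = p > 0
I = np.sum(p[mask] * np.log(p[mask] / np.outer(p_i, q_j)[mask]))

# H(xi) = I(xi, xi) = -sum_i p_i log p_i  (the entropy of xi)
H = -np.sum(p_i * np.log(p_i))

print(I, H)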

In general,

$$ I ( \xi , \eta ) = \ \sup I ( \phi ( \xi ) , \psi ( \eta ) ) , $$

where the supremum is over all measurable functions $ \phi ( \cdot ) $ and $ \psi ( \cdot ) $ with a finite number of values. The concept of the amount of information is mainly used in the theory of information transmission.
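
A rough numerical sketch of this characterization (again not part of the article; the correlation, sample size and bin counts are illustrative, and the closed form $ I ( \xi , \eta ) = - \frac{1}{2} \mathop{\rm log} ( 1 - \rho ^ {2} ) $, in natural units, for a jointly Gaussian pair with correlation coefficient $ \rho $ is a standard fact): quantizing the pair into finitely many cells and applying the finite-valued formula gives values that increase toward $ I ( \xi , \eta ) $ as the partition is refined, up to sampling error.

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8           # correlation coefficient of the Gaussian pair (illustrative)
n = 200_000         # sample size (illustrative)

# Sample a jointly Gaussian pair (xi, eta) with correlation rho
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

def quantized_information(x, y, bins):
    # Plug-in estimate of I(phi(xi), psi(eta)) for the binnings phi, psi
    p, _, _ = np.histogram2d(x, y, bins=bins)
    p = p / p.sum()
    p_i = p.sum(axis=1, keepdims=True)   # marginal of the quantized xi
    q_j = p.sum(axis=0, keepdims=True)   # marginal of the quantized eta
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / (p_i * q_j)[mask]))

# Finer partitions give larger values (for the exact distributions, always
# bounded above by I(xi, eta))
for bins in (4, 16, 64):
    print(bins, quantized_information(x, y, bins))

# Exact value for the bivariate Gaussian pair: -1/2 log(1 - rho^2), in nats
print(-0.5 * np.log(1.0 - rho**2))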

For references, see Information, transmission of.

How to Cite This Entry:
Information, amount of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Information,_amount_of&oldid=12464
This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originators), which appeared in the Encyclopedia of Mathematics - ISBN 1402006098.