Differential entropy

The formal analogue of the concept of entropy for random variables having distribution densities. The differential entropy $ h ( \xi ) $ of a random variable $ \xi $ defined on some probability space $ ( \Omega , \mathfrak A , P) $, assuming values in an $ n $- dimensional Euclidean space $ \mathbf R ^ {n} $ and having distribution density $ p( x) $, $ x \in \mathbf R ^ {n} $, is given by the formula

$$ h ( \xi ) = - \int\limits _ {\mathbf R ^ {n} } p ( x) \mathop{\rm log} p ( x) dx , $$

where $ 0 \mathop{\rm log} 0 $ is assumed to be equal to zero. Thus, the differential entropy coincides with the entropy of the measure $ P ( \cdot ) $ with respect to the Lebesgue measure $ \lambda ( \cdot ) $, where $ P ( \cdot ) $ is the distribution of $ \xi $.
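For example, if $ \xi $ is a Gaussian random variable with values in $ \mathbf R ^ {1} $, mean $ a $ and variance $ \sigma ^ {2} $, then a direct computation (with natural logarithms) gives

$$ h ( \xi ) = \frac{1}{2} \mathop{\rm log} ( 2 \pi e \sigma ^ {2} ) . $$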

The concept of differential entropy proves useful in computing various information-theoretic characteristics, first of all the mutual amount of information (cf. Information, amount of) $ J ( \xi , \eta ) $ of two random vectors $ \xi $ and $ \eta $. If $ h ( \xi ) $, $ h ( \eta ) $ and $ h ( \xi , \eta ) $ (i.e. the differential entropy of the pair $ ( \xi , \eta ) $) are finite, then the following formula is valid:

$$ J ( \xi , \eta ) = - h ( \xi , \eta ) + h ( \xi )+ h ( \eta ). $$
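As a numerical illustration of this formula, the following sketch (an added example, not part of the original article; it assumes a bivariate Gaussian pair with unit variances and correlation coefficient $ \rho $, natural logarithms, and NumPy) computes $ J ( \xi , \eta ) $ from the three differential entropies and compares the result with the known closed form $ - \frac{1}{2} \mathop{\rm log} ( 1 - \rho ^ {2} ) $.

```python
import numpy as np

# Mutual information of a bivariate Gaussian pair from the three
# differential entropies, using natural logarithms throughout.
# Closed forms used: h(N(m, s^2)) = 0.5*log(2*pi*e*s^2) and, for a
# bivariate Gaussian with covariance matrix C,
# h(xi, eta) = 0.5*log((2*pi*e)**2 * det(C)).

rho = 0.6                                 # assumed correlation coefficient
C = np.array([[1.0, rho],
              [rho, 1.0]])                # covariance of the pair (xi, eta)

h_xi = 0.5 * np.log(2 * np.pi * np.e * C[0, 0])
h_eta = 0.5 * np.log(2 * np.pi * np.e * C[1, 1])
h_pair = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(C))

J = -h_pair + h_xi + h_eta                # J(xi, eta) by the formula above
print(J, -0.5 * np.log(1 - rho ** 2))     # both equal -0.5*log(1 - rho^2)
```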

The following two properties of the differential entropy are worth mentioning: 1) in contrast to the ordinary entropy, the differential entropy is not invariant with respect to a change of the coordinate system and may assume negative values (for example, a random variable uniformly distributed on an interval of length $ a $ has differential entropy $ \mathop{\rm log} a $, which is negative for $ a < 1 $); 2) let $ \phi ( \xi ) $ be the discretization of an $ n $- dimensional random variable $ \xi $ having a density, with step $ \Delta x $; then for the entropy $ H ( \phi ( \xi )) $ the formula

$$ H ( \phi ( \xi )) = - n \mathop{\rm log} \Delta x + h ( \xi ) + o ( 1) $$

is valid as $ \Delta x \rightarrow 0 $. Thus, $ H ( \phi ( \xi )) \rightarrow + \infty $ as $ \Delta x \rightarrow 0 $. The principal term of the asymptotics of $ H ( \phi ( \xi )) $ depends only on the dimension of the space of values of $ \xi $; the differential entropy gives the next term of the expansion, which is independent of $ \Delta x $ and is the first term that depends on the actual nature of the distribution of $ \xi $.
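The following sketch (again an added numerical illustration, not from the original article; it assumes $ n = 1 $, a standard Gaussian $ \xi $, natural logarithms, and SciPy's normal distribution) discretizes $ \xi $ with step $ \Delta x $ and compares the entropy of the discretization with $ - \mathop{\rm log} \Delta x + h ( \xi ) $.

```python
import numpy as np
from scipy.stats import norm

# Numerical check of H(phi(xi)) = -n*log(dx) + h(xi) + o(1) for n = 1,
# with xi a standard Gaussian and natural logarithms.

def entropy_of_discretization(dx, span=12.0):
    """Entropy of xi discretized into cells [k*dx, (k+1)*dx)."""
    edges = np.arange(-span, span + dx, dx)
    p = np.diff(norm.cdf(edges))          # probability of each cell
    p = p[p > 0]                          # drop empty cells (0 log 0 := 0)
    return -np.sum(p * np.log(p))

h = 0.5 * np.log(2 * np.pi * np.e)        # h(xi) for the standard Gaussian
for dx in (0.5, 0.1, 0.01):
    H = entropy_of_discretization(dx)
    print(dx, H, -np.log(dx) + h)         # last two columns agree as dx -> 0
```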

This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.