Differential entropy



The formal analogue of the concept of entropy for random variables having distribution densities. The differential entropy $h(\xi)$ of a random variable $\xi$ defined on some probability space $(\Omega, \mathfrak{A}, P)$, assuming values in an $n$-dimensional Euclidean space $\mathbf{R}^n$ and having distribution density $p(x)$, $x \in \mathbf{R}^n$, is given by the formula

$$ h(\xi) = -\int\limits_{\mathbf{R}^n} p(x) \log p(x) \, dx, $$

where $0 \log 0$ is assumed to be equal to zero. Thus, the differential entropy coincides with the entropy of the measure $P(\cdot)$ with respect to the Lebesgue measure $\lambda(\cdot)$, where $P(\cdot)$ is the distribution of $\xi$.
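For example, if $\xi$ is uniformly distributed on an interval $[0, a]$, $a > 0$, so that $p(x) = 1/a$ for $x \in [0, a]$, a direct computation gives

$$ h(\xi) = -\int_0^a \frac{1}{a} \log \frac{1}{a} \, dx = \log a, $$

which is negative whenever $a < 1$.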

The concept of the differential entropy proves useful in computing various information-theoretic characteristics, in the first place the mutual amount of information (cf. Information, amount of) $J(\xi, \eta)$ of two random vectors $\xi$ and $\eta$. If $h(\xi)$, $h(\eta)$ and $h(\xi, \eta)$ (i.e. the differential entropy of the pair $(\xi, \eta)$) are finite, the following formula is valid:

$$ J(\xi, \eta) = -h(\xi, \eta) + h(\xi) + h(\eta). $$
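As an illustration, for a Gaussian pair $(\xi, \eta)$ with unit variances and correlation coefficient $\rho$, standard computations give $h(\xi) = h(\eta) = \frac{1}{2} \log (2 \pi e)$ and $h(\xi, \eta) = \frac{1}{2} \log \left( (2 \pi e)^2 (1 - \rho^2) \right)$, so that

$$ J(\xi, \eta) = -\frac{1}{2} \log (1 - \rho^2), $$

which tends to $+\infty$ as $|\rho| \rightarrow 1$ and vanishes for independent $\xi$ and $\eta$.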

The following two properties of the differential entropy are worthy of mention: 1) as distinct from the ordinary entropy, the differential entropy is not covariant with respect to a change in the coordinate system and may assume negative values; and 2) let $\phi(\xi)$ be the discretization of an $n$-dimensional random variable $\xi$ having a density, with steps of $\Delta x$; then for the entropy $H(\phi(\xi))$ the formula

$$ H(\phi(\xi)) = -n \log \Delta x + h(\xi) + o(1) $$

is valid as $\Delta x \rightarrow 0$. Thus, $H(\phi(\xi)) \rightarrow +\infty$ as $\Delta x \rightarrow 0$. The principal term of the asymptotics of $H(\phi(\xi))$ depends on the dimension of the space of values of $\xi$. The differential entropy determines the next term of the asymptotic expansion; this term is independent of $\Delta x$ and is the first term that depends on the actual nature of the distribution of $\xi$.
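A simple one-dimensional illustration: if $\xi$ is uniform on $[0, 1]$ and $\phi(\xi)$ is its discretization with step $\Delta x = 1/N$, then $\phi(\xi)$ takes $N$ equiprobable values and

$$ H(\phi(\xi)) = \log N = -\log \Delta x = -\log \Delta x + h(\xi), $$

since $h(\xi) = 0$ for the uniform distribution on $[0, 1]$; in this case the $o(1)$ term vanishes identically.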

References

[1] I.M. Gel'fand, A.N. Kolmogorov, A.M. Yaglom, "The amount of information in, and entropy of, continuous distributions", Proc. 3-rd All-Union Math. Congress, 3, Moscow (1958) pp. 300–320 (In Russian)
[2] A. Rényi, "Wahrscheinlichkeitsrechnung", Deutsch. Verlag Wissenschaft. (1962)
How to Cite This Entry:
Differential entropy. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Differential_entropy&oldid=46665
This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.