Kullback–Leibler quantity of information, Kullback–Leibler information quantity, directed divergence
For discrete distributions (cf. Discrete distribution) given by probability vectors $p = (p_{1}, \dots, p_{n})$, $q = (q_{1}, \dots, q_{n})$, the Kullback–Leibler (quantity of) information of $p$ with respect to $q$ is:
$$ I(p;q) = \sum_{i=1}^{n} p_{i} \left( \log p_{i} - \log q_{i} \right), $$
where $\log$ is the natural logarithm (cf. also Logarithm of a number).
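As a minimal illustrative sketch (not part of the original article), the discrete formula can be evaluated directly; the function name kl_information is hypothetical, and the usual convention $0 \log 0 = 0$ is assumed for zero entries of $p$.

```python
import math

def kl_information(p, q):
    """Kullback-Leibler information I(p;q) of p with respect to q,
    using natural logarithms and the convention 0*log 0 = 0."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi <= 0.0:
                # I(p;q) is infinite if q vanishes where p does not
                return math.inf
            total += pi * (math.log(pi) - math.log(qi))
    return total

# Example: I(p;q) >= 0, and in general I(p;q) != I(q;p)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_information(p, q), kl_information(q, p))
```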
More generally, one has:
$$ I(P;Q) = \int_{\Omega} \log \frac{p(\omega)}{q(\omega)} \, P(d\omega) $$
for probability distributions $P(d\omega)$ and $Q(d\omega)$ with densities $p(\omega)$ and $q(\omega)$ (cf. Density of a probability distribution).
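As a worked example (added here for illustration, not taken from the original article), for two univariate normal distributions $P = N(\mu_{1}, \sigma_{1}^{2})$ and $Q = N(\mu_{2}, \sigma_{2}^{2})$ the integral has the closed form

$$ I(P;Q) = \log \frac{\sigma_{2}}{\sigma_{1}} + \frac{\sigma_{1}^{2} + (\mu_{1} - \mu_{2})^{2}}{2 \sigma_{2}^{2}} - \frac{1}{2}, $$

which vanishes precisely when $\mu_{1} = \mu_{2}$ and $\sigma_{1} = \sigma_{2}$.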
The negative of $I(P;Q)$ is the conditional entropy (or relative entropy) of $P(d\omega)$ with respect to $Q(d\omega)$; see Entropy.
Various notions of (asymmetric and symmetric) information distances are based on the Kullback–Leibler information.
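For instance, one commonly used symmetric variant (often attributed to Kullback and Leibler and sometimes called the $J$-divergence; stated here only as an illustration) is obtained by symmetrizing $I$:

$$ J(P,Q) = I(P;Q) + I(Q;P). $$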
The quantity $I(p;q)$ is also called the informational divergence (see Huffman code).
See also Information distance; Kullback–Leibler-type distance measures.
References

[a1] S. Kullback, "Information theory and statistics", Wiley (1959)

[a2] S. Kullback, R.A. Leibler, "On information and sufficiency", Ann. Math. Stat., 22 (1951) pp. 79–86

[a3] Y. Sakamoto, M. Ishiguro, G. Kitagawa, "Akaike information criterion statistics", Reidel (1986)