# Kullback-Leibler information

Kullback–Leibler quantity of information, Kullback–Leibler information quantity, directed divergence

For discrete distributions (cf. Discrete distribution) given by probability vectors $p = ( p_1, \dots, p_n )$, $q = ( q_1, \dots, q_n )$, the Kullback–Leibler (quantity of) information of $p$ with respect to $q$ is:

$$I ( p;q ) = \sum_{i=1}^{n} p_i ( \log p_i - \log q_i ) ,$$

where ${ \mathop{\rm log} }$ is the natural logarithm (cf. also Logarithm of a number).
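The discrete sum can be evaluated directly. Below is a minimal sketch (the function name `kl_information` is my own choice, not part of the source), using natural logarithms and the usual convention that terms with $p_i = 0$ contribute $0$:

```python
import math

def kl_information(p, q):
    """Kullback-Leibler information I(p;q) of p with respect to q.

    Uses natural logarithms; terms with p_i = 0 contribute 0 by convention.
    Assumes q_i > 0 wherever p_i > 0 (otherwise the sum diverges).
    """
    return sum(p_i * (math.log(p_i) - math.log(q_i))
               for p_i, q_i in zip(p, q) if p_i > 0)

# Example: information of the uniform distribution with respect to a skewed one.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_information(p, q))  # positive, since p != q
print(kl_information(p, p))  # 0.0, since the distributions coincide
```

Note that $I(p;q) \geq 0$, with equality exactly when $p = q$, and that $I(p;q) \neq I(q;p)$ in general (the quantity is asymmetric).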

More generally, one has:

$$I ( P;Q ) = \int\limits_\Omega \log \frac{p ( \omega ) }{q ( \omega ) } \, P ( d \omega )$$

for probability distributions $P ( d \omega )$ and $Q ( d \omega )$ with densities $p ( \omega )$ and $q ( \omega )$ (cf. Density of a probability distribution).
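As an illustration of the general integral form (this worked example is not part of the original entry, but is a standard closed-form evaluation): for two univariate normal distributions $P = N ( \mu_1, \sigma_1^2 )$ and $Q = N ( \mu_2, \sigma_2^2 )$ the integral evaluates to

$$I ( P;Q ) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + ( \mu_1 - \mu_2 )^2}{2 \sigma_2^2} - \frac{1}{2} ,$$

which is $0$ precisely when $\mu_1 = \mu_2$ and $\sigma_1 = \sigma_2$, i.e. when the two distributions coincide.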

The negative of $I ( P;Q )$ is the conditional entropy (or relative entropy) of $P ( d \omega )$ with respect to $Q ( d \omega )$; see Entropy.

Various notions of (asymmetric and symmetric) information distances are based on the Kullback–Leibler information.
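One common symmetric construction is the Jeffreys divergence $J ( p,q ) = I ( p;q ) + I ( q;p )$, which symmetrizes the Kullback–Leibler information by summing the two directed divergences. A small sketch (function names are my own):

```python
import math

def kl(p, q):
    # Discrete Kullback-Leibler information I(p;q), natural logarithm;
    # terms with p_i = 0 contribute 0 by convention.
    return sum(p_i * math.log(p_i / q_i)
               for p_i, q_i in zip(p, q) if p_i > 0)

def jeffreys(p, q):
    # Jeffreys divergence: the symmetrized sum of the two directed divergences.
    return kl(p, q) + kl(q, p)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(jeffreys(p, q))  # equals jeffreys(q, p), unlike kl(p, q) vs kl(q, p)
```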

The quantity $I ( p;q )$ is also called the informational divergence (see Huffman code).

How to Cite This Entry:
Kullback-Leibler information. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Kullback-Leibler_information&oldid=47533
This article was adapted from an original article by M. Hazewinkel (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.