Density of a probability distribution
probability density
The derivative of the distribution function corresponding to an absolutely-continuous probability measure.
Let $X$ be a random vector taking values in an $n$-dimensional Euclidean space $\mathbf R^n$ ($n \geq 1$), let $F$ be its distribution function, and let there exist a non-negative function $f$ such that
$$ F(x_1, \dots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f(u_1, \dots, u_n) \, du_1 \cdots du_n $$
for any real $x_1, \dots, x_n$. Then $f$ is called the probability density of $X$, and for any Borel set $A \subset \mathbf R^n$,
$$ \mathsf P \{ X \in A \} = \int \cdots \int_A f(u_1, \dots, u_n) \, du_1 \cdots du_n . $$
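For example, for $n = 1$ the standard normal density $f(x) = e^{-x^2/2} / \sqrt{2\pi}$ yields the distribution function $F(x) = \int_{-\infty}^x f(u) \, du$, and for an interval $A = [a, b]$ the formula above reduces to $\mathsf P \{ a \leq X \leq b \} = F(b) - F(a)$.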
Any non-negative integrable function $f$ satisfying the condition
$$ \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty f(x_1, \dots, x_n) \, dx_1 \cdots dx_n = 1 $$
is the probability density of some random vector.
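For instance, the function equal to $1$ on the interval $[0, 1]$ and to $0$ elsewhere is non-negative and integrates to $1$, and is therefore the probability density of a random variable uniformly distributed on $[0, 1]$.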
If two random vectors $X$ and $Y$ taking values in $\mathbf R^n$ are independent and have probability densities $f$ and $g$ respectively, then the random vector $X + Y$ has the probability density $h$ that is the convolution of $f$ and $g$:
$$ h(x_1, \dots, x_n) = \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty f(x_1 - u_1, \dots, x_n - u_n) \, g(u_1, \dots, u_n) \, du_1 \cdots du_n = \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty f(u_1, \dots, u_n) \, g(x_1 - u_1, \dots, x_n - u_n) \, du_1 \cdots du_n . $$
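For example, if $X$ and $Y$ are independent standard normal random variables ($n = 1$), the convolution can be evaluated by completing the square in the exponent:
$$ h(x) = \int_{-\infty}^\infty \frac{1}{2\pi} e^{-(x - u)^2/2} e^{-u^2/2} \, du = \frac{1}{2\sqrt{\pi}} e^{-x^2/4} , $$
which is the density of the normal distribution with mean $0$ and variance $2$.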
Let $X = (X_1, \dots, X_n)$ and $Y = (Y_1, \dots, Y_m)$ be random vectors taking values in $\mathbf R^n$ and $\mathbf R^m$ ($n, m \geq 1$) and having probability densities $f$ and $g$ respectively, and let $Z = (X_1, \dots, X_n, Y_1, \dots, Y_m)$ be a random vector in $\mathbf R^{n+m}$. If $X$ and $Y$ are independent, then $Z$ has the probability density $h$, which is called the joint probability density of the random vectors $X$ and $Y$, where
$$ \tag{1} h(t_1, \dots, t_{n+m}) = f(t_1, \dots, t_n) \, g(t_{n+1}, \dots, t_{n+m}) . $$
Conversely, if $Z$ has a probability density that satisfies (1), then $X$ and $Y$ are independent.
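For example, if $X$ and $Y$ are independent standard normal random variables, their joint probability density factors accordingly:
$$ h(t_1, t_2) = \frac{1}{\sqrt{2\pi}} e^{-t_1^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-t_2^2/2} = \frac{1}{2\pi} e^{-(t_1^2 + t_2^2)/2} . $$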
The characteristic function $\phi$ of a random vector $X$ having a probability density $f$ is expressed by
$$ \phi(t_1, \dots, t_n) = \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty e^{i(t_1 x_1 + \dots + t_n x_n)} f(x_1, \dots, x_n) \, dx_1 \cdots dx_n . $$
If $\phi$ is absolutely integrable, then $f$ is a bounded continuous function, and
$$ f(x_1, \dots, x_n) = \frac{1}{(2\pi)^n} \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty e^{-i(t_1 x_1 + \dots + t_n x_n)} \phi(t_1, \dots, t_n) \, dt_1 \cdots dt_n . $$
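For example, the Cauchy density $f(x) = 1/[\pi(1 + x^2)]$ has characteristic function $\phi(t) = e^{-|t|}$, which is absolutely integrable, and the inversion formula recovers the density:
$$ \frac{1}{2\pi} \int_{-\infty}^\infty e^{-itx} e^{-|t|} \, dt = \frac{1}{\pi(1 + x^2)} , $$
which is indeed bounded and continuous.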
The probability density $f$ and the corresponding characteristic function $\phi$ are also related by Plancherel's identity: the function $f^2$ is integrable if and only if the function $|\phi|^2$ is integrable, and in that case
$$ \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty f^2(x_1, \dots, x_n) \, dx_1 \cdots dx_n = \frac{1}{(2\pi)^n} \int_{-\infty}^\infty \cdots \int_{-\infty}^\infty | \phi(t_1, \dots, t_n) |^2 \, dt_1 \cdots dt_n . $$
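For the standard normal density both sides can be computed directly: $f^2(x) = e^{-x^2}/(2\pi)$, so the left-hand side equals $\sqrt{\pi}/(2\pi) = 1/(2\sqrt{\pi})$, while $|\phi(t)|^2 = e^{-t^2}$ gives the same value $\frac{1}{2\pi} \int_{-\infty}^\infty e^{-t^2} \, dt = 1/(2\sqrt{\pi})$ on the right-hand side.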
Let $(\Omega, \mathfrak A)$ be a measurable space, and let $\nu$ and $\mu$ be $\sigma$-finite measures on $(\Omega, \mathfrak A)$ with $\nu$ absolutely continuous with respect to $\mu$, i.e. $\mu(A) = 0$ implies $\nu(A) = 0$ for $A \in \mathfrak A$. In that case there exists on $(\Omega, \mathfrak A)$ a non-negative measurable function $f$ such that
$$ \nu(A) = \int_A f \, d\mu $$
for any $A \in \mathfrak A$. The function $f$ is called the Radon–Nikodým derivative of $\nu$ with respect to $\mu$; if $\nu$ is a probability measure, it is also the probability density of $\nu$ relative to $\mu$.
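For example, if $\mu$ is Lebesgue measure on $\mathbf R^n$, the density of $\nu$ relative to $\mu$ is the ordinary probability density defined above, while if $\mu$ is counting measure on the integers, the density of a probability measure $\nu$ relative to $\mu$ is its probability mass function $f(k) = \nu(\{k\})$.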
A concept closely related to the probability density is that of a dominated family of distributions. A family of probability distributions $\mathfrak P$ on a measurable space $(\Omega, \mathfrak A)$ is called dominated if there exists a $\sigma$-finite measure $\mu$ on $(\Omega, \mathfrak A)$ such that each probability measure from $\mathfrak P$ has a probability density relative to $\mu$ (or, equivalently, if each measure from $\mathfrak P$ is absolutely continuous with respect to $\mu$). The assumption of dominance is important in certain theorems in mathematical statistics, such as the factorization theorem for sufficient statistics.
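For example, the family of normal distributions $\{N(\theta, 1) : \theta \in \mathbf R\}$ is dominated by Lebesgue measure on $\mathbf R$, since every member has a density with respect to it. In contrast, the family of all one-point distributions $\{\delta_\theta : \theta \in \mathbf R\}$ is not dominated: a $\sigma$-finite measure has at most countably many atoms, so no single $\mu$ can dominate every $\delta_\theta$.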