Uniform distribution
{{TEX|done}}
{{MSC|60E99}}
 
A common name for a class of probability distributions, arising as an extension of the idea of "equally possible outcomes" to the continuous case. As with the normal distribution, the uniform distribution appears in probability theory as an exact distribution in some problems and as a limit in others.
==The uniform distribution on an interval of the line (the rectangular distribution).==
 
The uniform distribution on an interval $[a,\ b]$, $a < b$, is the [[Probability distribution|probability distribution]] with density
$$
p(x) = \left\{
\begin{array}{ll}
\frac{1}{b - a}, & x \in [a,\ b], \\
0, & x \notin [a,\ b].
\end{array}
\right.
$$
The concept of a uniform distribution on $[a,\ b]$ corresponds to the representation of a random choice of a point from the interval. The mathematical expectation and the variance of the uniform distribution are equal, respectively, to $(b + a)/2$ and $(b - a)^{2}/12$. The distribution function is
$$
F(x) = \left\{
\begin{array}{ll}
0, & x \leq a, \\
\frac{x - a}{b - a}, & a < x \leq b, \\
1, & x > b,
\end{array}
\right.
$$
and the characteristic function is
$$
\phi(t) = \frac{1}{it(b - a)} (e^{itb} - e^{ita}).
$$
A random variable with uniform distribution on $[0,\ 1]$ can be constructed from a sequence of independent random variables $X_{1}, X_{2}, \dots$ taking the values 0 and 1, each with probability $1/2$, by putting
$$
X = \sum_{n = 1}^{\infty} X_{n} 2^{-n}
$$
(the $X_{n}$ are the digits in the binary expansion of $X$). The random number $X$ has a uniform distribution on the interval $[0,\ 1]$. This fact has important statistical applications; see, for example, [[Random and pseudo-random numbers|Random and pseudo-random numbers]].
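This binary-digit construction is easy to simulate. Below is a minimal Python sketch (an illustration, not part of the original article) that truncates the series at 53 digits, so the tail contributes less than $2^{-53}$:

```python
import random

def uniform_from_bits(rng, n_bits=53):
    """Build X = sum_{n>=1} X_n * 2^(-n) from independent fair bits X_n,
    truncated after n_bits terms."""
    x = 0.0
    for n in range(1, n_bits + 1):
        bit = rng.randint(0, 1)        # X_n is 0 or 1, each with probability 1/2
        x += bit * 2.0 ** (-n)
    return x

rng = random.Random(0)
samples = [uniform_from_bits(rng) for _ in range(2000)]
mean = sum(samples) / len(samples)     # should be close to 1/2
```

The truncated sum always lies in $[0, 1)$, and the sample mean approaches the expectation $1/2$ of the uniform distribution.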
  
If two independent random variables $X_{1}$ and $X_{2}$ have uniform distributions on $[0,\ 1]$, then their sum has the so-called triangular distribution on $[0,\ 2]$ with density $u_{2}(x) = 1 - |1 - x|$ for $x \in [0,\ 2]$ and $u_{2}(x) = 0$ for $x \notin [0,\ 2]$. The sum of three independent random variables with uniform distributions on $[0,\ 1]$ has on $[0,\ 3]$ the distribution with density
$$
u_{3}(x) = \left\{
\begin{array}{ll}
\frac{x^{2}}{2}, & 0 \leq x < 1, \\
\frac{x^{2} - 3(x - 1)^{2}}{2}, & 1 \leq x < 2, \\
\frac{x^{2} - 3(x - 1)^{2} + 3(x - 2)^{2}}{2}, & 2 \leq x < 3, \\
0, & x \notin [0,\ 3].
\end{array}
\right.
$$
In general, the distribution of the sum $X_{1} + \dots + X_{n}$ of independent variables with uniform distributions on $[0,\ 1]$ has density
$$
u_{n}(x) = \frac{1}{(n - 1)!} \sum_{k = 0}^{n} (-1)^{k} \binom{n}{k} (x - k)_{+}^{n - 1}
$$
for $0 \leq x \leq n$ and $u_{n}(x) = 0$ for $x \notin [0,\ n]$; here
$$
z_{+} = \left\{
\begin{array}{ll}
z, & z > 0, \\
0, & z \leq 0.
\end{array}
\right.
$$
As $n \rightarrow \infty$, the distribution of the sum $X_{1} + \dots + X_{n}$, centred at the mathematical expectation $n/2$ and scaled by the standard deviation $\sqrt{n/12}$, tends to the normal distribution with parameters 0 and 1 (the approximation for $n = 3$ is already satisfactory for many practical purposes).
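The density $u_{n}$ above can be coded directly; this Python sketch (an illustration added here, with $z_{+} = \max(z, 0)$) checks it against the closed forms quoted for $n = 2$ and $n = 3$:

```python
import math

def u_n(x, n):
    """Density of the sum of n independent Uniform[0, 1] variables,
    computed from the alternating-sum formula with z_+ = max(z, 0)."""
    if not 0 <= x <= n:
        return 0.0
    s = sum((-1) ** k * math.comb(n, k) * max(x - k, 0.0) ** (n - 1)
            for k in range(n + 1))
    return s / math.factorial(n - 1)

# n = 2 reproduces the triangular density 1 - |1 - x| on [0, 2].
triangular = all(abs(u_n(x, 2) - (1 - abs(1 - x))) < 1e-12
                 for x in (0.1, 0.5, 1.0, 1.7))

# n = 3 on [0, 1) reproduces x^2 / 2.
value = u_n(0.8, 3)   # expected: 0.8**2 / 2 = 0.32
```

The same function evaluated at many $n$ illustrates the convergence to the normal density after centring and scaling.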
  
In statistical applications the procedure for constructing a random variable $X$ with a given distribution function $F$ is based on the following fact. Let the random variable $Y$ be uniformly distributed on $[0,\ 1]$ and let the distribution function $F$ be continuous and strictly increasing. Then the random variable $X = F^{-1}(Y)$ has distribution function $F$ (in the general case the inverse function $F^{-1}(y)$ in the definition of $X$ must be replaced by an analogue, namely $F^{-1}(y) = \mathop{\rm inf}\nolimits \{ x : F(x) \leq y \leq F(x + 0) \}$).
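This inverse-transform construction can be sketched in a few lines of Python (the choice of the exponential distribution, $F(x) = 1 - e^{-x}$ with $F^{-1}(y) = -\ln(1 - y)$, is an illustrative assumption, not from the article):

```python
import math
import random

def inverse_transform_samples(inv_F, rng, n):
    """If Y ~ Uniform[0, 1] and F is continuous and strictly increasing,
    then X = F^{-1}(Y) has distribution function F."""
    return [inv_F(rng.random()) for _ in range(n)]

# Exponential example: F(x) = 1 - exp(-x) for x >= 0, so F^{-1}(y) = -log(1 - y).
rng = random.Random(42)
xs = inverse_transform_samples(lambda y: -math.log(1.0 - y), rng, 5000)
mean = sum(xs) / len(xs)   # expectation of Exp(1) is 1
```

Any target distribution with a computable (generalized) inverse distribution function can be sampled this way from a single uniform source.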
==The uniform distribution on an interval as a limit distribution.==
Some typical examples of the uniform distribution on $[0,\ 1]$ arising as a limit are given below.

1) Let $X_{1}, X_{2}, \dots$ be independent random variables having the same continuous distribution function. Then the distribution of their sums $S_{n}$ taken $\mathop{\rm mod}\nolimits\ 1$, that is, the distribution of the fractional parts $\{S_{n}\}$ of these sums, converges to the uniform distribution on $[0,\ 1]$.
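A quick Python simulation of example 1 (the exponential summands are an illustrative assumption; any continuous distribution works):

```python
import random

# X_i i.i.d. with a continuous distribution (Exp(1) here); the fractional
# part {S_n} of S_n = X_1 + ... + X_n is approximately Uniform[0, 1].
rng = random.Random(1)
n, trials = 20, 4000
fracs = []
for _ in range(trials):
    s = sum(rng.expovariate(1.0) for _ in range(n))
    fracs.append(s - int(s))   # fractional part {S_n}

below_half = sum(f < 0.5 for f in fracs) / trials   # near 1/2 for a uniform limit
```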
2) Let the random parameters $\alpha$ and $\beta$ have an absolutely-continuous joint distribution; then, as $t \rightarrow \infty$, the distribution of $\{\alpha t + \beta\}$ converges to the uniform distribution on $[0,\ 1]$.
  
3) A uniform distribution appears as the limit distribution of the fractional parts of certain functions $g$ on the positive integers. For example, for irrational $\alpha$ the fraction of those $m$, $1 \leq m \leq n$, for which
$$
0 \leq a \leq \{m \alpha\} \leq b \leq 1,
$$
has the limit $b - a$ as $n \rightarrow \infty$.
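Example 3 (Weyl's equidistribution theorem) is easy to check numerically; the choices $\alpha = \sqrt{2}$ and $[a, b] = [0.2, 0.7]$ below are illustrative:

```python
import math

# Fraction of m in 1..n with a <= {m * alpha} <= b, for irrational alpha.
alpha = math.sqrt(2.0)
a, b, n = 0.2, 0.7, 100000
count = sum(1 for m in range(1, n + 1) if a <= (m * alpha) % 1.0 <= b)
fraction = count / n   # approaches b - a = 0.5
```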
==The uniform distribution on subsets of $\mathbf R^{k}$.==
An example of a uniform distribution in a rectangle appears already in the [[Buffon problem|Buffon problem]] (see also [[Geometric probabilities|Geometric probabilities]]; [[Stochastic geometry|Stochastic geometry]]). The uniform distribution on a bounded set $D$ in $\mathbf R^{k}$ is defined as the distribution with density
$$
p(x_{1}, \dots, x_{k}) = \left\{
\begin{array}{ll}
C \neq 0, & x \in D, \\
0, & x \notin D,
\end{array}
\right.
$$
where $C$ is the inverse of the $k$-dimensional volume (or Lebesgue measure) of $D$.
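One standard way to realize this distribution in practice is rejection sampling: draw uniformly from a bounding box and keep the points that fall in $D$. A Python sketch for $D$ the unit disc in $\mathbf R^{2}$ (the disc is an illustrative choice):

```python
import random

def uniform_on_disc(rng, n):
    """Uniform distribution on D = unit disc in R^2 by rejection:
    sample uniformly from the bounding square [-1, 1]^2 and keep the
    points in D; accepted points have constant density C = 1/area(D)."""
    pts = []
    while len(pts) < n:
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            pts.append((x, y))
    return pts

rng = random.Random(7)
pts = uniform_on_disc(rng, 2000)
in_upper_half = sum(1 for _, y in pts if y > 0) / len(pts)  # near 1/2 by symmetry
```

The acceptance rate equals the ratio of the volume of $D$ to that of the bounding box ($\pi/4$ here), so the method is practical whenever that ratio is not too small.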
  
Uniform distributions on surfaces have also been discussed. Thus, a "random direction" (for example, in $\mathbf R^{3}$), defined as a vector from the origin to a random point on the surface of the unit sphere, is uniformly distributed in the sense that the probability that it hits a part of the surface is proportional to the area of that part.

The role of the uniform distribution in algebraic groups is played by the normalized [[Haar measure|Haar measure]].
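A "random direction" of the kind described above is commonly generated by normalizing a standard Gaussian vector, whose distribution is rotation invariant; a Python sketch (added here as an illustration):

```python
import math
import random

def random_direction(rng):
    """A uniformly distributed point on the unit sphere in R^3, obtained
    by normalizing a vector of independent standard Gaussians (the
    Gaussian law is rotation invariant, so the direction is uniform)."""
    while True:
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(c * c for c in v))
        if norm > 1e-12:   # guard against a (vanishingly unlikely) zero vector
            return [c / norm for c in v]

rng = random.Random(3)
dirs = [random_direction(rng) for _ in range(1000)]
mean_z = sum(d[2] for d in dirs) / len(dirs)   # near 0 by symmetry
```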
  
 
====References====
{|
|valign="top"|{{Ref|F}}|| W. Feller, [[Feller, "An introduction to probability theory and its  applications"|"An introduction to probability theory and its  applications"]], '''2''', Wiley (1971)
|}

Latest revision as of 12:00, 22 December 2019


How to Cite This Entry:
Uniform distribution. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Uniform_distribution&oldid=25943
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article