Asymmetry coefficient
The most frequently employed measure of the [[Asymmetry of a distribution|asymmetry of a distribution]], defined by the relationship

$$
\gamma _ {1} = \frac{\mu _ {3} }{\mu _ {2} ^ {3/2} } ,
$$

where $ \mu _ {2} $ and $ \mu _ {3} $ are the second and third central moments of the distribution, respectively. For distributions that are symmetric with respect to the mathematical expectation, $ \gamma _ {1} = 0 $; depending on the sign of $ \gamma _ {1} $ one speaks of positive asymmetry ( $ \gamma _ {1} > 0 $) and negative asymmetry ( $ \gamma _ {1} < 0 $).
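For a data sample the coefficient can be estimated by replacing $ \mu _ {2} $ and $ \mu _ {3} $ with the corresponding empirical central moments. The following short Python sketch (not part of the original article; the function name and the sample data are chosen purely for illustration) shows this plug-in estimate:

<pre>
import numpy as np

def asymmetry_coefficient(sample):
    """Plug-in estimate mu_3 / mu_2^{3/2} from empirical central moments."""
    x = np.asarray(sample, dtype=float)
    d = x - x.mean()
    mu2 = np.mean(d ** 2)   # second central moment
    mu3 = np.mean(d ** 3)   # third central moment
    return mu3 / mu2 ** 1.5

# a sample with a long right tail gives a positive value (about 1.36)
print(asymmetry_coefficient([1, 2, 2, 3, 10]))
</pre>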
In the case of the [[Binomial distribution|binomial distribution]] corresponding to $ n $ [[Bernoulli trials|Bernoulli trials]] with probability of success $ p $,

$$ \tag{* }
\gamma _ {1} = \frac{1 - 2 p }{\sqrt {n p ( 1 - p ) } } ,
$$

one has: if $ p = 1/2 $ (so that $ \gamma _ {1} = 0 $), the distribution is symmetric; if $ p < 1/2 $ or $ p > 1/2 $, one obtains the typical distribution diagrams with positive (Fig. a) and negative (Fig. b) asymmetry, respectively.
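For instance (a worked example, not part of the original article), substituting the values $ n = 10 $, $ p = 1/5 $ used in the figures below into (*) gives

$$
\gamma _ {1} = \frac{1 - \frac{2}{5} }{\sqrt {10 \cdot \frac{1}{5} \cdot \frac{4}{5} } } = \frac{0.6}{\sqrt {1.6 } } \approx 0.47 ,
$$

and by the symmetry of (*) under $ p \mapsto 1 - p $ one gets $ \gamma _ {1} \approx - 0.47 $ for $ p = 4/5 $.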
  
 
<img style="border:1px solid;" src="https://www.encyclopediaofmath.org/legacyimages/common_img/a013590a.gif" />
 
<img style="border:1px solid;" src="https://www.encyclopediaofmath.org/legacyimages/common_img/a013590a.gif" />
Line 13: Line 43:
 
Figure: a013590a
 
Figure: a013590a
  
<img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359014.png" />. Diagram of the binomial distribution <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359015.png" /> corresponding to <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359016.png" /> Bernoulli trials, with positive asymmetry (<img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359017.png" />).
+
$  P(k, 10, 1/5 ) $.  
 +
Diagram of the binomial distribution $  P(k, n, p) $
 +
corresponding to $  n = 10 $
 +
Bernoulli trials, with positive asymmetry ( $  p = 1/5 $).
  
 
<img style="border:1px solid;" src="https://www.encyclopediaofmath.org/legacyimages/common_img/a013590b.gif" />
 
<img style="border:1px solid;" src="https://www.encyclopediaofmath.org/legacyimages/common_img/a013590b.gif" />
Line 19: Line 52:
 
Figure: a013590b
 
Figure: a013590b
  
<img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359018.png" />. Diagram of the binomial distribution <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359019.png" /> corresponding to <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359020.png" /> Bernoulli trials, with negative asymmetry (<img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/a/a013/a013590/a01359021.png" />).
+
$  P(k, 10, 4/5 ) $.  
 +
Diagram of the binomial distribution $  P(k, n, p) $
 +
corresponding to $  n = 10 $
 +
Bernoulli trials, with negative asymmetry ( $  p = 4/5 $).
The asymmetry coefficient (*) tends to zero as $ n \rightarrow \infty $, in accordance with the fact that a normalized binomial distribution converges to the standard normal distribution.
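A quick numerical check of this $ 1/\sqrt n $ decay, using formula (*) with $ p = 1/5 $ (a minimal sketch, not part of the original article):

<pre>
from math import sqrt

p = 1 / 5
for n in (10, 100, 1000, 10000):
    gamma1 = (1 - 2 * p) / sqrt(n * p * (1 - p))   # formula (*)
    print(n, round(gamma1, 4))
# gamma1 takes the values 0.4743, 0.15, 0.0474, 0.015 -- shrinking like 1/sqrt(n)
</pre>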
  
The asymmetry coefficient and the [[Excess coefficient|excess coefficient]] are the most extensively used characteristics of the accuracy with which the distribution function $ F _ {n} (x) $ of the normalized sum

$$
\frac{( X _ {1} + \dots + X _ {n} ) - n \mu _ {1} }{\sqrt {n \mu _ {2} } } ,
$$

where $ X _ {1} \dots X _ {n} $ are identically distributed and mutually independent with asymmetry coefficient $ \gamma _ {1} $, may be approximated by the normal distribution function

$$
\Phi (x) = \frac{1}{\sqrt {2 \pi } } \int\limits _ {- \infty } ^ { x } e ^ {- z ^ {2} /2 } dz .
$$
  
 
Under fairly general conditions the [[Edgeworth series|Edgeworth series]] yields

$$
F _ {n} (x) = \Phi (x) - \frac{1}{\sqrt n } \frac{\gamma _ {1} }{6} \Phi ^ {(3)} (x) + O \left ( \frac{1}{n} \right ) ,
$$

where $ \Phi ^ {(3)} (x) $ is the third derivative of $ \Phi (x) $.
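As a numerical illustration (a minimal sketch, not part of the original article; it assumes NumPy and SciPy are available), one can compare the exact distribution function of a normalized sum of independent exponential variables, whose common asymmetry coefficient is $ \gamma _ {1} = 2 $ and whose sum has an exact Gamma distribution, with the plain normal approximation and with the one-term Edgeworth correction:

<pre>
import numpy as np
from scipy.stats import gamma, norm

n = 10                 # number of summands
gamma1 = 2.0           # asymmetry coefficient of a single exponential variable
mu1, mu2 = 1.0, 1.0    # mean and variance of a single exponential variable

x = 0.5                # point at which the distribution functions are compared
# exact F_n(x): the sum of n standard exponentials has a Gamma(n, 1) law
F_exact = gamma.cdf(n * mu1 + x * np.sqrt(n * mu2), a=n)

F_normal = norm.cdf(x)
phi3 = (x ** 2 - 1) * norm.pdf(x)          # Phi^{(3)}(x) = (x^2 - 1) phi(x)
F_edgeworth = F_normal - gamma1 / (6 * np.sqrt(n)) * phi3

# prints approximately 0.7191, 0.6915 and 0.7193: the Edgeworth term removes
# most of the error of the plain normal approximation
print(F_exact, F_normal, F_edgeworth)
</pre>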
  
 
====References====
<table><TR><TD valign="top">[1]</TD> <TD valign="top">H. Cramér, "Mathematical methods of statistics", Princeton Univ. Press (1946)</TD></TR><TR><TD valign="top">[2]</TD> <TD valign="top">S.S. Wilks, "Mathematical statistics", Wiley (1962)</TD></TR></table>

====Comments====
The asymmetry coefficient is usually called the coefficient of skewness. One correspondingly speaks of the skewness of a distribution and of positive, respectively negative, skewness.

The excess coefficient is more often called the coefficient of kurtosis.
