Difference between revisions of "User:Boris Tsirelson/sandbox"

From Encyclopedia of Mathematics
=Strong Mixing Conditions=
 
  
:Richard C. Bradley
 
:Department of Mathematics, Indiana University, Bloomington, Indiana, USA
 
  
There has been much research on stochastic models that have a well defined, specific structure — for example, Markov chains, Gaussian processes, or linear models, including ARMA (autoregressive – moving average) models. However, it became clear in the middle of the last century that there was a need for a theory of statistical inference (e.g. central limit theory) that could be used in the analysis of time series that did not seem to "fit" any such specific structure but which did seem to have some "asymptotic independence" properties. That motivated the development of a broad theory of "strong mixing conditions" to handle such situations. This note is a brief description of that theory.

The field of strong mixing conditions is a vast area, and a short note such as this cannot even begin to do justice to it. Journal articles (with one exception) will not be cited; and many researchers who made important contributions to this field will not be mentioned here. All that can be done here is to give a narrow snapshot of part of the field.
  
'''The strong mixing ($\alpha$-mixing) condition.''' Suppose $X := (X_k, k \in {\bf Z})$ is a sequence of random variables on a given probability space $(\Omega, {\cal F}, P)$. For $-\infty \leq j \leq \ell \leq \infty$, let ${\cal F}_j^\ell$ denote the $\sigma$-field of events generated by the random variables $X_k,\ j \le k \leq \ell\ (k \in {\bf Z})$. For any two $\sigma$-fields ${\cal A}$ and ${\cal B} \subset {\cal F}$, define the "measure of dependence"
\begin{equation} \alpha({\cal A}, {\cal B}) := \sup_{A \in {\cal A}, B \in {\cal B}} |P(A \cap B) - P(A)P(B)|. \end{equation}
For the given random sequence $X$, for any positive integer $n$, define the dependence coefficient
\begin{equation} \alpha(n) = \alpha(X,n) := \sup_{j \in {\bf Z}} \alpha({\cal F}_{-\infty}^j, {\cal F}_{j + n}^{\infty}). \end{equation}
By a trivial argument, the sequence of numbers $(\alpha(n), n \in {\bf N})$ is nonincreasing. The random sequence $X$ is said to be "strongly mixing", or "$\alpha$-mixing", if $\alpha(n) \to 0$ as $n \to \infty$. This condition was introduced in 1956 by Rosenblatt [Ro1], and was used in that paper in the proof of a central limit theorem. (The phrase "central limit theorem" will henceforth be abbreviated CLT.)
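As an illustration of how $\alpha(n)$ behaves for a concrete process (this example is not part of the article): for a stationary two-state Markov chain, restricting the supremum in the definition to the finitely many events generated by $X_0$ and $X_n$ alone gives a computable lower bound on $\alpha(n)$, obtainable from matrix powers. A minimal Python sketch, with a hypothetical transition matrix:

```python
import numpy as np

# Hypothetical two-state stationary Markov chain (not from the article).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])             # transition matrix
pi = np.array([2.0 / 3.0, 1.0 / 3.0])  # stationary distribution: pi @ P == pi

def alpha_lower_bound(n: int) -> float:
    """max_{a,b} |P(X_0 = a, X_n = b) - P(X_0 = a) P(X_n = b)|,
    a lower bound on alpha(n) obtained from single-coordinate events."""
    Pn = np.linalg.matrix_power(P, n)
    joint = pi[:, None] * Pn       # joint[a, b] = P(X_0 = a, X_n = b)
    product = np.outer(pi, pi)     # P(X_0 = a) * P(X_n = b)
    return float(np.max(np.abs(joint - product)))

# The bound decays geometrically (the second eigenvalue of P is 0.7),
# consistent with alpha(n) -> 0, i.e. strong mixing.
for n in (1, 5, 20):
    print(n, alpha_lower_bound(n))
```

For chains like this one, the true $\alpha(n)$ also decays geometrically; the sketch only exhibits the lower bound, since the full supremum runs over the infinite past and future $\sigma$-fields.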
 
  
In the case where the given sequence $X$ is strictly stationary (i.e. its distribution is invariant under a shift of the indices), eq. (2) also has the simpler form
\begin{equation} \alpha(n) = \alpha(X,n) := \alpha({\cal F}_{-\infty}^0, {\cal F}_n^{\infty}). \end{equation}
For simplicity, ''in the rest of this note, we shall restrict to strictly stationary sequences.'' (Some comments below will have obvious adaptations to nonstationary processes.)
 
  
In particular, for strictly stationary sequences, the strong mixing ($\alpha$-mixing) condition implies Kolmogorov regularity (a trivial "past tail" $\sigma$-field), which in turn implies "mixing" (in the ergodic-theoretic sense), which in turn implies ergodicity. (None of the converse implications holds.) For further related information, see e.g. [Br, v1, Chapter 2].
 
  
'''Comments on limit theory under $\alpha$-mixing.''' Under $\alpha$-mixing and other similar conditions (including ones reviewed below), there has been a vast development of limit theory — for example, CLTs, weak invariance principles, laws of the iterated logarithm, almost sure invariance principles, and rates of convergence in the strong law of large numbers. For example, the CLT in [Ro1] evolved through subsequent refinements by several researchers into the following "canonical" form. (For its history and a generously detailed presentation of its proof, see e.g. [Br, v1, Theorems 1.19 and 10.2].)
 
  
'''Theorem 1.''' ''Suppose'' $(X_k, k \in {\bf Z})$ ''is a strictly stationary sequence of random variables such that'' $EX_0 = 0$, $EX_0^2 < \infty$, $\sigma_n^2 := ES_n^2 \to \infty$ as $n \to \infty$, ''and'' $\alpha(n) \to 0$ ''as'' $n \to \infty$. ''Then the following two conditions (A) and (B) are equivalent:''

(A) ''The family of random variables'' $(S_n^2/\sigma_n^2, n \in {\bf N})$ ''is uniformly integrable.''

(B) $S_n/\sigma_n \Rightarrow N(0,1)$ ''as'' $n \to \infty$.

''If (the hypothesis and) these two equivalent conditions'' (A) ''and'' (B) ''hold, then'' $\sigma_n^2 = n \cdot h(n)$ ''for some function'' $h(t),\ t \in (0, \infty)$ ''which is slowly varying as'' $t \to \infty$.

Here $S_n := X_1 + X_2 + \dots + X_n$; and $\Rightarrow$ denotes convergence in distribution. The assumption $ES_n^2 \to \infty$ is needed here in order to avoid trivial $\alpha$-mixing (or even 1-dependent) counterexamples in which a kind of "cancellation" prevents the partial sums $S_n$ from "growing" (in probability) and becoming asymptotically normal.
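Theorem 1 can be illustrated numerically (a simulation sketch, not part of the article): a stationary Gaussian AR(1) sequence is a standard example of an $\alpha$-mixing sequence with geometric mixing rate, and its standardized partial sums should be close to $N(0,1)$:

```python
import numpy as np

# Illustrative simulation (not from the article): stationary Gaussian AR(1),
# X_k = phi * X_{k-1} + eps_k, which is alpha-mixing with a geometric rate.
rng = np.random.default_rng(0)
phi, n, reps = 0.5, 500, 20000

# Simulate `reps` independent stationary paths of length n at once,
# accumulating the partial sums S_n.
x = rng.normal(0.0, 1.0 / np.sqrt(1.0 - phi ** 2), size=reps)  # stationary start
s = np.zeros(reps)
for _ in range(n):
    x = phi * x + rng.normal(size=reps)
    s += x

z = s / s.std()                   # standardize by the empirical sigma_n
frac_below_1 = np.mean(z < 1.0)   # should be close to Phi(1) ~ 0.8413
print(frac_below_1)
```

With Gaussian innovations the partial sums are exactly Gaussian, so any deviation from $\Phi(1)$ here is pure Monte Carlo error; for non-Gaussian innovations the same check shows the CLT taking effect as $n$ grows.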
 
  
In the context of Theorem 1, if one wants to obtain asymptotic normality of the partial sums (as in condition (B)) without an explicit uniform integrability assumption on the partial sums (as in condition (A)), then as an alternative, one can impose a combination of assumptions on, say, (i) the (marginal) distribution of $X_0$ and (ii) the rate of decay of the numbers $\alpha(n)$ to 0 (the "mixing rate"). This involves a "trade-off"; the weaker one assumption is, the stronger the other has to be. One such CLT of Ibragimov in 1962 involved such a "trade-off" in which it is assumed that for some $\delta > 0$, $E|X_0|^{2 + \delta} < \infty$ and $\sum_{n=1}^\infty [\alpha(n)]^{\delta/(2 + \delta)} < \infty$. Counterexamples of Davydov in 1973 (with just slightly weaker properties) showed that that result is quite sharp. However, it is not at the exact "borderline". From a covariance inequality of Rio in 1993 and a CLT (in fact a weak invariance principle) of Doukhan, Massart, and Rio in 1994, it became clear that the "exact borderline" CLTs of this kind have to involve quantiles of the (marginal) distribution of $X_0$ (rather than just moments). For a generously detailed exposition of such CLTs, see [Br, v1, Chapter 10]; and for further related results, see also Rio [Ri].
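For a polynomial mixing rate $\alpha(n) \asymp n^{-r}$, Ibragimov's summability condition reduces by comparison with a $p$-series to $r\delta/(2+\delta) > 1$. A trivial sketch of that reading (the closed form is elementary calculus, not stated in the article):

```python
# Reading of Ibragimov's condition for a polynomial rate alpha(n) ~ n**(-r):
# sum_n alpha(n)**(delta/(2+delta)) behaves like sum_n n**(-r*delta/(2+delta)),
# which converges exactly when the exponent exceeds 1.
def ibragimov_condition_holds(delta: float, r: float) -> bool:
    """Sufficient CLT condition when E|X_0|^(2+delta) < inf and alpha(n) ~ n**(-r)."""
    return r * delta / (2.0 + delta) > 1.0

# With delta = 1 (finite third moments), the exponent is r/3,
# so the mixing rate must beat n**(-3):
print(ibragimov_condition_holds(delta=1.0, r=4.0))  # True
print(ibragimov_condition_holds(delta=1.0, r=2.0))  # False
```

This makes the trade-off concrete: as $\delta \to \infty$ (all moments finite) the required rate approaches $r > 1$, while small $\delta$ forces a much faster mixing rate.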
 

Latest revision as of 14:46, 5 June 2017

=Negative hypergeometric distribution=



2020 Mathematics Subject Classification: Primary: 62E [MSN][ZBL]


A probability distribution of a random variable $X$ which takes non-negative integer values, defined by the formula
\begin{equation}\label{*} P(X=k)=\frac{ {k+m-1 \choose k}{N-m-k \choose M-m} } { {N \choose M} } \tag{*} \end{equation}
where the parameters $N,M,m$ are non-negative integers which satisfy the condition $m\leq M\leq N$. A negative hypergeometric distribution often arises in a scheme of sampling without replacement. If in the total population of size $N$, there are $M$ "marked" and $N-M$ "unmarked" elements, and if the sampling (without replacement) is performed until the number of "marked" elements reaches a fixed number $m$, then the random variable $X$ — the number of "unmarked" elements in the sample — has a negative hypergeometric distribution \eqref{*}. The random variable $X+m$ — the size of the sample — also has a negative hypergeometric distribution. The distribution \eqref{*} is called a negative hypergeometric distribution by analogy with the negative binomial distribution, which arises in the same way for sampling with replacement.
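Formula \eqref{*} can be transcribed directly (an illustrative sketch, not part of the article). The support is $k = 0, 1, \dots, N-M$, and by a Vandermonde-type convolution identity the probabilities sum to exactly 1:

```python
from math import comb

# Direct transcription of formula (*), parameter names N, M, m as in the text.
def neg_hypergeom_pmf(k: int, N: int, M: int, m: int) -> float:
    """P(X = k): k 'unmarked' elements drawn before the m-th 'marked' one."""
    return comb(k + m - 1, k) * comb(N - m - k, M - m) / comb(N, M)

N, M, m = 10, 4, 2
total = sum(neg_hypergeom_pmf(k, N, M, m) for k in range(N - M + 1))
print(total)  # 1.0 up to rounding

# Sanity check on the smallest nontrivial case: N = 2, M = 1, m = 1 gives
# a uniform distribution on {0, 1}.
print(neg_hypergeom_pmf(0, 2, 1, 1))  # 0.5
```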

The mathematical expectation and variance of a negative hypergeometric distribution are, respectively, equal to

\begin{equation} m\frac{N-M} {M+1} \end{equation}

and

\begin{equation} m\frac{(N+1)(N-M)} {(M+1)(M+2)}\Big(1-\frac{m}{M+1}\Big) \, . \end{equation}
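Both closed forms can be checked numerically against the moments of the pmf \eqref{*} (an illustrative check, not part of the text; the parameter values below are arbitrary):

```python
from math import comb

# Check that the pmf (*) reproduces the stated mean and variance.
def pmf(k, N, M, m):
    return comb(k + m - 1, k) * comb(N - m - k, M - m) / comb(N, M)

N, M, m = 12, 5, 3
mean = sum(k * pmf(k, N, M, m) for k in range(N - M + 1))
var = sum(k * k * pmf(k, N, M, m) for k in range(N - M + 1)) - mean ** 2

mean_formula = m * (N - M) / (M + 1)
var_formula = m * (N + 1) * (N - M) / ((M + 1) * (M + 2)) * (1 - m / (M + 1))
print(mean, mean_formula)  # both 3.5
print(var, var_formula)
```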

When $N, M, N-M \to \infty$ such that $M/N\to p$, the negative hypergeometric distribution tends to the negative binomial distribution with parameters $m$ and $p$.
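This limit can be observed numerically (an illustration, not part of the text): for large $N$ with $M/N \approx p$, the pmf \eqref{*} is close to the negative binomial pmf $\binom{k+m-1}{k} p^m (1-p)^k$:

```python
from math import comb

# Compare the negative hypergeometric pmf (*) with its negative binomial limit.
def neg_hypergeom_pmf(k, N, M, m):
    return comb(k + m - 1, k) * comb(N - m - k, M - m) / comb(N, M)

def neg_binomial_pmf(k, m, p):
    return comb(k + m - 1, k) * p ** m * (1 - p) ** k

m, p = 3, 0.3
N = 10000
M = int(p * N)  # M/N = p exactly here
worst = max(abs(neg_hypergeom_pmf(k, N, M, m) - neg_binomial_pmf(k, m, p))
            for k in range(50))
print(worst)  # small, and shrinking as N grows
```

(Python's arbitrary-precision integers keep the huge binomial coefficients exact; only the final division produces a float.)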

The distribution function $F(n)$ of the negative hypergeometric distribution with parameters $N,M,m$ is related to the hypergeometric distribution $G(m)$ with parameters $N,M,n$ by the relation
\begin{equation} F(n) = 1-G(m-1) \, . \end{equation}
This means that in solving problems in mathematical statistics related to negative hypergeometric distributions, tables of hypergeometric distributions can be used. The negative hypergeometric distribution is used, for example, in statistical quality control.
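The relation can be verified numerically (an illustrative check, not part of the text). Here $F$ is read as the distribution function of the sample size $X+m$: the event $X+m \le n$ holds exactly when the first $n$ draws contain at least $m$ marked elements, whose count $Y$ is hypergeometric with parameters $N,M,n$:

```python
from math import comb

# Check F(n) = 1 - G(m-1), with F the distribution function of the sample
# size X + m and G the hypergeometric distribution function (parameters N, M, n).
def neg_hypergeom_pmf(k, N, M, m):
    return comb(k + m - 1, k) * comb(N - m - k, M - m) / comb(N, M)

def hypergeom_cdf(j, N, M, n):
    """P(Y <= j), Y = number of marked elements in a sample of size n."""
    return sum(comb(M, i) * comb(N - M, n - i) / comb(N, n)
               for i in range(0, j + 1))

N, M, m = 10, 4, 2
for n in range(m, N - M + m + 1):
    F = sum(neg_hypergeom_pmf(k, N, M, m) for k in range(0, n - m + 1))
    print(n, F, 1 - hypergeom_cdf(m - 1, N, M, n))  # the two columns agree
```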

References

[Be] Y.K. Belyaev, "Probability methods of sampling control", Moscow (1975) (In Russian) MR0428663
[BoSm] L.N. Bol'shev, N.V. Smirnov, "Tables of mathematical statistics", Libr. math. tables, 46, Nauka (1983) (In Russian) (Processed by L.S. Bark and E.S. Kedrova) MR0243650 Zbl 0529.62099
[JoKo] N.L. Johnson, S. Kotz, "Distributions in statistics, discrete distributions", Wiley (1969) MR0268996 Zbl 0292.62009
[PaJo] G.P. Patil, S.W. Joshi, "A dictionary and bibliography of discrete distributions", Hafner (1968) MR0282770
How to Cite This Entry:
Boris Tsirelson/sandbox. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Boris_Tsirelson/sandbox&oldid=30036