Bahadur representation

An approximation of sample quantiles by empirical distribution functions.

Let $U _ { 1 } , \dots , U _ { n } , \dots$ be a sequence of independent uniform-$( 0,1 )$ random variables (cf. also Random variable). Write

\begin{equation*} \Gamma _ { n } ( t ) = \frac { 1 } { n } \sum _ { i = 1 } ^ { n } 1 _ { [ 0 , t ] } ( U _ { i } ) \end{equation*}

for the empirical distribution function (cf. Distribution function; Empirical distribution) of the first $n$ random variables and denote the uniform empirical process by

\begin{equation*} \alpha _ { n } ( t ) = n ^ { 1 / 2 } ( \Gamma _ { n } ( t ) - t ) , \quad 0 \leq t \leq 1. \end{equation*}
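
For illustration, $\Gamma _ { n }$ and $\alpha _ { n }$ are straightforward to simulate. The following minimal sketch (sample size and evaluation grid are arbitrary illustrative choices, not taken from the article) computes both on a grid of points in $[0,1]$:

```python
import numpy as np

# Minimal simulation sketch: Gamma_n and alpha_n on a grid of points in [0, 1].
# The sample size n and the grid are arbitrary illustrative choices.
rng = np.random.default_rng(0)

n = 1000
u = np.sort(rng.uniform(size=n))            # U_1, ..., U_n (sorted)
t = np.linspace(0.0, 1.0, 501)              # evaluation grid

gamma_n = np.searchsorted(u, t, side="right") / n   # Gamma_n(t) = #{i : U_i <= t} / n
alpha_n = np.sqrt(n) * (gamma_n - t)                # alpha_n(t) = n^{1/2} (Gamma_n(t) - t)
```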

Let $\Gamma _ { n } ^ { - 1 }$ be the left-continuous inverse or quantile function (cf. also Quantile) corresponding to $\Gamma _ { n }$ and write

\begin{equation*} \beta _ { n } ( t ) = n ^ { 1 / 2 } \left( \Gamma _ { n } ^ { - 1 } ( t ) - t \right) , \quad 0 \leq t \leq 1, \end{equation*}

for the uniform quantile process. Denote the supremum norm on $[0,1]$ by $\| \cdot \|$. It is easy to show that $\operatorname { lim } _ { n \rightarrow \infty } \| \alpha _ { n } + \beta _ { n } \| = 0$ a.s., implying, e.g., that $\Gamma _ { n } ^ { - 1 } ( t ) = 2 t - \Gamma _ { n } ( t ) + o \left( n ^ { - 1 / 2 } \right)$ a.s., $0 \leq t \leq 1$. The process $\alpha _ { n } + \beta _ { n }$ was introduced by R.R. Bahadur in [a3] and further investigated by J.C. Kiefer in [a11], [a12]. Therefore this process is called the (uniform) Bahadur–Kiefer process. A final and much more delicate result for $\| \alpha _ { n } + \beta _ { n } \|$ is

\begin{equation} \tag{a1} \operatorname { lim } _ { n \rightarrow \infty } \frac { n ^ { 1 / 4 } } { ( \operatorname { log } n ) ^ { 1 / 2 } } \frac { \| \alpha _ { n } + \beta _ { n } \| } { \| \alpha _ { n } \| ^ { 1 / 2 } } = 1 \text{ a.s.}, \end{equation}

see [a7], [a8], [a12], [a13]. Combining (a1) with the well-known results for $\alpha _ { n }$, it now follows immediately that

\begin{equation} \tag{a2} \limsup _ { n \rightarrow \infty } \frac { n ^ { 1 / 4 } } { ( \operatorname { log } n ) ^ { 1 / 2 } ( \operatorname { log } \operatorname { log } n ) ^ { 1 / 4 } } \| \alpha _ { n } + \beta _ { n } \| = 2 ^ { - 1 / 4 } \text{ a.s.} \end{equation}

and

\begin{equation*} \frac { n ^ { 1 / 4 } } { ( \operatorname { log } n ) ^ { 1 / 2 } } \| \alpha _ { n } + \beta _ { n } \| \stackrel { d } { \rightarrow } \| B \| ^ { 1 / 2 }, \end{equation*}

where $B$ is a standard Brownian bridge (cf. Non-parametric methods in statistics). Similar results exist for a single, fixed $t \in ( 0,1 )$:

\begin{equation*} \operatorname { limsup } _ { n \rightarrow \infty } \pm \frac { n ^ { 1 / 4 } } { ( \operatorname { log } \operatorname { log } n ) ^ { 3 / 4 } } ( \alpha _ { n } ( t ) + \beta _ { n } ( t ) ) = 2 ^ { 5 / 4 } 3 ^ { - 3 / 4 } ( t ( 1 - t ) ) ^ { 1 / 4 } \text { a.s. } \end{equation*}

and

\begin{equation*} n ^ { 1 / 4 } ( \alpha _ { n } ( t ) + \beta _ { n } ( t ) ) \stackrel { d } { \rightarrow } Z | B ( t ) | ^ { 1 / 2 }, \end{equation*}

where $Z$ is standard normal (cf. Normal distribution) and independent of $B$. Extensions of the latter two results to finitely many $t$'s also exist, see [a4], [a5].
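
These limit theorems can be illustrated by simulation, keeping in mind that the convergence in (a1) and (a2) is slow because of the logarithmic factors. The sketch below (sample size, grid and number of replications are arbitrary illustrative choices) computes $\beta _ { n }$ via the order-statistic identity $\Gamma _ { n } ^ { - 1 } ( t ) = U _ { ( \lceil n t \rceil ) }$, and compares the scaled supremum $n ^ { 1 / 4 } ( \operatorname { log } n ) ^ { - 1 / 2 } \| \alpha _ { n } + \beta _ { n } \|$ with simulated values of $\| B \| ^ { 1 / 2 }$:

```python
import numpy as np

# Sample size, grid and number of replications are arbitrary illustrative choices;
# all suprema are grid approximations.
rng = np.random.default_rng(1)
t = np.linspace(0.001, 1.0, 1000)           # evaluation grid (avoids t = 0)

def bahadur_kiefer_sup(n):
    """Grid approximations of ||alpha_n|| and ||alpha_n + beta_n|| for one uniform sample."""
    u = np.sort(rng.uniform(size=n))
    gamma = np.searchsorted(u, t, side="right") / n       # Gamma_n(t)
    k = np.clip(np.ceil(n * t).astype(int), 1, n)         # ceil(n t) in {1, ..., n}
    gamma_inv = u[k - 1]                                  # Gamma_n^{-1}(t) = U_(ceil(nt))
    alpha = np.sqrt(n) * (gamma - t)
    beta = np.sqrt(n) * (gamma_inv - t)
    return np.max(np.abs(alpha)), np.max(np.abs(alpha + beta))

def sup_bridge_sqrt(m=1000):
    """||B||^{1/2} for a Brownian bridge B, discretized on m points."""
    w = np.cumsum(rng.normal(size=m)) / np.sqrt(m)        # Brownian motion on a grid
    s = np.arange(1, m + 1) / m
    return np.sqrt(np.max(np.abs(w - s * w[-1])))         # bridge: B(s) = W(s) - s W(1)

n = 10_000
stats = []
for _ in range(200):
    sup_alpha, sup_bk = bahadur_kiefer_sup(n)
    stats.append(n ** 0.25 / np.sqrt(np.log(n)) * sup_bk)  # scaled Bahadur-Kiefer statistic
limit = [sup_bridge_sqrt() for _ in range(200)]

print(np.median(stats), np.median(limit))                  # roughly comparable for large n
```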

Let $F$ be a continuous distribution function on $\mathbf{R}$, with quantile function $Q$, and set $X _ { i } = Q ( U _ { i } )$, $i = 1,2 , \dots$. Then the $X_i$ are independent and distributed according to $F$. Now define $F _ { n }$ to be the empirical distribution function of the first $n$ of the $X_i$ and write $\alpha_{n, F} = n ^ { 1 / 2 } ( F _ { n } - F )$ for the corresponding empirical process. Denote the empirical quantile function by $Q _ { n }$ and define the quantile process by $\beta _ { n , F } = ( f \circ Q ) \, n ^ { 1 / 2 } ( Q _ { n } - Q )$, where $f = F ^ { \prime }$. The general Bahadur–Kiefer process is now defined as $\alpha _ { n , F} \circ Q + \beta _ { n , F }$. Since $\alpha _ { n ,F} \circ Q \equiv \alpha _ { n }$, results for $\alpha _ { n , F} \circ Q + \beta _ { n , F }$ can be obtained when $\beta _ { n , F }$ is "close" to $\beta _ { n }$. Under natural conditions, see, e.g., [a13], results hold which imply that for any $\varepsilon > 0$

\begin{equation*} \| \beta _ { n , F } - \beta _ { n } \| = o \left( \frac { 1 } { n ^ { 1 / 2 - \varepsilon } } \right) \text{ a.s.} \end{equation*}
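
As an illustration of this closeness, the following sketch takes $F$ to be the standard normal distribution function (an arbitrary choice, used here only because it satisfies the usual smoothness conditions on a central interval) and compares $\beta _ { n , F }$ with $\beta _ { n }$ built from the same uniform sample:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Illustrative choice: F = standard normal, so Q = norm.ppf and f = norm.pdf.
n = 5000
u = np.sort(rng.uniform(size=n))            # uniform sample U_1, ..., U_n (sorted)
x = norm.ppf(u)                             # X_(i) = Q(U_(i)), since Q is increasing

t = np.linspace(0.01, 0.99, 99)             # stay away from the tails
k = np.clip(np.ceil(n * t).astype(int), 1, n)

beta_n = np.sqrt(n) * (u[k - 1] - t)                    # uniform quantile process
q, q_n = norm.ppf(t), x[k - 1]                          # Q(t) and Q_n(t)
beta_nF = norm.pdf(q) * np.sqrt(n) * (q_n - q)          # (f o Q) n^{1/2} (Q_n - Q)

print(np.max(np.abs(beta_nF - beta_n)))     # typically much smaller than ||beta_n||
```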

This bound yields all the above results with $\beta _ { n }$ replaced by $\beta _ { n , F }$. Observe that (a2) now leads to the following Bahadur representation: if $f$ is bounded away from $0$, then, uniformly in $t \in ( 0,1 )$,

\begin{equation*} Q _ { n } ( t ) = Q ( t ) + \frac { t - F _ { n } ( Q ( t ) ) } { f ( Q ( t ) ) } + O \left( \frac { ( \operatorname { log } n ) ^ { 1 / 2 } ( \operatorname { log } \operatorname { log } n ) ^ { 1 / 4 } } { n ^ { 3 / 4 } } \right) \text{ a.s.} \end{equation*}

There are many extensions of the above results, e.g., to various generalizations of quantiles (one- and multi-dimensional) [a1], [a9], to weighted processes [a4], [a7], to single $t _ { n }$'s converging to $0$ [a6], to the two-sample case, to censorship models [a5], to partial-sum processes [a7], to dependent random variables [a2], [a4], [a10], and to regression models [a9].

References

[a1] M.A. Arcones, "The Bahadur–Kiefer representation of the two-dimensional spatial medians" Ann. Inst. Statist. Math. , 50 (1998) pp. 71–86
[a2] M.A. Arcones, "The Bahadur–Kiefer representation for $U$-quantiles" Ann. Statist. , 24 (1996) pp. 1400–1422
[a3] R.R. Bahadur, "A note on quantiles in large samples" Ann. Math. Stat. , 37 (1966) pp. 577–580
[a4] J. Beirlant, P. Deheuvels, J.H.J. Einmahl, D.M. Mason, "Bahadur–Kiefer theorems for uniform spacings processes" Theory Probab. Appl. , 36 (1992) pp. 647–669
[a5] J. Beirlant, J.H.J. Einmahl, "Bahadur–Kiefer theorems for the product-limit process" J. Multivariate Anal. , 35 (1990) pp. 276–294
[a6] P. Deheuvels, "Pointwise Bahadur–Kiefer-type theorems II" , Nonparametric statistics and related topics (Ottawa, 1991) , North-Holland (1992) pp. 331–345
[a7] P. Deheuvels, D.M. Mason, "Bahadur–Kiefer-type processes" Ann. Probab. , 18 (1990) pp. 669–697
[a8] J.H.J. Einmahl, "A short and elementary proof of the main Bahadur–Kiefer theorem" Ann. Probab. , 24 (1996) pp. 526–531
[a9] X. He, Q.-M. Shao, "A general Bahadur representation of $M$-estimators and its application to linear regression with nonstochastic designs" Ann. Statist. , 24 (1996) pp. 2608–2630
[a10] C.H. Hesse, "A Bahadur–Kiefer type representation for a large class of stationary, possibly infinite variance, linear processes" Ann. Statist. , 18 (1990) pp. 1188–1202
[a11] J.C. Kiefer, "On Bahadur's representation of sample quantiles" Ann. Math. Stat. , 38 (1967) pp. 1323–1342
[a12] J.C. Kiefer, "Deviations between the sample quantile process and the sample df" M. Puri (ed.) , Non-parametric Techniques in Statistical Inference , Cambridge Univ. Press (1970) pp. 299–319
[a13] G.R. Shorack, J.A. Wellner, "Empirical processes with applications to statistics" , Wiley (1986)
This article was adapted from an original article by J.H.J. Einmahl (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.