Pitman estimator
An equivariant estimator for the shift parameter with respect to a group of real shifts, having minimal risk with respect to a quadratic loss function.
Let the components of a random vector X = ( X _ {1} \dots X _ {n} ) be independent random variables having the same probability law, with probability density belonging to the family
\{ f( x- \theta ) , | x | < \infty , \theta \in \Theta =(- \infty , + \infty ) \} ,
and with
{\mathsf E} _ \theta X _ {1} ^ {2} = \int\limits _ {- \infty } ^ { +\infty } x ^ {2} f( x- \theta ) dx < \infty
for any \theta \in \Theta . Also, let G = \{ g \} be the group of real shifts operating in the realization space \mathbf R ^ {1} = (- \infty , + \infty ) of X _ {i} ( i = 1 \dots n) :
G = \{ g : gX _ {i} = X _ {i} + g , | g | < \infty \} .
In this case, the problem of estimating \theta is invariant with respect to the quadratic loss function L( \theta , \widehat \theta ) = ( \theta - \widehat \theta ) ^ {2} if one uses an equivariant estimator \widehat \theta = \widehat \theta ( X) of \theta , i.e. an estimator with \widehat \theta ( gX) = g \widehat \theta ( X) for all g \in G . E.J. Pitman [1] has shown that the equivariant estimator \widehat \theta ( X) of the shift parameter \theta with respect to the group G having minimal risk under the quadratic loss function takes the form
\widehat \theta ( X) = X _ {(n1)} - \frac{\int\limits _ {- \infty } ^ { +\infty } xf( x) \prod_{i=2} ^ { n } f( x+ Y _ {i} ) dx }{\int\limits _ {- \infty } ^ { +\infty } f( x) \prod_{i=2}^ { n } f( x+ Y _ {i} ) dx } ,
where Y _ {i} = X _ {(ni)} - X _ {(n1)} and X _ {(ni)} is the i-th order statistic of the observation vector X . The Pitman estimator is unbiased (cf. Unbiased estimator); it is a minimax estimator in the class of all estimators of \theta with respect to the quadratic loss function, provided that all equivariant estimators of \theta have a finite risk function [2].
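The formula above can be evaluated directly by numerical quadrature. The following is a minimal illustrative sketch, not taken from [1]–[3]: the choice of a Laplace density, the sample values and the helper name pitman_estimate are assumptions made only for this example. The last two lines illustrate the equivariance property \widehat \theta ( X + g) = \widehat \theta ( X) + g .

```python
# Illustrative sketch only: numerical evaluation of the Pitman estimator
# by quadrature; the Laplace density and the sample are arbitrary choices.
import numpy as np
from scipy.integrate import quad

def pitman_estimate(x, f):
    # Sort the sample to obtain the order statistics X_(n1) <= ... <= X_(nn).
    x = np.sort(np.asarray(x, dtype=float))
    x1 = x[0]                      # first order statistic X_(n1)
    y = x[1:] - x1                 # Y_i = X_(ni) - X_(n1), i = 2, ..., n
    w = lambda t: f(t) * np.prod([f(t + yi) for yi in y])
    num, _ = quad(lambda t: t * w(t), -np.inf, np.inf)
    den, _ = quad(w, -np.inf, np.inf)
    return x1 - num / den          # the formula displayed above

# Laplace (double-exponential) density with unit scale; it has a finite
# second moment, as required.
laplace = lambda t: 0.5 * np.exp(-abs(t))

rng = np.random.default_rng(0)
sample = rng.laplace(loc=2.0, size=5)          # shift parameter theta = 2
print(pitman_estimate(sample, laplace))
# Equivariance: shifting every observation by g shifts the estimate by g.
print(pitman_estimate(sample + 3.0, laplace))  # previous value + 3
```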
Example 1. If
f( x- \theta ) = e ^ {-( x- \theta ) } ,\ \ x \geq \theta ,
i.e. X _ {i} , i = 1 \dots n , has an exponential distribution with unknown shift parameter \theta , then the Pitman estimator \widehat \theta ( X) of \theta is
\widehat \theta ( X) = X _ {(n1)} - \frac{1}{n} ,
and its variance is 1/n ^ {2} .
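As a quick numerical check of Example 1 (not part of the original text; the values of \theta , n and the number of replications are arbitrary choices), a Monte Carlo simulation reproduces the unbiasedness and the variance 1/n ^ {2} :

```python
# Monte Carlo check of Example 1; theta, n and the replication count are
# arbitrary choices made only for this illustration.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.5, 10, 200_000

# X_i = theta + standard exponential noise, i.e. density e^{-(x - theta)}, x >= theta.
samples = theta + rng.exponential(size=(reps, n))
estimates = samples.min(axis=1) - 1.0 / n   # Pitman estimator X_(n1) - 1/n

print(estimates.mean())   # close to theta = 2.5 (unbiasedness)
print(estimates.var())    # close to 1/n**2 = 0.01
```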
Example 2. If
f( x- \theta ) = \frac{1}{\sqrt {2 \pi } } e ^ {-( x- \theta ) ^ {2} /2 } ,\ \ | x | < \infty ,
i.e. X _ {i} , i = 1 \dots n , has a normal distribution N( \theta , 1) with unknown mathematical expectation \theta , then the arithmetic mean
\overline{X} = \frac{X _ {1} + \dots + X _ {n} }{n}
is the Pitman estimator.
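The reduction of the general integral formula to the arithmetic mean can also be checked numerically. The sketch below is an illustration only, with arbitrarily chosen data; it evaluates the formula for the standard normal density and compares the result with \overline{X} .

```python
# Numerical check of Example 2: for the normal density the integral formula
# reduces to the arithmetic mean.  The data below are arbitrary.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

x = np.sort(np.array([0.3, 1.7, -0.4, 2.1, 0.9]))
x1, y = x[0], x[1:] - x[0]          # X_(n1) and Y_i = X_(ni) - X_(n1)

w = lambda t: norm.pdf(t) * np.prod(norm.pdf(t + y))
num, _ = quad(lambda t: t * w(t), -np.inf, np.inf)
den, _ = quad(w, -np.inf, np.inf)

print(x1 - num / den)   # Pitman estimate from the integral formula
print(x.mean())         # arithmetic mean; the two agree up to quadrature error
```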
References
[1] E.J. Pitman, "The estimation of the location and scale parameters of a continuous population of any given form", Biometrika, 30 (1939), pp. 391–421.
[2] M.A. Girshick, L.J. Savage, "Bayes and minimax estimates for quadratic loss functions", in: J. Neyman (ed.), Proc. 2nd Berkeley Symp. Math. Statist. Prob., Univ. California Press (1951), pp. 53–73.
[3] S. Zacks, "The theory of statistical inference", Wiley (1971).