Pitman estimator
An equivariant estimator for the shift parameter with respect to a group of real shifts, having minimal risk with respect to a quadratic loss function.
Let the components $ X _ {1} , \dots, X _ {n} $ of a random vector $ X = ( X _ {1} , \dots, X _ {n} ) $ be independent random variables having the same probability law, with probability density belonging to the family
$$ \{ f( x- \theta ) , | x | < \infty , \theta \in \Theta =(- \infty , + \infty ) \} , $$
and with
$$ {\mathsf E} _ \theta X _ {1} ^ {2} = \int\limits _ {- \infty } ^ { {+ } \infty } x ^ {2} f( x- \theta ) dx < \infty $$
for any $ \theta \in \Theta $. Also, let $ G = \{ g \} $ be the group of real shifts operating in the realization space $ \mathbf R ^ {1} = (- \infty , + \infty ) $ of $ X _ {i} $ $ ( i = 1 , \dots, n) $:
$$ G = \{ g : gX _ {i} = X _ {i} + g , \ | g | < \infty \} . $$
In this case, the task of estimating $ \theta $ is invariant with respect to the quadratic loss function $ L( \theta , \widehat \theta ) = ( \theta - \widehat \theta ) ^ {2} $ if one uses an equivariant estimator $ \widehat \theta = \widehat \theta ( X) $ of $ \theta $, i.e. $ \widehat \theta ( gX) = g \widehat \theta ( X) $ for all $ g \in G $. E.J. Pitman [1] showed that the equivariant estimator $ \widehat \theta ( X) $ for the shift parameter $ \theta $ with respect to the group $ G $ that has minimal risk with respect to the quadratic loss function takes the form
$$ \widehat \theta ( X) = X _ {(n1)} - \frac{\int\limits _ {- \infty } ^ {+ \infty } x f( x) \prod _ {i=2} ^ {n} f( x+ Y _ {i} ) \, dx }{\int\limits _ {- \infty } ^ {+ \infty } f( x) \prod _ {i=2} ^ {n} f( x+ Y _ {i} ) \, dx } , $$
where $ Y _ {i} = X _ {(ni)} - X _ {(n1)} $, and $ X _ {(ni)} $ is the $ i $-th order statistic of the observation vector $ X $. The Pitman estimator is unbiased (cf. Unbiased estimator); it is a minimax estimator in the class of all estimators for $ \theta $ with respect to the quadratic loss function if all equivariant estimators for $ \theta $ have finite risk [2].
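Outside special cases the two integrals in this formula have no closed form, so the estimator can be evaluated numerically. The following minimal sketch in Python illustrates this (the function name pitman_estimate, the quadrature grid and the use of NumPy are illustrative choices, not part of the article): it forms the differences $ Y _ {i} $, evaluates the weight $ f( x) \prod _ {i=2} ^ {n} f( x+ Y _ {i} ) $ on a grid, and subtracts the ratio of the two integrals from $ X _ {(n1)} $.

import numpy as np

def pitman_estimate(sample, f, half_width=20.0, num_points=20001):
    # Sketch of the Pitman estimator for the shift family f(x - theta);
    # `sample` is the observed vector X, `f` the known density at theta = 0.
    # The grid bounds and resolution are illustrative, not prescribed.
    x = np.sort(np.asarray(sample, dtype=float))
    x_min = x[0]                      # X_(n1), the smallest order statistic
    y = x - x_min                     # Y_i = X_(ni) - X_(n1); note Y_1 = 0
    t = np.linspace(-half_width, half_width, num_points)
    weights = f(t)                    # the factor f(x) outside the product
    for yi in y[1:]:                  # product over i = 2, ..., n
        weights = weights * f(t + yi)
    dt = t[1] - t[0]
    numerator = np.sum(t * weights) * dt    # integral of x times the weight
    denominator = np.sum(weights) * dt      # integral of the weight
    return x_min - numerator / denominator

# For the standard normal density the result agrees with the sample mean,
# in line with Example 2 below (the data here are illustrative).
def phi(u):
    return np.exp(-u ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

data = [2.1, 3.4, 2.8, 3.0, 2.5]
print(pitman_estimate(data, phi), np.mean(data))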
Example 1. If
$$ f( x- \theta ) = e ^ {-( x- \theta ) } ,\ \ x \geq \theta , $$
i.e. $ X _ {i} $, $ i = 1 , \dots, n $, has an exponential distribution with unknown shift parameter $ \theta $, then the Pitman estimator $ \widehat \theta ( X) $ for $ \theta $ is
$$ \widehat \theta ( X) = X _ {(n1)} - \frac{1}{n} , $$
and its variance is $ 1/n ^ {2} $.
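This can be checked by substituting $ f $ into the general formula (a short verification sketch using only the definitions above). Since $ f( x) = e ^ {-x} $ for $ x \geq 0 $ and every $ Y _ {i} \geq 0 $,

$$ f( x) \prod _ {i=2} ^ {n} f( x+ Y _ {i} ) = e ^ {- nx - \sum _ {i=2} ^ {n} Y _ {i} } \ \ \textrm{ for } x \geq 0 $$

and vanishes for $ x < 0 $; the factor $ e ^ {- \sum Y _ {i} } $ cancels in the ratio of integrals, which therefore equals

$$ \frac{\int\limits _ {0} ^ \infty x e ^ {-nx} dx }{\int\limits _ {0} ^ \infty e ^ {-nx} dx } = \frac{1/n ^ {2} }{1/n } = \frac{1}{n} , $$

giving $ \widehat \theta ( X) = X _ {(n1)} - 1/n $. Moreover, $ X _ {(n1)} - \theta $ has an exponential distribution with parameter $ n $, so the estimator is unbiased and its variance is $ 1/n ^ {2} $.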
Example 2. If
$$ f( x- \theta ) = \frac{1}{\sqrt {2 \pi } } e ^ {-( x- \theta ) ^ {2} /2 } ,\ \ | x | < \infty , $$
i.e. $ X _ {i} $, $ i = 1 , \dots, n $, has a normal distribution $ N( \theta , 1) $ with unknown mathematical expectation $ \theta $, then the arithmetic mean
$$ \overline{X} = \frac{X _ {1} + \dots + X _ {n} }{n} $$
is the Pitman estimator.
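This too can be read off from the general formula (a short verification sketch). Writing $ \overline{Y} = n ^ {-1} \sum _ {i=1} ^ {n} Y _ {i} $ (with $ Y _ {1} = 0 $) and completing the square,

$$ f( x) \prod _ {i=2} ^ {n} f( x+ Y _ {i} ) = C \exp \left ( - \frac{n}{2} ( x + \overline{Y} ) ^ {2} \right ) , $$

where $ C $ does not depend on $ x $. The ratio of integrals is therefore the mean of a normal density centred at $ - \overline{Y} $, so

$$ \widehat \theta ( X) = X _ {(n1)} + \overline{Y} = \frac{1}{n} \sum _ {i=1} ^ {n} X _ {(ni)} = \overline{X} . $$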
References
[1] E.J. Pitman, "The estimation of the location and scale parameters of a continuous population of any given form" Biometrika, 30 (1939) pp. 391–421
[2] M.A. Girshick, L.J. Savage, "Bayes and minimax estimates for quadratic loss functions" J. Neyman (ed.), Proc. 2nd Berkeley Symp. Math. Statist. Prob., Univ. California Press (1951) pp. 53–73
[3] S. Zacks, "The theory of statistical inference", Wiley (1971)