Minimax estimator


A statistical estimator obtained by applying the notion of a minimax statistical procedure (cf. Minimax statistical procedure) to the problem of statistical estimation.
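
Explicitly, an estimator $t$ of a parameter $\theta$ is minimax relative to a loss function $L$ if it minimizes the maximal risk, that is, if

$$\sup_\theta\mathsf{E}_\theta L(\theta,t)=\inf_{t'}\sup_\theta\mathsf{E}_\theta L(\theta,t'),$$

where the infimum is taken over all estimators $t'$.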

Example 1. Let a random variable $X$ be subject to the binomial law with parameters $n$ and $\theta$, where $\theta$, $0<\theta<1$, is unknown. The statistic

$$t=\frac{X}{n}\cdot\frac{\sqrt n}{1+\sqrt n}+\frac{1}{2(1+\sqrt n)}$$

is a minimax estimator for the parameter $\theta$ with respect to the loss function

$$L(\theta,t)=(\theta-t)^2.$$
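
A sketch of why $t$ is minimax (the verification is standard, though not spelled out here): writing $t=(X+\sqrt n/2)/(n+\sqrt n)$ shows that $t$ is the Bayes estimator for the symmetric beta prior $B(\sqrt n/2,\sqrt n/2)$ on $\theta$, and a direct computation of variance plus squared bias gives a risk that is constant in $\theta$:

$$\mathsf{E}_\theta(t-\theta)^2=\frac{n\theta(1-\theta)+n(1/2-\theta)^2}{(n+\sqrt n)^2}=\frac{1}{4(1+\sqrt n)^2}.$$

A Bayes estimator whose risk does not depend on $\theta$ is minimax.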

Example 2. Let $X_1,\dots,X_n$ be independent random variables subject to the same probability law, with a continuous probability density $f(x-\theta)$, $|x|<\infty$, $|\theta|<\infty$. The Pitman estimator

$$t=t(X_1,\dots,X_n)=X_{(1)}-\frac{\int\limits_{-\infty}^\infty xf(x)\prod_{i=2}^nf(x+Y_{(i)})dx}{\int\limits_{-\infty}^\infty f(x)\prod_{i=2}^nf(x+Y_{(i)})dx}$$

is a minimax estimator for the unknown shift parameter $\theta$ relative to the loss function $L(\theta,t)=(\theta-t)^2$, where $X_{(1)}\leq\dots\leq X_{(n)}$ are the order statistics (cf. Order statistic) obtained from the sample $X_1,\dots,X_n$ and $Y_{(i)}=X_{(i)}-X_{(1)}$. In particular, if $X_1,\dots,X_n\sim N(\theta,1)$, then $t=(X_1+\dots+X_n)/n$.
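
The quotient of integrals above can be evaluated by numerical quadrature. The following sketch (illustrative only; the function name pitman_estimator and the choice of scipy quadrature are assumptions of this sketch, not part of the article) computes the estimator for a given density $f$ and checks the normal case, where the result must agree with the sample mean:

import numpy as np
from scipy import integrate, stats

def pitman_estimator(sample, f):
    # Pitman estimator of the shift parameter for the density f(x - theta),
    # evaluated from the quotient-of-integrals formula above by quadrature.
    x = np.sort(sample)
    y = x[1:] - x[0]  # Y_(i) = X_(i) - X_(1), i = 2, ..., n
    num = integrate.quad(lambda u: u * f(u) * np.prod(f(u + y)), -np.inf, np.inf)[0]
    den = integrate.quad(lambda u: f(u) * np.prod(f(u + y)), -np.inf, np.inf)[0]
    return x[0] - num / den

rng = np.random.default_rng(1)
sample = rng.normal(loc=3.0, size=5)  # N(theta, 1) with theta = 3
print(pitman_estimator(sample, stats.norm.pdf))  # agrees with the next line
print(sample.mean())                             # up to quadrature error

Replacing $f$ handles other shift families, e.g. stats.cauchy.pdf for the Cauchy location problem.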

References

[1] S. Zacks, "The theory of statistical inference", Wiley (1971)
[2] D.R. Cox, D.V. Hinkley, "Theoretical statistics", Chapman & Hall (1974)
How to Cite This Entry:
Minimax estimator. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Minimax_estimator&oldid=43440
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.