A statistical estimator obtained by applying the notion of a minimax statistical procedure (cf. Minimax statistical procedure) to the problem of statistical estimation.
Example 1. Let a random variable $X$ be subject to the binomial law with parameters $n$ and $\theta$, where $\theta$, $0<\theta<1$, is unknown. The statistic
$$t=\frac Xn\frac{\sqrt n}{1+\sqrt n}+\frac1{2(1+\sqrt n)}$$
is a minimax estimator for the parameter $\theta$ with respect to the loss function
$$L(\theta,t)=(\theta-t)^2.$$
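This estimator can be written as $t=(X+\sqrt n/2)/(n+\sqrt n)$; it is the Bayes estimator with respect to the beta prior with both parameters equal to $\sqrt n/2$, and its quadratic risk $1/(4(1+\sqrt n)^2)$ does not depend on $\theta$. Constancy of the risk is what certifies minimaxity here. The following Python sketch (an illustration, not part of the original article) computes the exact risk of $t$ and of the classical estimator $X/n$:

```python
import math

def minimax_estimate(x, n):
    # t = (x + sqrt(n)/2) / (n + sqrt(n)), the form given above
    return (x + math.sqrt(n) / 2) / (n + math.sqrt(n))

def quadratic_risk(estimator, theta, n):
    """Exact risk E[(theta - t(X))^2] for X ~ Binomial(n, theta)."""
    return sum(
        math.comb(n, x) * theta**x * (1 - theta) ** (n - x)
        * (theta - estimator(x, n)) ** 2
        for x in range(n + 1)
    )

n = 20
print("constant minimax risk:", 1 / (4 * (1 + math.sqrt(n)) ** 2))
for theta in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(
        f"theta={theta:.1f}:",
        f"risk of t = {quadratic_risk(minimax_estimate, theta, n):.6f},",
        f"risk of X/n = {quadratic_risk(lambda x, n: x / n, theta, n):.6f}",
    )
```

The risk of $X/n$, equal to $\theta(1-\theta)/n$, is smaller near the endpoints of the interval but its maximum $1/(4n)$, attained at $\theta=1/2$, exceeds the constant risk of $t$; it is precisely this worst case that the minimax criterion controls.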
Example 2. Let $X_1,\dots,X_n$ be independent random variables subject to the same probability law, with a continuous probability density $f(x-\theta)$, $|x|<\infty$, $|\theta|<\infty$. The Pitman estimator
$$t=t(X_1,\dots,X_n)=X_{(1)}-\frac{\int\limits_{-\infty}^\infty xf(x)\prod_{i=2}^nf(x+Y_{(i)})dx}{\int\limits_{-\infty}^\infty f(x)\prod_{i=2}^nf(x+Y_{(i)})dx}$$
is a minimax estimator for the unknown shift parameter $\theta$ relative to the loss function $L(\theta,t)=(\theta-t)^2$, where $X_{(1)}\leq\dots\leq X_{(n)}$ are the order statistics (cf. Order statistic) obtained from the sample $X_1,\dots,X_n$ and $Y_{(i)}=X_{(i)}-X_{(1)}$. In particular, if $X_1,\dots,X_n\sim N(\theta,1)$, then $t=(X_1+\dots+X_n)/n$.
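For densities other than the normal the integrals in the Pitman estimator generally have no closed form, but they can be evaluated numerically. Below is a minimal Python sketch (an assumption-laden illustration, not part of the original article): the helper `pitman_estimate`, the grid bounds and the sample are all choices made for the example. For a normal sample the output agrees with the sample mean, as stated above.

```python
import numpy as np

def pitman_estimate(sample, f, lo=-10.0, hi=10.0, m=20001):
    """Pitman estimator of the shift parameter theta for density f(x - theta):
    t = X_(1) - (int x f(x) prod_{i>=2} f(x + Y_(i)) dx)
              / (int   f(x) prod_{i>=2} f(x + Y_(i)) dx).
    Plain Riemann sums on a uniform grid; the grid must cover the region
    where the integrands are non-negligible (an assumption of this sketch)."""
    x = np.linspace(lo, hi, m)
    x1 = np.min(sample)
    y = np.sort(sample) - x1              # Y_(i) = X_(i) - X_(1); Y_(1) = 0
    # f(x) * prod_{i=2}^n f(x + Y_(i)) equals the product over all i,
    # since Y_(1) = 0 contributes the factor f(x) itself
    prod = np.prod(f(x[:, None] + y[None, :]), axis=1)
    return x1 - np.sum(x * prod) / np.sum(prod)   # the common dx cancels

std_normal = lambda u: np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)

rng = np.random.default_rng(0)
sample = rng.normal(loc=1.7, scale=1.0, size=8)
print("Pitman estimate:", pitman_estimate(sample, std_normal))
print("sample mean:    ", sample.mean())  # the two coincide for the normal law
```

Replacing `std_normal` by another continuous density $f$ yields the Pitman estimator for that law; only the density function changes.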
References
[1] S. Zacks, "The theory of statistical inference", Wiley (1971)
[2] D.R. Cox, D.V. Hinkley, "Theoretical statistics", Chapman & Hall (1974)