Equivariant estimator

From Encyclopedia of Mathematics

Latest revision as of 19:37, 5 June 2020


A statistical point estimator that preserves the structure of the problem of statistical estimation relative to a given group of one-to-one transformations of a sampling space.

Suppose that one must estimate the unknown true value of a parameter $ \theta $, $ \theta \in \Theta \subseteq \mathbf R ^ {k} $, from a realization of a random vector $ X = ( X _ {1}, \dots, X _ {n} ) $ whose components $ X _ {1}, \dots, X _ {n} $ are independent, identically distributed random variables taking values in a sampling space $ ( \mathfrak X , {\mathcal B} _ {\mathfrak X } , {\mathsf P} _ \theta ) $. Next, suppose that a group $ G = \{ g \} $ of one-to-one transformations acts on $ \mathfrak X $ such that

$$ g \mathfrak X = \mathfrak X \quad \textrm{and} \quad g {\mathcal B} _ {\mathfrak X } = {\mathcal B} _ {\mathfrak X } \quad \textrm{for all } g \in G . $$

In turn, the group $ G $ generates on the parameter space $ \Theta $ a so-called induced group of transformations $ \overline{G} = \{ \overline{g} \} $, whose elements are defined by the formula

$$ {\mathsf P} _ \theta ( B ) = {\mathsf P} _ {\overline{g} \theta } ( g B ) \quad \textrm{for all } g \in G ,\ B \in {\mathcal B} _ {\mathfrak X } . $$
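As an illustration (not part of the original article), consider the location model on $ \mathfrak X = \mathbf R $, in which $ X _ {i} $ has distribution function $ F ( x - \theta ) $ and $ G $ is the group of translations $ g _ {c} x = x + c $. Then

$$ {\mathsf P} _ \theta ( B ) = {\mathsf P} _ {\theta + c } ( B + c ) \quad \textrm{for all } c \in \mathbf R , $$

so the induced transformations are $ \overline{g} _ {c} \theta = \theta + c $, and $ \overline{G} $ is again a group of translations, now acting on $ \Theta = \mathbf R $.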

Let $ \overline{G} $ be a group of one-to-one transformations on $ \Theta $ such that

$$ \overline{g} \Theta = \Theta \quad \textrm{for all } \overline{g} \in \overline{G} . $$

Under these conditions it is said that a point estimator $ \widehat \theta = \widehat \theta ( X ) $ of $ \theta $ is an equivariant estimator, or that it preserves the structure of the problem of statistical estimation of the parameter $ \theta $ with respect to the group $ G $, if

$$ \widehat \theta ( g X ) = \overline{g} \widehat \theta ( X ) \quad \textrm{for all } g \in G . $$
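The defining identity above can be checked numerically. The following sketch (an illustrative addition, not part of the original article) takes the sample mean as an equivariant estimator of a location parameter under the shift group $ g x = x + c $, for which $ \overline{g} \theta = \theta + c $:

```python
import random
from statistics import mean

# Illustrative example (not from the article): the sample mean is an
# equivariant estimator of a location parameter theta under the shift
# group g: x -> x + c, whose induced transformation is
# gbar: theta -> theta + c.

random.seed(0)
x = [random.gauss(2.0, 1.0) for _ in range(100)]  # sample, true theta = 2
c = 5.0                                           # shift defining g in G

lhs = mean(xi + c for xi in x)  # theta_hat(gX): estimate from shifted sample
rhs = mean(x) + c               # gbar theta_hat(X): shifted estimate

assert abs(lhs - rhs) < 1e-9    # equivariance: theta_hat(gX) = gbar theta_hat(X)
```

The same check fails for a non-equivariant estimator of location, e.g. $ \widehat \theta ( X ) = \overline{X} {} ^ {2} $, since $ ( \overline{X} + c ) ^ {2} \neq \overline{X} {} ^ {2} + c $ in general.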

The most interesting results in the theory of equivariant estimators have been obtained under the assumption that the loss function is invariant with respect to $ G $.
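For example (an illustration not in the original article), in the location model with induced transformations $ \overline{g} _ {c} \theta = \theta + c $, the quadratic loss is invariant:

$$ L ( \theta , d ) = ( d - \theta ) ^ {2} , \qquad L ( \overline{g} _ {c} \theta , \overline{g} _ {c} d ) = ( d + c - \theta - c ) ^ {2} = L ( \theta , d ) . $$

Under an invariant loss, the risk of an equivariant estimator is constant on the orbits of $ \overline{G} $, which is what makes this class of estimators tractable.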

References

[1] S. Zacks, "The theory of statistical inference", Wiley (1971)
[2] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)
How to Cite This Entry:
Equivariant estimator. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Equivariant_estimator&oldid=15622
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article