# Linear estimator


A linear function of observable random variables, used (when the actual values of the observed variables are substituted into it) as an approximate value (estimate) of an unknown parameter of the stochastic model under analysis (see Statistical estimator). The special selection of the class of linear estimators is justified for the following reasons. Linear estimators lend themselves more easily to statistical analysis, in particular to the investigation of consistency, unbiasedness, efficiency, the construction of corresponding confidence intervals, etc. At the same time, in a fairly wide range of cases the search for "optimal" (in a well-defined sense) estimators does not lead beyond the limits of the class of linear estimators. For example, the statistical analysis of a linear regression model (see Linear regression) of the form

$$Y = X\beta + \epsilon$$

gives as best linear unbiased estimator of the parameter $\beta$ the least-squares estimator

$$\hat\beta = (X^{\mathrm T}X)^{-1}X^{\mathrm T}Y$$

(linear with respect to the observed values $y_1,\dots,y_n$ of the random variable $Y$ under investigation). Here $Y$ is the $n$-dimensional column vector of observed values $y_i$, $i=1,\dots,n$, of the resulting test (random variable) under investigation, $X$ is the $(n\times p)$-matrix (of rank $p$) of observed values $x_{ij}$, $i=1,\dots,n$, $j=1,\dots,p$, of non-random factor arguments on which the resulting test depends, $\beta$ is the $p$-dimensional column vector of the unknown parameters $\beta_j$, $j=1,\dots,p$, and $\epsilon$ is the $n$-dimensional random column vector of residual components, which satisfies the conditions $\mathsf{E}\,\epsilon = 0$, $\mathsf{E}\,\epsilon\epsilon^{\mathrm T} = \sigma^2 I_n$ ($I_n$ being the $(n\times n)$ unit matrix).
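As an illustration (not part of the original article), the least-squares estimator $\hat\beta = (X^{\mathrm T}X)^{-1}X^{\mathrm T}Y$ can be computed directly from the normal equations. The sketch below uses made-up simulated data for the simplest case $p = 2$ (intercept plus one factor argument), where the $2\times 2$ system can be solved by hand; all variable names and the chosen noise level are assumptions for the example.

```python
import random

# Hypothetical data for the model y_i = beta_1 + beta_2 * x_i + eps_i,
# i.e. the design matrix X has rows (1, x_i) and rank p = 2.
random.seed(0)
n = 50
beta_true = (2.0, 0.5)                 # "unknown" parameters, used only to simulate data
xs = [float(i) for i in range(n)]      # non-random factor argument
ys = [beta_true[0] + beta_true[1] * x + random.gauss(0.0, 0.1) for x in xs]

# Least-squares estimator beta_hat = (X^T X)^{-1} X^T Y, written out for p = 2.
# Normal equations (X^T X) beta = X^T Y:
#   [ n    Sx  ] [b1]   [ Sy  ]
#   [ Sx   Sxx ] [b2] = [ Sxy ]
Sx, Sxx = sum(xs), sum(x * x for x in xs)
Sy, Sxy = sum(ys), sum(x * y for x, y in zip(xs, ys))
det = n * Sxx - Sx * Sx                # determinant of X^T X, nonzero since rank(X) = p
b1 = (Sxx * Sy - Sx * Sxy) / det
b2 = (n * Sxy - Sx * Sy) / det
print(b1, b2)
```

Note that the estimates $(b_1, b_2)$ are linear in the observations $y_i$, as the formula requires, and with the small residual variance chosen here they fall close to the simulated parameter values.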
