Regression spectrum
The spectrum of a stochastic process occurring in the regression scheme for a stationary time series. Thus, let a stochastic process $ y _ {t} $, which is observable for $ t = 1 \dots n $, be represented in the form
$$ \tag{1 } y _ {t} = m _ {t} + x _ {t} , $$
where $ x _ {t} $ is a stationary stochastic process with $ {\mathsf E} x _ {t} \equiv 0 $, and let the mean value $ {\mathsf E} y _ {t} = m _ {t} $ be expressed in the form of a linear regression
$$ \tag{2 } m _ {t} = \ \sum _ {k=1} ^ { s } \beta _ {k} \phi _ {t} ^ {(k)} , $$
where $ \phi ^ {(k)} = ( \phi _ {1} ^ {(k)} \dots \phi _ {n} ^ {(k)} ) $, $ k = 1 \dots s $, are known regression vectors and $ \beta _ {1} \dots \beta _ {s} $ are unknown regression coefficients (cf. Regression coefficient). Let $ M ( \lambda ) $ be the spectral distribution function of the regression vectors $ \phi ^ {(1)} \dots \phi ^ {(s)} $ (cf. Spectral analysis of a stationary stochastic process). The regression spectrum for $ M ( \lambda ) $ is the set of all $ \lambda $ such that $ M ( \lambda _ {2} ) - M ( \lambda _ {1} ) > 0 $ for every interval $ ( \lambda _ {1} , \lambda _ {2} ) $ containing $ \lambda $, $ \lambda _ {1} < \lambda < \lambda _ {2} $.
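As a simple illustration (using one common normalization of $ M ( \lambda ) $, in which $ \rho ( h) = \lim\limits _ {n \rightarrow \infty } \sum _ {t=1} ^ { n-h } \phi _ {t+h} ^ {(1)} \phi _ {t} ^ {(1)} / \sum _ {t=1} ^ { n } ( \phi _ {t} ^ {(1)} ) ^ {2} = \int\limits _ {- \pi } ^ { \pi } e ^ {i \lambda h } d M ( \lambda ) $, cf. [1]), let $ s = 1 $. For the constant regressor $ \phi _ {t} ^ {(1)} \equiv 1 $ (estimation of an unknown mean) one has $ \rho ( h) \equiv 1 $, so $ M ( \lambda ) $ has a single unit jump at $ \lambda = 0 $ and the regression spectrum is the one-point set $ \{ 0 \} $. For the trigonometric regressor $ \phi _ {t} ^ {(1)} = \cos \lambda _ {0} t $, $ 0 < \lambda _ {0} < \pi $, one has $ \rho ( h) = \cos \lambda _ {0} h $, so $ M ( \lambda ) $ has jumps of size $ 1/2 $ at $ \lambda = \pm \lambda _ {0} $ and the regression spectrum is $ \{ - \lambda _ {0} , \lambda _ {0} \} $.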
The regression spectrum plays an important role in problems of estimating the regression coefficients in the scheme (1)–(2). For example, the elements of a regression spectrum can be used to express a necessary and sufficient condition for the asymptotic efficiency of the least-squares estimators of the coefficients $ \beta _ {1} \dots \beta _ {s} $ (cf. [1]).
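This comparison can also be carried out numerically, by computing the exact variances of the least-squares and of the best linear unbiased (generalized least-squares) estimators for a given noise covariance. The following is a minimal sketch in Python, assuming a single cosine regressor and stationary AR(1) noise (both choices are illustrative and not taken from [1]); for such a trigonometric regression the two variances should already be close for moderate $ n $, in line with the asymptotic-efficiency result.

```python
import numpy as np

# Minimal numerical sketch (illustrative assumptions, not from [1]):
# scheme y_t = beta * phi_t + x_t with a single cosine regressor and
# stationary AR(1) noise x_t.  Compare the exact variance of the
# least-squares estimator of beta with that of the best linear unbiased
# (generalized least-squares) estimator.

n = 200
lam0 = 1.0                                   # regressor frequency (assumed)
t = np.arange(1, n + 1)
phi = np.cos(lam0 * t)[:, None]              # n x 1 regression vector

# Covariance matrix R of stationary AR(1) noise with parameter a:
# Cov(x_t, x_{t+h}) = sigma2 * a**|h| / (1 - a**2)
a, sigma2 = 0.6, 1.0
idx = np.arange(n)
R = sigma2 / (1.0 - a**2) * a ** np.abs(idx[:, None] - idx[None, :])

# Variance of the least-squares estimator:
#   (phi'phi)^{-1} phi' R phi (phi'phi)^{-1}
P = np.linalg.inv(phi.T @ phi)
var_ls = (P @ phi.T @ R @ phi @ P)[0, 0]

# Variance of the best linear unbiased (GLS) estimator: (phi' R^{-1} phi)^{-1}
var_blue = np.linalg.inv(phi.T @ np.linalg.solve(R, phi))[0, 0]

print(f"var(LS)   = {var_ls:.6f}")
print(f"var(BLUE) = {var_blue:.6f}")
print(f"efficiency var(BLUE)/var(LS) = {var_blue / var_ls:.4f}")
```

The efficiency ratio printed at the end is at most 1 by construction; values near 1 indicate that little is lost by using ordinary least squares instead of the best linear unbiased estimator for this regressor and noise model.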
References
[1] U. Grenander, M. Rosenblatt, "Statistical analysis of stationary time series", Wiley (1957)