Regression matrix
The matrix $B$ of regression coefficients (cf. Regression coefficient) $\beta_{ij}$, $i = 1, \dots, m$, $j = 1, \dots, r$, in a multi-dimensional linear regression model,

$$ X = B Z + \varepsilon . \tag{*} $$

Here $X$ is a matrix with elements $X_{ij}$, $i = 1, \dots, m$, $j = 1, \dots, n$, where $X_{i1}, \dots, X_{in}$, $i = 1, \dots, m$, are observations of the $i$-th component of the original $m$-dimensional random variable, $Z$ is a matrix of known regression variables $z_{ij}$, $i = 1, \dots, r$, $j = 1, \dots, n$, and $\varepsilon$ is the matrix of errors $\varepsilon_{ij}$, $i = 1, \dots, m$, $j = 1, \dots, n$, with $\mathsf{E}\varepsilon_{ij} = 0$. The elements $\beta_{ij}$ of the regression matrix $B$ are unknown and have to be estimated. The model (*) is a generalization to the $m$-dimensional case of the general linear model of regression analysis.
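As an illustrative aside (not part of the original article), the following minimal sketch simulates data from the model (*) with NumPy; the dimensions $m = 2$, $r = 3$, $n = 100$ and all variable names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

m, r, n = 2, 3, 100                       # components, regression variables, observations (illustrative)

B = rng.normal(size=(m, r))               # "unknown" regression matrix, here drawn at random
Z = rng.normal(size=(r, n))               # known regression variables
eps = rng.normal(scale=0.1, size=(m, n))  # errors with zero mean

X = B @ Z + eps                           # model (*): m x n matrix of observations
print(X.shape)                            # (2, 100)
```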
Comments
In econometrics, for example, a frequently used model is that one has variables $y_1, \dots, y_m$ to be explained (endogenous variables) in terms of explanatory variables $z_1, \dots, z_r$ (exogenous variables) by means of a linear relationship $y = B z$. Given $n$ sets of measurements (with errors), $(y_{1j}, \dots, y_{mj}; z_{1j}, \dots, z_{rj})$, $j = 1, \dots, n$, the matrix of relation coefficients $B$ is to be estimated. The model is therefore

$$ Y = B Z + \varepsilon . $$
With the assumption that the errors $\varepsilon_{ij}$ have zero mean and are independently and identically normally distributed, this is the so-called standard linear multiple regression model or, briefly, the linear model or standard linear model. The least squares method yields the optimal estimator:

$$ \hat{B} = Y Z^{T} ( Z Z^{T} )^{-1} , $$

where $Y = (y_{ij})$, $Z = (z_{ij})$. In the case of a single endogenous variable, $m = 1$, this can be conveniently written as

$$ \hat{B}^{T} = ( Z^{T} Z )^{-1} Z^{T} y , $$

where $y$ is the column vector of observations $(y_1, \dots, y_n)^{T}$ and $Z$ is the $n \times r$ observation matrix consisting of the rows $(z_{1j}, \dots, z_{rj})$, $j = 1, \dots, n$. Numerous variants and generalizations are considered in [a1], [a2]; cf. also Regression analysis.
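As a hedged numerical sketch (an assumed example, not from the source; names, dimensions and noise level are illustrative), the least squares estimator $\hat{B} = Y Z^{T}(Z Z^{T})^{-1}$ and its single-variable form can be computed with NumPy as follows:

```python
import numpy as np

rng = np.random.default_rng(1)

m, r, n = 2, 3, 200                       # endogenous vars, exogenous vars, measurements
B = rng.normal(size=(m, r))               # true relation coefficients (for the simulation)
Z = rng.normal(size=(r, n))               # r x n matrix of exogenous measurements
Y = B @ Z + rng.normal(scale=0.1, size=(m, n))  # m x n endogenous measurements with errors

# Least squares estimator: B_hat = Y Z^T (Z Z^T)^{-1}
B_hat = Y @ Z.T @ np.linalg.inv(Z @ Z.T)
print(np.allclose(B_hat, B, atol=0.1))    # close to the true B for moderate noise

# Single endogenous variable (m = 1): with Z_obs the n x r observation matrix,
# B_hat^T = (Z_obs^T Z_obs)^{-1} Z_obs^T y
y = Y[0]                                  # observations of the first endogenous variable
Z_obs = Z.T                               # n x r observation matrix
b_hat = np.linalg.solve(Z_obs.T @ Z_obs, Z_obs.T @ y)
print(np.allclose(b_hat, B_hat[0]))       # agrees with the first row of B_hat
```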
References
[a1] E. Malinvaud, "Statistical methods of econometrics", North-Holland (1970) (Translated from French)
[a2] H. Theil, "Principles of econometrics", North-Holland (1971)