Regression coefficient
A coefficient of an independent variable in a regression equation. For example, in the linear regression equation $ {\mathsf E} ( Y \mid X = x ) = \beta _ {0} + \beta _ {1} x $, connecting the random variables $ Y $ and $ X $, the regression coefficients $ \beta _ {0} $ and $ \beta _ {1} $ are given by
$$ \beta _ {0} = m _ {2} - \rho \frac{\sigma _ {2} }{\sigma _ {1} } m _ {1} ,\ \ \beta _ {1} = \rho \frac{\sigma _ {2} }{\sigma _ {1} } , $$
where $ \rho $ is the correlation coefficient of $ X $ and $ Y $, $ m _ {1} = {\mathsf E} X $, $ m _ {2} = {\mathsf E} Y $, $ \sigma _ {1} ^ {2} = {\mathsf D} X $, and $ \sigma _ {2} ^ {2} = {\mathsf D} Y $. The calculation of estimates for regression coefficients (sample regression coefficients) is a fundamental problem of regression analysis.
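The article states these values without derivation; they follow by minimizing the mean-square error of the linear predictor, a short sketch in the article's notation being

$$ ( \beta _ {0} , \beta _ {1} ) = \mathop{\rm argmin} _ {b _ {0} , b _ {1} } {\mathsf E} ( Y - b _ {0} - b _ {1} X ) ^ {2} \ \Rightarrow \ \beta _ {1} = \frac{ \mathop{\rm cov} ( X , Y ) }{ {\mathsf D} X } = \rho \frac{\sigma _ {2} }{\sigma _ {1} } ,\ \ \beta _ {0} = m _ {2} - \beta _ {1} m _ {1} , $$

so the regression line passes through the point $ ( m _ {1} , m _ {2} ) $.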
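As a minimal illustration of computing sample regression coefficients, the following Python sketch replaces $ m _ {1} , m _ {2} , \sigma _ {1} , \sigma _ {2} , \rho $ with their sample analogues (the data here are hypothetical, generated only for the example); these plug-in estimates coincide with the ordinary least-squares solution.

```python
import numpy as np

# Hypothetical paired sample from (X, Y); any observed data would do.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=200)
y = 1.0 + 0.8 * x + rng.normal(scale=0.5, size=200)

# Sample analogues of m1, m2, sigma1, sigma2 and rho.
m1, m2 = x.mean(), y.mean()
s1, s2 = x.std(ddof=1), y.std(ddof=1)
r = np.corrcoef(x, y)[0, 1]

# Plug-in (sample) regression coefficients:
#   beta1_hat = r * s2 / s1,   beta0_hat = m2 - beta1_hat * m1.
beta1_hat = r * s2 / s1
beta0_hat = m2 - beta1_hat * m1

# The same values arise as the least-squares fit, e.g. via np.polyfit.
beta1_ls, beta0_ls = np.polyfit(x, y, deg=1)

print(beta0_hat, beta1_hat)
print(beta0_ls, beta1_ls)
```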