Regression coefficient



A coefficient of an independent variable in a regression equation. For example, in the linear regression equation $ {\mathsf E} ( Y \mid X = x ) = \beta _ {0} + \beta _ {1} x $, connecting the random variables $ Y $ and $ X $, the regression coefficients $ \beta _ {0} $ and $ \beta _ {1} $ are given by

$$ \beta _ {0} = m _ {2} - \rho \frac{\sigma _ {2} }{\sigma _ {1} } m _ {1} ,\ \ \beta _ {1} = \rho \frac{\sigma _ {2} }{\sigma _ {1} } , $$

where $ \rho $ is the correlation coefficient of $ X $ and $ Y $, $ m _ {1} = {\mathsf E} X $, $ m _ {2} = {\mathsf E} Y $, $ \sigma _ {1} ^ {2} = {\mathsf D} X $, and $ \sigma _ {2} ^ {2} = {\mathsf D} Y $. The calculation of estimates for regression coefficients (sample regression coefficients) is a fundamental problem of regression analysis.
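As an illustration (not part of the original entry), the following is a minimal Python/NumPy sketch of how sample regression coefficients can be obtained by replacing the moments above with their sample analogues, i.e. $ \widehat{\beta} _ {1} = r s _ {2} / s _ {1} $ and $ \widehat{\beta} _ {0} = \bar{y} - \widehat{\beta} _ {1} \bar{x} $; for the simple linear case this coincides with ordinary least squares. The function name and the simulated data are assumptions made for the example only.

```python
import numpy as np

def sample_regression_coefficients(x, y):
    """Estimate beta_0 and beta_1 from data by plugging sample moments
    into the formulas above: beta_1 = r * s_2 / s_1, beta_0 = m_2 - beta_1 * m_1."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    m1, m2 = x.mean(), y.mean()            # sample analogues of E X, E Y
    s1, s2 = x.std(ddof=1), y.std(ddof=1)  # sample standard deviations
    r = np.corrcoef(x, y)[0, 1]            # sample correlation coefficient
    beta1 = r * s2 / s1
    beta0 = m2 - beta1 * m1
    return beta0, beta1

# Usage on simulated data from y = 1 + 2x + noise (illustrative only)
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=200)
b0, b1 = sample_regression_coefficients(x, y)
print(b0, b1)  # estimates close to 1 and 2
```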

How to Cite This Entry:
Regression coefficient. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Regression_coefficient&oldid=19256
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.