Parabolic regression



polynomial regression

A regression model in which the regression functions are polynomials. More precisely, let $X = (X_1, \dots, X_m)^T$ and $Y = (Y_1, \dots, Y_n)^T$ be random vectors taking values $x = (x_1, \dots, x_m)^T \in \mathbf{R}^m$ and $y = (y_1, \dots, y_n)^T \in \mathbf{R}^n$, and suppose that

$$ {\mathsf E} \{ Y \mid X \} = f(X) = (f_1(X), \dots, f_n(X))^T $$

exists (i.e. suppose that ${\mathsf E} \{ Y_1 \mid X \} = f_1(X), \dots, {\mathsf E} \{ Y_n \mid X \} = f_n(X)$ all exist). The regression is called parabolic (polynomial) if the components of the vector ${\mathsf E} \{ Y \mid X \} = f(X)$ are polynomial functions of the components of the vector $X$. For example, in the elementary case where $Y$ and $X$ are ordinary random variables, a polynomial regression equation is of the form

$$ y = \beta_0 + \beta_1 x + \dots + \beta_p x^p, $$

where $\beta_0, \dots, \beta_p$ are the regression coefficients. A special case of parabolic regression is linear regression. By adding new components to the vector $X$ (for example, the powers $x^2, \dots, x^p$ in the scalar case), it is always possible to reduce parabolic regression to linear regression, as the sketch below illustrates. See Regression; Regression analysis.
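The following sketch makes this reduction concrete for the scalar case; it is illustrative only, and the function name fit_polynomial_regression and the use of NumPy are assumptions of this sketch, not part of the original article. The regressor $x$ is augmented with its powers $x^2, \dots, x^p$, after which the coefficients are estimated by ordinary linear least squares.

import numpy as np

# Sketch: reduce parabolic (polynomial) regression to linear regression by
# adding the powers x^2, ..., x^p as new components of the regressor, then
# solving an ordinary linear least-squares problem.
def fit_polynomial_regression(x, y, p):
    """Estimate beta_0, ..., beta_p in y = beta_0 + beta_1 x + ... + beta_p x^p."""
    # Design matrix with columns 1, x, x^2, ..., x^p (a Vandermonde matrix);
    # the model is linear in these augmented components.
    Z = np.vander(x, N=p + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

# Usage: noisy observations from a quadratic regression function.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 200)
y = 1.0 - 0.5 * x + 2.0 * x**2 + rng.normal(scale=0.1, size=x.size)
print(fit_polynomial_regression(x, y, p=2))  # roughly [1.0, -0.5, 2.0]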

References

[1] H. Cramér, "Mathematical methods of statistics", Princeton Univ. Press (1946)
[2] G.A.F. Seber, "Linear regression analysis", Wiley (1977)

Comments

The phrase "parabolic regression" is seldom used in the Western literature; one uses "polynomial regression" almost exclusively.

How to Cite This Entry:
Parabolic regression. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Parabolic_regression&oldid=48109
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article