Stochastic processes, interpolation of

The problem of estimating the values of a stochastic process $X(t)$ on some interval $a < t < b$ using its observed values outside this interval. Usually one has in mind the interpolation estimator $\widehat{X}(t)$ for which the mean-square error of interpolation is minimal among all estimators:

$$ {\mathsf E} | \widehat{X}(t) - X(t) |^{2} = \min; $$

the interpolation is called linear if one restricts attention to linear estimators. One of the first problems posed and solved was that of linear interpolation of the value $X(0)$ of a stationary sequence. This problem is analogous to the following one: In the space $L_{2}$ of square-integrable functions on the interval $-\pi < \lambda \leq \pi$, one must find the projection of $\phi(\lambda) \in L_{2}$ onto the subspace generated by the functions $e^{i \lambda k} \phi(\lambda)$, $k = \pm 1, \pm 2, \dots$. This problem has been greatly generalized in the theory of stationary stochastic processes (cf. Stationary stochastic process; [1], [2]). One application is the problem of interpolation of the stochastic process arising from the system

$$ L X(t) = Y(t), \quad t > t_{0}, $$

where $L$ is a linear differential operator of order $l$, and $Y(t)$, $t > t_{0}$, is a white noise process. For given initial values $X^{(k)}(t_{0})$, $k = 0, \dots, l - 1$, independent of the white noise, the optimal interpolation estimator $\widehat{X}(t)$, $a < t < b$, is the solution of the corresponding boundary value problem

$$ L^{*} L \widehat{X}(t) = 0, \quad a < t < b, $$

where $L^{*}$ denotes the formal adjoint operator, with boundary conditions

$$ \widehat{X}{}^{(k)}(s) = X^{(k)}(s), \quad k = 0, \dots, l - 1, $$

at the boundary points $s = a, b$. For systems of stochastic differential equations the problem of interpolation of some components given the values of other observed components reduces to similar interpolation equations. (See [3].)
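
To make the projection description of linear interpolation concrete, the following sketch (not part of the original article) computes the best linear estimator of $X(0)$ from a finite window of neighbouring values $X(k)$, $k = \pm 1, \dots, \pm N$, by solving the normal equations; the covariance function $R(n) = \rho^{|n|}$ and the window size $N$ are hypothetical choices made purely for illustration.

```python
# A minimal numerical sketch, not part of the original article: linear
# least-squares interpolation of the missing value X(0) of a stationary
# sequence from finitely many neighbours X(k), k = ±1, ..., ±N.
# The covariance R(n) = rho**|n| and the window size N are assumptions.
import numpy as np

rho, N = 0.8, 5                                  # assumed covariance decay and window size
ks = [k for k in range(-N, N + 1) if k != 0]     # observed lags k = ±1, ..., ±N

def R(n):
    """Covariance function of the stationary sequence (assumed Markov-type)."""
    return rho ** abs(n)

# Normal equations Gamma a = gamma: Gamma_{ij} = R(k_i - k_j) collects the
# covariances among the observed values, gamma_i = R(k_i) their covariances
# with the unobserved X(0).
Gamma = np.array([[R(ki - kj) for kj in ks] for ki in ks])
gamma = np.array([R(k) for k in ks])
a = np.linalg.solve(Gamma, gamma)                # interpolation coefficients

# The estimator is X_hat(0) = sum_i a_i X(k_i); by the projection formula its
# mean-square interpolation error is R(0) - gamma . a.
mse = R(0) - gamma @ a
print("coefficients:", np.round(a, 4))
print("mean-square interpolation error:", round(float(mse), 6))
```

For this Markov-type covariance the solution puts weight $\rho / (1 + \rho^{2})$ on each of $X(-1)$ and $X(1)$ and (numerically) zero on the remaining lags, with mean-square error $(1 - \rho^{2})/(1 + \rho^{2})$, the classical answer for interpolation of a Markov sequence.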
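
For the simplest choice of operator the boundary value problem can be solved explicitly and checked numerically. The sketch below (again not part of the original article) takes $L = d/dt$, order $l = 1$, so that $L^{*} = -d/dt$, $L^{*}L = -d^{2}/dt^{2}$, and the interpolation equation on $(a, b)$ becomes $\widehat{X}{}''(t) = 0$ with the matching conditions $\widehat{X}(a) = X(a)$, $\widehat{X}(b) = X(b)$; the interval, grid size and boundary values are illustrative assumptions.

```python
# A minimal sketch, not from the article: the interpolation boundary value
# problem for the simplest operator L = d/dt (order l = 1) driven by white
# noise, so that L*L = -d^2/dt^2 and the equation on (a, b) reduces to
# X_hat'' = 0 with X_hat matching the observed values at s = a and s = b.
# The interval, grid size and boundary values are illustrative assumptions.
import numpy as np

a, b, n = 0.0, 1.0, 101              # interval and number of grid points (assumed)
Xa, Xb = 0.3, -0.7                   # observed values X(a), X(b) (illustrative)

t = np.linspace(a, b, n)
h = t[1] - t[0]

# Finite-difference discretization of -X_hat''(t) = 0 at the interior points;
# the boundary conditions X_hat(a) = Xa, X_hat(b) = Xb enter the right-hand side.
m = n - 2
A = np.zeros((m, m))
rhs = np.zeros(m)
for i in range(m):
    A[i, i] = 2.0 / h**2
    if i > 0:
        A[i, i - 1] = -1.0 / h**2
    if i < m - 1:
        A[i, i + 1] = -1.0 / h**2
rhs[0] += Xa / h**2
rhs[-1] += Xb / h**2

X_hat = np.empty(n)
X_hat[0], X_hat[-1] = Xa, Xb
X_hat[1:-1] = np.linalg.solve(A, rhs)

# For L = d/dt the interpolator is the straight line joining the observed
# endpoint values -- the conditional mean of a Brownian bridge.
assert np.allclose(X_hat, Xa + (Xb - Xa) * (t - a) / (b - a), atol=1e-8)
```

The computed interpolant is the straight line joining the observed endpoint values, i.e. the conditional mean of a Brownian bridge, in agreement with the Markov property of the process defined by $LX = Y$.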

References

[1] A.N. Kolmogorov, "Stationary sequences in Hilbert space", Byull. Moskov. Gos. Univ. Sekt. A, 2:6 (1941) pp. 1–40 (In Russian)
[2] Yu.A. Rozanov, "Stationary stochastic processes", Holden-Day (1967) (Translated from Russian)
[3] R.S. Liptser, A.N. Shiryaev, "Statistics of stochastic processes", 1–2, Springer (1977–1978) (Translated from Russian)

Comments

The interpolation problem is usually defined as the estimation of an unobserved stochastic process on some time interval given a related stochastic process that is observed outside this time interval. One distinguishes two special cases: 1) linear least-squares interpolation, in which the estimator is constrained to be linear and minimizes a least-squares criterion, see [a1], [a3]; and 2) interpolation in which the conditional distribution of the unobserved process given the observations is determined, see [a2].

For the Western literature on interpolation see [a5], Sect. 5.3 and [a1], Sect. 4.13. Additional Russian references that have been translated are [a6]; [a7], Sect. 37. For recent developments using stochastic realization theory see [a3], [a4]. Results for the interpolation problem may also be deduced from those for the smoothing problem [a2].

References

[a1] H. Dym, H.P. McKean, "Gaussian processes, function theory, and the inverse spectral problem", Acad. Press (1976)
[a2] E. Pardoux, "Equations du filtrage nonlinéaire, de la prédiction et du lissage", Stochastics, 6 (1982) pp. 193–231
[a3] M. Pavon, "New results on the interpolation problem for continuous-time stationary increment processes", SIAM J. Control Optim., 22 (1984) pp. 133–142
[a4] M. Pavon, "Optimal interpolation for linear stochastic systems", SIAM J. Control Optim., 22 (1984) pp. 618–629
[a5] N. Wiener, "Extrapolation, interpolation, and smoothing of stationary time series: with engineering applications", M.I.T. (1949)
[a6] A.M. Yaglom, "Extrapolation, interpolation and filtration of stationary random processes with rational spectral density", Amer. Math. Soc. Sel. Transl. Math. Statist., 4 (1963) pp. 345–387 (translation of Tr. Moskov. Mat. Obshch., 4 (1955) pp. 333–374)
[a7] A.M. Yaglom, "An introduction to the theory of stationary random functions", Prentice-Hall (1962) (Translated from Russian)
How to Cite This Entry:
Stochastic processes, interpolation of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Stochastic_processes,_interpolation_of&oldid=17488
This article was adapted from an original article by Yu.A. Rozanov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article