Gaussian process

From Encyclopedia of Mathematics
Latest revision as of 19:41, 5 June 2020


2020 Mathematics Subject Classification: Primary: 60G15

A real stochastic process $ X = X( t) $, $ t \in T $, all finite-dimensional distributions of which are Gaussian, i.e. for any $ t _ {1} \dots t _ {n} \in T $ the characteristic function of the joint probability distribution of the random variables $ X( t _ {1} ) \dots X( t _ {n} ) $ has the form

$$ \phi _ {t _ {1} \dots t _ {n} } ( u _ {1} \dots u _ {n} ) = $$

$$ = \ \mathop{\rm exp} \left \{ i \sum _ {k = 1 } ^ { n } A ( t _ {k} ) u _ {k} - { \frac{1}{2} } \sum _ {k, j = 1 } ^ { n } B ( t _ {k} , t _ {j} ) u _ {k} u _ {j} \right \} , $$

where $ A( t) = {\mathsf E} X( t) $ is the mathematical expectation and

$$ B ( t, s) = {\mathsf E} [ X ( t) - A ( t)] [ X ( s) - A ( s)] $$

is the covariance function. The probability distribution of a Gaussian process $ X = X( t) $ is completely determined by its mathematical expectation $ A( t) $ and by the covariance function $ B( t, s) $, $ s, t \in T $. For any function $ A( t) $ and any positive-definite function $ B( t, s) $ there exists a Gaussian process $ X( t) $ with expectation $ A( t) $ and covariance function $ B( t, s) $. A multi-dimensional stochastic process with vector values

$$ X ( t) = \{ X _ {1} ( t) \dots X _ {m} ( t) \} $$

is called Gaussian if the joint probability distributions of arbitrary variables

$$ X _ {i _ {1} } ( t _ {1} ) \dots X _ {i _ {n} } ( t _ {n} ) $$

are Gaussian.
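The existence statement above is constructive on a finite grid $ t _ {1} , \dots, t _ {n} $: the matrix $ K _ {jk} = B( t _ {j} , t _ {k} ) $ is positive definite, so a sample of $ ( X( t _ {1} ), \dots, X( t _ {n} )) $ can be drawn as $ A + LZ $, where $ L L ^ {T} = K $ is a Cholesky factorization and $ Z $ is a vector of independent standard normal variables. A minimal sketch (not from the article; the Brownian-motion kernel $ B( t, s) = \min ( t, s) $ and zero mean are illustrative choices):

```python
import math, random

def cholesky(K):
    # Lower-triangular L with L L^T = K, for K symmetric positive definite.
    n = len(K)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(K[i][i] - s)
            else:
                L[i][j] = (K[i][j] - s) / L[j][j]
    return L

# Illustrative choices: mean A(t) = 0 and the Brownian-motion covariance
# B(t, s) = min(t, s), which is positive definite on distinct positive times.
ts = [0.25, 0.5, 0.75, 1.0]
K = [[min(t, s) for s in ts] for t in ts]
L = cholesky(K)

# One sample path of the process on the grid: X = A + L z with z ~ N(0, I).
z = [random.gauss(0.0, 1.0) for _ in ts]
path = [sum(L[i][k] * z[k] for k in range(len(ts))) for i in range(len(ts))]
```

Only the positive definiteness of $ B $ is used, so the same construction works for any admissible pair $ ( A, B) $ on any finite grid.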

A complex Gaussian process $ X = X( t) $, $ t \in T $, is a process of the form

$$ X ( t) = X _ {1} ( t) + iX _ {2} ( t), $$

in which $ X _ {1} ( t) $, $ X _ {2} ( t) $ jointly form a two-dimensional real Gaussian process. Regarding a complex Gaussian process $ X( t) = X _ {1} ( t) + i X _ {2} ( t) $ one additional stipulation is imposed:

$$ {\mathsf E} X ( s) X ( t) = A ( s) A ( t), $$

where

$$ A ( t) = {\mathsf E} X ( t). $$

This condition is introduced in order to preserve the equivalence between non-correlation and independence, which is a property of ordinary Gaussian random variables. It may be rewritten as follows:

$$ {\mathsf E} [ X _ {1} ( t) - A _ {1} ( t)] [ X _ {1} ( s) - A _ {1} ( s)] = $$

$$ = \ {\mathsf E} [ X _ {2} ( t) - A _ {2} ( t)] [ X _ {2} ( s) - A _ {2} ( s)] = { \frac{1}{2} } \mathop{\rm Re} B ( t, s), $$

$$ {\mathsf E} [ X _ {1} ( t) - A _ {1} ( t)] [ X _ {2} ( s) - A _ {2} ( s) ] = - { \frac{1}{2} } \mathop{\rm Im} B ( t, s), $$

where

$$ B ( t, s) = {\mathsf E} [ X ( t) - A ( t)] \overline{[ X ( s) - A ( s)]} $$

is the covariance function of the process $ X( t) $ and

$$ A _ {1} ( t) = {\mathsf E} X _ {1} ( t),\ \ A _ {2} ( t) = {\mathsf E} X _ {2} ( t). $$
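The displayed identities are exactly what makes the pseudo-covariance $ {\mathsf E} [ X( t) - A( t)][ X( s) - A( s)] $ vanish, while the ordinary covariance still recovers $ B( t, s) $. A small numerical sketch (not from the article; the value chosen for $ B( t, s) $ is arbitrary):

```python
# Fix a pair of times (t, s) and a value b = B(t, s); build the four real
# cross-moments of the centred parts Y1 = X1 - A1, Y2 = X2 - A2 exactly as
# prescribed by the identities in the text.
b_re, b_im = 1.3, -0.4                 # illustrative B(t, s) = b_re + i*b_im

m11 = 0.5 * b_re                       # E Y1(t) Y1(s)
m22 = 0.5 * b_re                       # E Y2(t) Y2(s)
m12 = -0.5 * b_im                      # E Y1(t) Y2(s)
m21 = 0.5 * b_im                       # E Y2(t) Y1(s), from B(s, t) = conj B(t, s)

# Pseudo-covariance E[(X(t) - A(t)) (X(s) - A(s))], expanding X = X1 + i X2:
pseudo = (m11 - m22) + 1j * (m12 + m21)
assert abs(pseudo) < 1e-12             # i.e. E X(s) X(t) = A(s) A(t)

# The covariance E[(X(t) - A(t)) conj(X(s) - A(s))] recovers B(t, s):
cov = (m11 + m22) + 1j * (m21 - m12)
assert abs(cov - complex(b_re, b_im)) < 1e-12
```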

A linear generalized stochastic process $ X = \langle u , X \rangle $, $ u \in U $, on a linear space $ U $ is called a generalized Gaussian process if its characteristic functional $ \phi _ {X} ( u ) $ has the form

$$ \phi _ {X} ( u) = e ^ {iA ( u) - B ( u, u) /2 } ,\ \ u \in U , $$

where $ A( u ) = {\mathsf E} \langle u , X\rangle $ is the mathematical expectation of the generalized process $ X = \langle u , X\rangle $ and

$$ B ( u , v) = \ {\mathsf E} [ \langle u , X\rangle - A ( u)] [ \langle v, X\rangle - A ( v)] $$

is its covariance functional.
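In finite dimensions the formula for $ \phi _ {X} ( u) $ can be checked directly: for a Gaussian vector $ X $ in $ \mathbf R ^ {2} $, the variable $ \langle u , X\rangle $ is scalar Gaussian with mean $ A( u) $ and variance $ B( u , u) $, so $ {\mathsf E} e ^ {i \langle u , X\rangle } $ reduces to a one-dimensional integral against a normal density. A numerical sketch (all concrete numbers below are illustrative):

```python
import cmath, math

# Illustrative Gaussian vector X in R^2: mean vector a, covariance matrix S.
a = [1.0, -0.5]
S = [[2.0, 0.6],
     [0.6, 1.0]]
u = [0.7, 0.3]

# A(u) = E <u, X> and B(u, u) = Var <u, X>.
A_u = sum(ui * ai for ui, ai in zip(u, a))
B_uu = sum(u[i] * S[i][j] * u[j] for i in range(2) for j in range(2))

# Characteristic functional phi_X(u) = exp(i A(u) - B(u, u)/2).
phi = cmath.exp(1j * A_u - 0.5 * B_uu)

# Direct computation: <u, X> ~ N(A(u), B(u, u)); integrate exp(i x) against
# its density with a plain Riemann sum (grid parameters illustrative).
n, half = 40000, 10.0
sd = math.sqrt(B_uu)
dx = 2 * half * sd / n
total = 0j
x = A_u - half * sd
for _ in range(n):
    dens = math.exp(-((x - A_u) ** 2) / (2 * B_uu)) / (sd * math.sqrt(2 * math.pi))
    total += cmath.exp(1j * x) * dens * dx
    x += dx

assert abs(total - phi) < 1e-6
```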

Let $ U $ be a Hilbert space with scalar product $ ( u , v) $, $ u , v \in U $. A random variable $ X $ with values in $ U $ is called Gaussian if $ X = \langle u , X\rangle $, $ u \in U $, is a generalized Gaussian process. The mathematical expectation $ A( u) $ is a continuous linear functional, while the covariance function $ B( u , v) $ is a continuous bilinear functional on the Hilbert space $ U $, and

$$ B ( u , v) = ( Bu , v),\ \ u , v \in U, $$

where the positive operator $ B $ is a nuclear operator, called the covariance operator. For any such $ A( u ) $ and $ B( u , v) $ there exists a Gaussian variable $ X \in U $ such that the generalized process $ X = \langle u , X\rangle $, $ u \in U $, has expectation $ A( u ) $ and covariance functional $ B( u , v) $.
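The nuclearity of $ B $ can be seen concretely for Brownian motion on $ L ^ {2} [ 0, 1] $: its covariance operator has kernel $ \min ( t, s) $, whose eigenvalues are $ (( k - 1/2) \pi ) ^ {-2} $, $ k = 1, 2 ,\dots $, and their sum (the trace) is finite, equal to $ \int _ {0} ^ {1} \min ( t, t) dt = 1/2 $. A quick numerical check of this known spectrum:

```python
import numpy as np

# Eigenvalues of the covariance operator of Brownian motion on L^2[0, 1]
# (kernel min(t, s)): lambda_k = 1 / ((k - 1/2) * pi)^2.
k = np.arange(1, 200_001)
eigvals = 1.0 / ((k - 0.5) * np.pi) ** 2

# Nuclearity: the trace is finite and equals int_0^1 t dt = 1/2.
print(eigvals.sum())   # ≈ 0.5 (partial sums converge from below)
```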

Example. Let $ X = X( t) $ be a measurable Gaussian process on the segment $ T = [ a, b] $ such that

$$ \int\limits _ { a } ^ { b } {\mathsf E} [ X ( t)] ^ {2} dt < \infty . $$

Then almost all trajectories of $ X( t) $, $ t \in T $, belong to the space $ U $ of square-integrable functions $ u = u( t) $ on $ T $ with the scalar product

$$ ( u , v) = \ \int\limits _ { a } ^ { b } u ( t) v ( t) dt. $$

The formula

$$ \langle u , X\rangle = \ \int\limits _ { a } ^ { b } u ( t) X ( t) dt,\ \ u \in U, $$

defines a generalized Gaussian process on this space $ U $. The expectation and the covariance functional of the generalized process $ X = \langle u , X\rangle $ are expressed by the formulas

$$ A ( u) = \int\limits _ { a } ^ { b } u ( t) A ( t) dt, $$

$$ B ( u , v) = \int\limits _ { a } ^ { b } \int\limits _ { a } ^ { b } B ( t, s) u ( t) v ( s) dt ds, $$

where $ A( t) $ and $ B( t, s) $ are, respectively, the expectation and the covariance function of the initial process $ X = X( t) $ on $ T = [ a, b] $.
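For a concrete check of these formulas, take $ X( t) $ to be Brownian motion on $ [ 0, 1] $ (so $ A( t) = 0 $, $ B( t, s) = \min ( t, s) $) and $ u( t) \equiv 1 $; then $ A( u) = 0 $ and $ B( u , u) = \int _ {0} ^ {1} \int _ {0} ^ {1} \min ( t, s) dt ds = 1/3 $, which the following Monte-Carlo sketch reproduces:

```python
import numpy as np

rng = np.random.default_rng(2)

# Concrete case: X(t) = Brownian motion on T = [0, 1], u(t) = 1.
n, N = 500, 20_000
dt = 1.0 / n

# N Brownian paths sampled on the grid (cumulative sums of increments).
W = np.cumsum(rng.standard_normal((N, n)) * np.sqrt(dt), axis=1)

# <u, X> = int_0^1 u(t) X(t) dt, approximated by a Riemann sum.
uX = W.sum(axis=1) * dt

# Predicted moments: A(u) = 0 and B(u, u) = 1/3.
print(uX.mean())   # ≈ 0
print(uX.var())    # ≈ 1/3
```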

Almost all fundamental properties of a Gaussian process $ X = X( t) $ (the parameter $ t $ runs through an arbitrary set $ T $) may be expressed in geometrical terms if the process is considered as a curve in the Hilbert space $ H $ of all random variables $ Y $ with $ {\mathsf E} Y ^ {2} < \infty $, equipped with the scalar product $ ( Y _ {1} , Y _ {2} ) = {\mathsf E} Y _ {1} Y _ {2} $, for which

$$ ( X ( t), 1) = A ( t), $$

and

$$ ( X ( t) - A ( t), X ( s) - A ( s)) = B ( t, s). $$

Yu.A. Rozanov

Gaussian processes that are stationary in the narrow sense may be realized by certain dynamical systems (a shift in the space of trajectories [D]). The resulting dynamical systems (sometimes called normal, on account of their connection with the normal probability distributions) are of interest as examples of dynamical systems with a continuous spectrum whose properties can be studied rather exhaustively owing to the decomposition of $ H $ introduced in [I], [I2]. The first concrete examples of dynamical systems with  "non-classical"  spectral properties were constructed in this way.

References

[D] J.L. Doob, "Stochastic processes" , Chapman & Hall (1953) MR1570654 MR0058896 Zbl 0053.26802
[IR] I.A. Ibragimov, Yu.A. Rozanov, "Gaussian random processes" , Springer (1978) (Translated from Russian) MR0543837 Zbl 0392.60037
[CL] H. Cramér, M.R. Leadbetter, "Stationary and related stochastic processes" , Wiley (1967) Chapts. 33–34 MR0217860 Zbl 0162.21102
[I] K. Itô, "Multiple Wiener integral" J. Math. Soc. Japan , 3 : 1 (1951) pp. 157–169 MR0044064 Zbl 0044.12202
[I2] K. Itô, "Complex multiple Wiener integral" Japan J. Math. , 22 (1952) pp. 63–86 MR0063609 Zbl 0049.08602

D.V. Anosov

Comments

A Gaussian process is sometimes called a normal process. See Stationary stochastic process for details about stationary Gaussian processes.

Over the last twenty years the American and French schools have done substantial work on the regularity of the paths of a (real-valued) Gaussian process $ ( X _ {t} ) _ {t \in T } $ with respect to the (pseudo-)metric $ d $ on $ T $ defined by

$$ d ( s , t ) = \| X ( s) - X ( t) \| _ {L _ {2} } = \ [ B ( s , s ) - 2 B ( s , t ) + B ( t , t ) ] ^ {1/2} . $$

See [F] for a history and an exposition of the definitive results. This work also produced tools for studying (non-Gaussian) Banach-space valued stochastic processes.
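For Brownian motion, with $ B( t, s) = \min ( t, s) $, the canonical (pseudo-)metric reduces to $ d( s, t) = | t - s | ^ {1/2} $; a small sketch verifying this consequence of the formula above for a few pairs of points:

```python
# Canonical (pseudo-)metric d(s, t) = [B(s, s) - 2 B(s, t) + B(t, t)]^(1/2),
# specialized to the Brownian covariance B(t, s) = min(t, s).
def d(s, t, B=lambda a, b: min(a, b)):
    return (B(s, s) - 2 * B(s, t) + B(t, t)) ** 0.5

# For Brownian motion this equals |t - s|^(1/2); d(s, s) = 0 shows d is
# only a pseudo-metric in general.
for s, t in [(0.1, 0.5), (0.25, 0.36), (0.9, 0.9)]:
    print(d(s, t), abs(t - s) ** 0.5)
```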

References

[N] J. Neveu, "Processus aléatoires Gaussiens" , Univ. Montréal (1968) MR0272042 Zbl 0192.54701
[F] X. Fernique, "Fonctions aléatoires gaussiennes, les résultats de M. Talagrand" Astérisque , 145–146 (1987) pp. 177–186 (Exp. 660, Sém. Bourbaki 1985/86)
How to Cite This Entry:
Gaussian process. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Gaussian_process&oldid=14281
This article was adapted from an original article by Yu.A. Rozanov, D.V. Anosov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article