Gaussian process
2020 Mathematics Subject Classification: Primary: 60G15
A real stochastic process $ X = X( t) $, $ t \in T $, all finite-dimensional distributions of which are Gaussian, i.e. for any $ t _ {1} \dots t _ {n} \in T $ the characteristic function of the joint probability distribution of the random variables $ X( t _ {1} ) \dots X( t _ {n} ) $ has the form
$$ \phi _ {t _ {1} \dots t _ {n} } ( u _ {1} \dots u _ {n} ) = \mathop{\rm exp} \left \{ i \sum _ {k = 1 } ^ { n } A ( t _ {k} ) u _ {k} - \frac{1}{2} \sum _ {k, j = 1 } ^ { n } B ( t _ {k} , t _ {j} ) u _ {k} u _ {j} \right \} , $$
where $ A( t) = {\mathsf E} X( t) $ is the mathematical expectation and
$$ B ( t, s) = {\mathsf E} [ X ( t) - A ( t)] [ X ( s) - A ( s)] $$
is the covariance function. The probability distribution of a Gaussian process $ X = X( t) $ is completely determined by its mathematical expectation $ A( t) $ and by its covariance function $ B( t, s) $, $ s, t \in T $. For any function $ A( t) $ and any positive-definite function $ B( t, s) $ there exists a Gaussian process $ X( t) $ with expectation $ A( t) $ and covariance function $ B( t, s) $. A multi-dimensional stochastic process with vector values
$$ X ( t) = \{ X _ {1} ( t) \dots X _ {m} ( t) \} $$
is called Gaussian if the joint probability distributions of arbitrary variables
$$ X _ {i _ {1} } ( t _ {1} ) \dots X _ {i _ {n} } ( t _ {n} ) $$
are Gaussian.
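As a numerical illustration of this characterization (a minimal sketch: the zero mean and the squared-exponential covariance used below are illustrative assumptions, not part of the article), sampling a real Gaussian process at finitely many points $ t _ {1} \dots t _ {n} $ amounts to sampling a multivariate normal vector with mean vector $ A( t _ {k} ) $ and covariance matrix $ B( t _ {k} , t _ {j} ) $:

```python
import numpy as np

# Illustrative choices (assumptions): zero mean A(t) and a
# squared-exponential covariance B(t, s).
def A(t):
    return np.zeros_like(t)                      # mathematical expectation A(t)

def B(t, s):
    return np.exp(-0.5 * (t[:, None] - s[None, :]) ** 2)   # covariance B(t, s)

t = np.linspace(0.0, 5.0, 200)
mean = A(t)
cov = B(t, t) + 1e-10 * np.eye(len(t))           # small jitter for numerical stability

rng = np.random.default_rng(0)
paths = rng.multivariate_normal(mean, cov, size=3)   # three sample paths X(t_1), ..., X(t_n)
print(paths.shape)                                   # (3, 200)
```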
A complex Gaussian process $ X = X( t) $, $ t \in T $, is a process of the form
$$ X ( t) = X _ {1} ( t) + iX _ {2} ( t), $$
in which $ X _ {1} ( t) $, $ X _ {2} ( t) $ jointly form a two-dimensional real Gaussian process. On a complex Gaussian process $ X( t) = X _ {1} ( t) + i X _ {2} ( t) $ one additional condition is imposed:
$$ {\mathsf E} X ( s) X ( t) = A ( s) A ( t), $$
where
$$ A ( t) = {\mathsf E} X ( t). $$
This condition is introduced in order to preserve the equivalence between non-correlation and independence, a property of ordinary (real) Gaussian random variables. It may be rewritten as follows:
$$ {\mathsf E} [ X _ {1} ( t) - A _ {1} ( t)] [ X _ {1} ( s) - A _ {1} ( s)] = {\mathsf E} [ X _ {2} ( t) - A _ {2} ( t)] [ X _ {2} ( s) - A _ {2} ( s)] = \frac{1}{2} \mathop{\rm Re} B ( t, s), $$

$$ {\mathsf E} [ X _ {1} ( t) - A _ {1} ( t)] [ X _ {2} ( s) - A _ {2} ( s) ] = - \frac{1}{2} \mathop{\rm Im} B ( t, s), $$
where
$$ B ( t, s) = {\mathsf E} [ X ( t) - A ( t)] \overline{[ X ( s) - A ( s)]} $$
is the covariance function of the process $ X( t) $ and
$$ A _ {1} ( t) = {\mathsf E} X _ {1} ( t),\ \ A _ {2} ( t) = {\mathsf E} X _ {2} ( t). $$
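These relations can be checked numerically (a sketch under assumptions: the Hermitian covariance function $ B $ chosen below is illustrative, and the process is taken centred). Building the real two-dimensional process from the relations above and sampling it, the pseudo-covariance $ {\mathsf E} X( t) X( s) $ of the centred process is seen to vanish, while $ {\mathsf E} X( t) \overline{X( s)} $ reproduces $ B( t, s) $:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 4)
n = len(t)

# Illustrative Hermitian positive-definite covariance B(t, s) (an assumption).
B = np.exp(-np.abs(t[:, None] - t[None, :]) + 1j * (t[:, None] - t[None, :]))

# Real covariance of (X1, X2) implied by the relations in the text:
# Cov(X1, X1) = Cov(X2, X2) = Re B / 2, Cov(X1(t), X2(s)) = -Im B(t, s) / 2.
Sigma = 0.5 * np.block([[B.real, -B.imag],
                        [B.imag,  B.real]])

samples = rng.multivariate_normal(np.zeros(2 * n),
                                  Sigma + 1e-12 * np.eye(2 * n), size=200_000)
X = samples[:, :n] + 1j * samples[:, n:]          # centred complex process

pseudo_cov = X.T @ X / len(X)                     # estimates E[X(t) X(s)]
cov = X.conj().T @ X / len(X)                     # estimates E[conj(X(t)) X(s)] = conj(B(t, s))
print(np.max(np.abs(pseudo_cov)))                 # close to 0
print(np.max(np.abs(cov - np.conj(B))))           # close to 0
```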
A linear generalized stochastic process $ X = \langle u , X \rangle $, $ u \in U $, on a linear space $ U $ is called a generalized Gaussian process if its characteristic functional $ \phi _ {X} ( u ) $ has the form
$$ \phi _ {X} ( u) = e ^ {iA ( u) - B ( u, u) /2 } ,\ \ u \in U , $$
where $ A( u ) = {\mathsf E} \langle u , X\rangle $ is the mathematical expectation of the generalized process $ X = \langle u , X\rangle $ and
$$ B ( u , v) = \ {\mathsf E} [ \langle u , X\rangle - A ( u)] [ \langle v, X\rangle - A ( v)] $$
is its covariance functional.
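For a finite-dimensional Gaussian vector $ X \in \mathbf R ^ {n} $ with $ \langle u , X \rangle = \sum _ {k} u _ {k} X _ {k} $, this form of the characteristic functional can be verified by Monte Carlo (a minimal sketch; the particular mean vector, covariance matrix and test vector $ u $ below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
m = np.array([1.0, -0.5, 2.0])                     # mean vector, so A(u) = <u, m>
L = rng.standard_normal((n, n))
C = L @ L.T                                        # covariance matrix, so B(u, u) = u^T C u
u = np.array([0.3, -0.2, 0.1])                     # a test vector u

X = rng.multivariate_normal(m, C, size=500_000)
empirical = np.mean(np.exp(1j * X @ u))            # Monte Carlo estimate of E exp(i<u, X>)
theoretical = np.exp(1j * (u @ m) - 0.5 * u @ C @ u)
print(empirical, theoretical)                      # agree up to sampling error
```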
Let $ U $ be a Hilbert space with scalar product $ ( u , v) $, $ u , v \in U $. A random variable $ X $ with values in $ U $ is called Gaussian if $ X = \langle u , X\rangle $, $ u \in U $, is a generalized Gaussian process. The mathematical expectation $ A( u) $ is a continuous linear functional, while the covariance function $ B( u , v) $ is a continuous bilinear functional on the Hilbert space $ U $, and
$$ B ( u , v) = ( Bu , v),\ \ u , v \in U, $$
where the positive operator $ B $ is a nuclear operator, called the covariance operator. For any such $ A( u ) $ and $ B( u , v) $ there exists a Gaussian variable $ X \in U $ such that the generalized process $ X = \langle u , X\rangle $, $ u \in U $, has expectation $ A( u ) $ and covariance function $ B( u , v) $.
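A standard way to realize such a Gaussian variable (a sketch under assumptions, not the article's construction) is a series $ X = \sum _ {k} \sqrt {\lambda _ {k} } \xi _ {k} e _ {k} $ with independent standard Gaussian variables $ \xi _ {k} $, an orthonormal basis $ e _ {k} $ of eigenvectors of $ B $, and eigenvalues $ \lambda _ {k} $ satisfying $ \sum _ {k} \lambda _ {k} < \infty $, which is the nuclearity condition. A truncated version for $ U = L ^ {2} [ 0, 1] $:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 500)
K = 200                                            # truncation level (an assumption)

lam = 1.0 / (np.arange(1, K + 1) ** 2)             # trace class: sum(lam) < infinity
e = np.sqrt(2.0) * np.sin(np.pi * np.outer(np.arange(1, K + 1), t))  # ON basis of L^2[0,1]
xi = rng.standard_normal(K)                        # independent standard Gaussians

X = (np.sqrt(lam) * xi) @ e                        # one realisation X(t) in L^2[0,1]
dt = t[1] - t[0]
print(np.sum(X ** 2) * dt)                         # finite; its expectation is sum(lam), the trace of B
```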
Example. Let $ X = X( t) $ be a Gaussian process on the interval $ T = [ a, b] $, let the process $ X( t) $ be measurable, and suppose also that
$$ \int\limits _ { a } ^ { b } {\mathsf E} [ X ( t)] ^ {2} dt < \infty . $$
Then almost all trajectories of $ X( t) $, $ t \in T $, belong to the space of square-integrable functions $ u = u( t) $ on $ T $ with the scalar product
$$ ( u , v) = \ \int\limits _ { a } ^ { b } u ( t) v ( t) dt. $$
The formula
$$ \langle u , X\rangle = \ \int\limits _ { a } ^ { b } u ( t) X ( t) dt,\ \ u \in U, $$
defines a generalized Gaussian process on this space $ U $. The expectation and the covariance functional of the generalized process $ X = \langle u , X\rangle $ are expressed by the formulas
$$ A ( u) = \int\limits _ { a } ^ { b } u ( t) A ( t) dt, $$
$$ B ( u , v) = \int\limits _ { a } ^ { b } \int\limits _ { a } ^ { b } B ( t, s) u ( t) v ( s) dt ds, $$
where $ A( t) $ and $ B( t, s) $ are, respectively, the expectation and the covariance function of the initial process $ X = X( t) $ on $ T = [ a, b] $.
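These formulas can be checked by discretizing the integrals (a sketch; the particular $ A( t) $, $ B( t, s) $ and test function $ u $ below are illustrative assumptions): the Monte Carlo mean and variance of Riemann sums approximating $ \langle u , X \rangle $ agree with $ A( u) $ and $ B( u , u) $:

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, n = 0.0, 1.0, 100
t = np.linspace(a, b, n)
dt = t[1] - t[0]

A_t = np.sin(t)                                        # expectation A(t) (assumption)
B_ts = np.exp(-np.abs(t[:, None] - t[None, :]))        # covariance B(t, s) (assumption)
u = t ** 2                                             # a test function u in U (assumption)

paths = rng.multivariate_normal(A_t, B_ts + 1e-10 * np.eye(n), size=50_000)
uX = paths @ u * dt                                    # Riemann sums approximating <u, X>

print(uX.mean(), np.sum(u * A_t) * dt)                 # both approximate A(u)
print(uX.var(),  u @ B_ts @ u * dt * dt)               # both approximate B(u, u)
```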
Almost all fundamental properties of a Gaussian process $ X = X( t) $ (the parameter $ t $ runs through an arbitrary set $ T $) may be expressed in geometrical terms if the process is considered as a curve in the Hilbert space $ H $ of all random variables $ Y $ with $ {\mathsf E} Y ^ {2} < \infty $ and scalar product $ ( Y _ {1} , Y _ {2} ) = {\mathsf E} Y _ {1} Y _ {2} $, for which
$$ ( X ( t), 1) = A ( t), $$
and
$$ ( X ( t) - A ( t), X ( s) - A ( s)) = B ( t, s). $$
Yu.A. Rozanov
Gaussian processes that are stationary in the narrow sense may be realized by certain dynamical systems (a shift in the space of trajectories [D]). The dynamical systems obtained (sometimes called normal, on account of the resemblance to the normal probability distributions) are of interest as examples of dynamical systems with a continuous spectrum whose properties can be studied rather exhaustively owing to the decomposition of $ H $ introduced in [I], [I2]. The first concrete examples of dynamical systems with "non-classical" spectral properties were constructed in this way.
References
[D] J.L. Doob, "Stochastic processes", Chapman & Hall (1953) MR1570654 MR0058896 Zbl 0053.26802
[IR] I.A. Ibragimov, Yu.A. Rozanov, "Gaussian random processes", Springer (1978) (Translated from Russian) MR0543837 Zbl 0392.60037
[CL] H. Cramér, M.R. Leadbetter, "Stationary and related stochastic processes", Wiley (1967) Chapts. 33–34 MR0217860 Zbl 0162.21102
[I] K. Itô, "Multiple Wiener integral" J. Math. Soc. Japan, 3 : 1 (1951) pp. 157–169 MR0044064 Zbl 0044.12202
[I2] K. Itô, "Complex multiple Wiener integral" Japan J. Math., 22 (1952) pp. 63–86 MR0063609 Zbl 0049.08602
D.V. Anosov
Comments
A Gaussian process is sometimes called a normal process. See Stationary stochastic process for details about stationary Gaussian processes.
Over the last twenty years much work has been done by the American and French schools on the regularity of the paths of a (real-valued) Gaussian process $ ( X _ {t} ) _ {t \in T } $ with respect to the (pseudo-)metric $ d $ on $ T $ defined by
$$ d ( s , t ) = \| X ( s) - X ( t) \| _ {L _ {2} } = \ [ B ( s , s ) - 2 B ( s , t ) + B ( t , t ) ] ^ {1/2} . $$
See [F] for a history and an exposition of the definitive results. This work has also produced tools for studying (non-Gaussian) Banach space valued stochastic processes.
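As a small illustration (an assumption of this comment, not taken from [F]): for the Wiener process, with covariance $ B( s, t) = \min ( s, t) $ and zero mean, the formula gives $ d( s, t) = | s - t | ^ {1/2} $, which can be checked directly:

```python
import numpy as np

# Pseudo-metric d(s, t) computed from a covariance function B, as defined above.
def d(s, t, B):
    return np.sqrt(B(s, s) - 2.0 * B(s, t) + B(t, t))

B_wiener = lambda s, t: np.minimum(s, t)           # covariance of the Wiener process

s, t = 0.3, 0.7
print(d(s, t, B_wiener), abs(s - t) ** 0.5)        # both equal sqrt(0.4) = 0.6324...
```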
References
[N] J. Neveu, "Processus aléatoires gaussiens", Univ. Montréal (1968) MR0272042 Zbl 0192.54701
[F] X. Fernique, "Fonctions aléatoires gaussiennes, les résultats de M. Talagrand" Astérisque, 145–146 (1987) pp. 177–186 (Exp. 660, Sém. Bourbaki 1985/86)