Correlation function

of a real stochastic process $\{X(t) : t \in T\}$

The function of the arguments $t, s \in T$ defined by

$$ B(t, s) = \mathsf{E}[X(t) - \mathsf{E}X(t)][X(s) - \mathsf{E}X(s)]. $$

For the correlation function to be defined, it must be assumed that the process $X(t)$ has a finite second moment $\mathsf{E}X(t)^2$ for all $t \in T$. The parameter $t$ varies here over some subset $T$ of the real line; it is usually interpreted as "time", though an entirely analogous definition is possible for the correlation function of a stochastic field, where $T$ is a subset of a finite-dimensional space. If $\mathbf{X}(t) = [X_1(t), \dots, X_n(t)]$ is a multivariate stochastic process (stochastic function), then its correlation function is defined to be the matrix-valued function

$$ B(t, s) = \| B_{ij}(t, s) \|_{i, j = 1}^{n}, $$

where

$$ B_{ij}(t, s) = \mathsf{E}[X_i(t) - \mathsf{E}X_i(t)][X_j(s) - \mathsf{E}X_j(s)] $$

is the joint correlation function of the processes $X_i(t)$ and $X_j(t)$.
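
For illustration (this numerical sketch is not part of the original article), the correlation function can be estimated by averaging over independent sample paths; the choice of process (Brownian motion), the time grid and the number of paths below are assumptions made only for the example.

    # Hedged illustration: estimate B(t, s) by Monte Carlo averaging over sample paths.
    # Brownian motion is used because its correlation function is known: B(t, s) = min(t, s).
    import numpy as np

    rng = np.random.default_rng(0)
    n_paths, n_times = 2000, 50
    t = np.linspace(0.0, 1.0, n_times)

    # Gaussian increments build the paths; each row of X is an independent realization of X(t).
    dW = rng.normal(0.0, np.sqrt(np.diff(t, prepend=0.0)), size=(n_paths, n_times))
    X = np.cumsum(dW, axis=1)

    Xc = X - X.mean(axis=0)              # subtract the (estimated) mean function E X(t)
    B = Xc.T @ Xc / n_paths              # B[i, j] ~ E[X(t_i) - E X(t_i)][X(t_j) - E X(t_j)]

    print(B[30, 20], min(t[30], t[20]))  # the estimate should be close to min(t_i, t_j)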

The correlation function is an important characteristic of a stochastic process. If $X(t)$ is a Gaussian process, then its correlation function $B(t, s)$ and its mean value $\mathsf{E}X(t)$ (i.e. its first and second moments) uniquely determine its finite-dimensional distributions, and hence the process as a whole. In the general case, the first two moments are known to be insufficient for a full description of a stochastic process. For example, $B(t, s) = e^{-a|t - s|}$ is at one and the same time the correlation function of a stationary Gaussian Markov process whose trajectories are continuous, and the correlation function of the so-called telegraph signal, a stationary Markov process taking only the two values $\pm 1$. However, the correlation function does determine several important properties of a process: the so-called second-order properties (i.e. properties expressed in terms of second moments). In view of this, and also because of their relative simplicity, correlation methods are frequently employed both in the theory of stochastic processes and in its statistical applications (see Correlogram).
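
The contrast in the example above can be checked numerically. The following sketch (an illustration added here, not part of the original article) simulates a telegraph signal whose sign flips at the jump times of a Poisson process of rate $a/2$ and compares its empirical correlation function with $e^{-a|t - s|}$; the trajectories are piecewise constant, quite unlike the continuous trajectories of the Gaussian process with the same correlation function.

    # Hedged sketch: a random telegraph signal with Poisson flip rate a/2 has
    # correlation function e^{-a|tau|}; the parameters below are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    a, dt, n_times, n_paths = 2.0, 0.01, 400, 5000
    t = dt * np.arange(n_times)

    flips = rng.poisson(a / 2 * dt, size=(n_paths, n_times))   # number of flips per small time step
    x0 = rng.choice([-1.0, 1.0], size=(n_paths, 1))            # symmetric start => stationarity
    X = x0 * (-1.0) ** np.cumsum(flips, axis=1)                # paths taking only the values +-1

    emp = (X[:, :1] * X).mean(axis=0)                          # empirical E X(t_0) X(t_0 + tau)
    print(np.max(np.abs(emp - np.exp(-a * t))))                # small (Monte Carlo error only)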

The rate and nature of decrease of the correlations as $ | t - s | \rightarrow \infty $ provides an idea of the ergodic properties of a process. Conditions relating to the rate of decrease of correlations, in some form or another, appear in limit theorems for stochastic processes. Local second-order properties, such as mean-square continuity and differentiability, provide a useful — though extremely crude — characteristic of the local behaviour of a process. The properties of the trajectories in terms of the correlation function have been investigated to a considerable degree in the Gaussian case (see Sample function). One of the most complete branches of the theory of stochastic processes is the theory of linear extrapolation and filtration, which yields optimal linear algorithms for the prediction and approximation of stochastic processes; this theory is based on a knowledge of the correlation function.
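
As a small illustration of the last point (an added sketch, not from the original article, assuming a zero-mean process with correlation function $B(t, s) = e^{-a|t - s|}$ as in the example above), the optimal linear predictor is obtained directly from the correlation function by solving the normal equations:

    # Hedged sketch: best linear prediction of X(t_n + h) from X(t_1), ..., X(t_n),
    # using only the correlation function R(tau) = e^{-a|tau|}; parameters are illustrative.
    import numpy as np

    a, h = 1.5, 0.3
    t_obs = np.array([0.0, 0.4, 0.7, 1.0])          # observation times
    t_pred = t_obs[-1] + h

    R = lambda tau: np.exp(-a * np.abs(tau))
    Gamma = R(t_obs[:, None] - t_obs[None, :])      # matrix R(t_i - t_j)
    r = R(t_pred - t_obs)                           # vector R(t_pred - t_i)

    coef = np.linalg.solve(Gamma, r)                # weights of the optimal linear predictor
    print(coef)  # for this Markov-type correlation, only the last observation gets weight e^{-a h}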

A characteristic property of the correlation function is the fact that it is positive definite:

$$ \sum_{i, j = 1}^{n} c_i \overline{c}_j B(t_i, t_j) \geq 0, $$

for any $n$, any complex $c_1, \dots, c_n$ and any $t_1, \dots, t_n \in T$. In the most important case of a process that is stationary in the broad sense, $B(t, s)$ depends only on the difference of the arguments: $B(t, s) = R(t - s)$. The condition that it be positive definite then becomes

$$ \sum_{i, j = 1}^{n} c_i \overline{c}_j R(t_i - t_j) \geq 0. $$
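
Numerically, positive definiteness means that every matrix $\| R(t_i - t_j) \|$ is positive semi-definite. A quick check for the exponential correlation function (an illustrative sketch, not part of the original article):

    # Hedged check: for R(tau) = e^{-|tau|} the matrix R(t_i - t_j) has no negative
    # eigenvalues, which is equivalent to the displayed quadratic-form condition.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0.0, 10.0, size=8))      # arbitrary points t_1, ..., t_n
    M = np.exp(-np.abs(t[:, None] - t[None, :]))     # matrix R(t_i - t_j)

    eig = np.linalg.eigvalsh(M)                      # real eigenvalues of a symmetric matrix
    print(eig.min() >= -1e-12)                       # True (up to round-off)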

If $R(t)$ is also continuous at $t = 0$ (in other words, if the process $X(t)$ is mean-square continuous), then

$$ R(t) = \int e^{it\lambda} F(d\lambda), $$

where $F(d\lambda)$ is a finite positive measure; here $\lambda$ runs over the entire real line if $T = (-\infty, \infty)$ (the case of "continuous time"), or over the interval $[-\pi, \pi]$ if $T = \{\dots, -1, 0, 1, \dots\}$ (the case of "discrete time"). The measure $F(d\lambda)$ is known as the spectral measure of the stochastic process. Thus, the correlation and spectral properties of a stationary stochastic process prove to be closely related; for example, the rate of decrease of the correlations as $t \rightarrow \infty$ corresponds to the degree of smoothness of the spectral density $f(\lambda) = dF(\lambda)/d\lambda$ (when it exists), etc.
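
For the exponential correlation function $R(t) = e^{-a|t|}$ discussed above, the spectral measure is absolutely continuous with the Lorentzian density $f(\lambda) = a / [\pi(a^2 + \lambda^2)]$. The following sketch (an added numerical illustration, not part of the original article) recovers $R(t)$ from this density by evaluating the integral above:

    # Hedged numerical check: the spectral density of R(t) = e^{-a|t|} is the
    # Lorentzian f(lambda) = a / (pi (a^2 + lambda^2)); recover R(t) by integration.
    import numpy as np

    a = 1.0
    lam = np.linspace(-200.0, 200.0, 400001)          # truncation of the real line
    f = a / (np.pi * (a**2 + lam**2))                 # spectral density
    dlam = lam[1] - lam[0]

    for t in (0.0, 0.5, 2.0):
        R_num = ((np.exp(1j * t * lam) * f).sum() * dlam).real
        print(t, R_num, np.exp(-a * abs(t)))          # numerical value vs exact e^{-a|t|}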

In statistical mechanics, the term correlation function is also used for the joint probability density $\rho(x_1, \dots, x_m)$ of $m$ distinct particles of the system under consideration placed at the points $x_1, \dots, x_m$; the totality of these functions uniquely determines the corresponding discrete stochastic field.
