Random field (Statprob)
Copyright notice: This article Random field was adapted from an original article by Mikhail Moklyachuk, which appeared in StatProb: The Encyclopedia Sponsored by Statistics and Probability Societies ([http://statprob.com/encyclopedia/RandomField6.html StatProb Source]). The original article is copyrighted by the author(s); it has been donated to the Encyclopedia of Mathematics, and its further issues are under the Creative Commons Attribution Share-Alike License. All pages from StatProb are contained in the Category StatProb.
2020 Mathematics Subject Classification: Primary: 62M40 Secondary: 60G60
Keywords and Phrases: Random field; Kolmogorov Existence Theorem; Gaussian random field; Wiener sheet; Brownian sheet; Poisson random field; Markov random field; Homogeneous random field; Isotropic random field; Spectral decomposition
A random field $X(t)$ on $D\subset\mathbb R^n$ (i.e. $t\in D\subset\mathbb R^n$) is a function whose value at every point $t\in D$ is a random variable. The dimension $n$ of the coordinate is usually in the range from one to four, but any $n>0$ is possible. A one-dimensional random field is usually called a stochastic process. The term 'random field' is used to stress that the dimension of the coordinate is higher than one. Random fields in two and three dimensions are encountered in a wide range of sciences, especially in the earth sciences such as hydrology, agriculture, and geology. Random fields where $t$ is a position in space-time are studied in turbulence theory and in meteorology.
A random field $X(t)$ is described by its finite-dimensional (cumulative) distributions $$F_{t_1,\dots,t_k}(x_1,\dots,x_k)=P\{X(t_1)<x_1,\dots,X(t_k)<x_k\},\quad k=1,2,\dots$$ These cumulative distribution functions are by definition left-continuous and nondecreasing. Two requirements must be satisfied by the finite-dimensional distributions: the symmetry condition $$F_{t_1,\dots,t_k}(x_1,\dots,x_k)=F_{t_{\pi1},\dots,t_{\pi k}}(x_{\pi1},\dots,x_{\pi k}),$$ where $\pi$ is a permutation of the index set $\{1,\dots,k\}$, and the compatibility condition $$F_{t_1,\dots,t_{k-1}}(x_1,\dots,x_{k-1})= F_{t_1,\dots,t_k}(x_1,\dots,x_{k-1},\infty).$$
Kolmogorov Existence Theorem states: If a system of finite-dimensional distributions $F_{t_1,\dots,t_k}(x_1,\dots,x_k)$, $k=1,2,\dots$ satisfies the symmetry and compatibility conditions, then there exists on some probability space a random field $X(t)$, $t\in D$, having $F_{t_1,\dots,t_k}(x_1,\dots,x_k)$, $k=1,2,\dots$ as its finite-dimensional distributions.
The expectation (mean value) of a random field is by definition the Stieltjes integral $$ m(t)=EX(t)=\int_{\mathbb R^1}xdF_t(x). $$ The (auto-)covariance function is also expressed as the Stieltjes integral $$ B(t,s)=E(X(t)X(s))-m(t)m(s)=\iint_{\mathbb R^2}xydF_{t,s}(x,y)-m(t)m(s), $$ whereas the variance is $\sigma^2(t)=B(t,t)$.
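A minimal Python sketch (the sinusoidal test field, the grid and the sample sizes are illustrative assumptions, not part of the article) of how $m(t)$, $B(t,s)$ and $\sigma^2(t)$ may be estimated empirically from independent realizations of a field sampled on a finite grid:

```python
import numpy as np

rng = np.random.default_rng(0)

n_points, n_real = 50, 2000           # grid size and number of realizations
t = np.linspace(0.0, 1.0, n_points)   # 1-D grid of observation points

# Hypothetical field: X(t) = A*sin(2*pi*t) + small noise, with A ~ N(0, 1)
A = rng.standard_normal((n_real, 1))
X = A * np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((n_real, n_points))

m_hat = X.mean(axis=0)            # estimate of m(t) = E X(t)
B_hat = np.cov(X, rowvar=False)   # estimate of B(t, s)
sigma2_hat = np.diag(B_hat)       # estimate of the variance sigma^2(t) = B(t, t)
```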
Gaussian random fields play an important role for several reasons: the specification of their finite-dimensional distributions is simple, they are reasonable models for many natural phenomena, and their estimation and inference are simple.
A Gaussian random field is a random field all of whose finite-dimensional distributions are multivariate normal distributions. Since multivariate normal distributions are completely specified by expectations and covariances, it suffices to specify $m(t)$ and $B(t, s)$ in such a way that the symmetry condition and the compatibility condition hold true. The expectation can be chosen arbitrarily, but the covariance function must be positive definite to ensure the existence of all finite-dimensional distributions ([Ad, Adler and Taylor 2007]; [Pit, Piterbarg 1996]).
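Since a Gaussian random field is determined by its mean and a positive definite covariance, it can be simulated on a finite grid by a Cholesky factorization of the covariance matrix. Below is a minimal Python sketch; the squared-exponential covariance and the grid are illustrative assumptions, and any positive definite $B(t,s)$ could be substituted.

```python
import numpy as np

rng = np.random.default_rng(1)

grid = np.linspace(0.0, 1.0, 30)
T1, T2 = np.meshgrid(grid, grid)
pts = np.column_stack([T1.ravel(), T2.ravel()])       # grid points in R^2

# Hypothetical covariance B(t, s) = exp(-|t - s|^2 / (2 * ell^2))
ell = 0.2
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
B = np.exp(-d2 / (2 * ell ** 2)) + 1e-10 * np.eye(len(pts))  # jitter for stability

L = np.linalg.cholesky(B)                                    # B = L L^T
X = (L @ rng.standard_normal(len(pts))).reshape(T1.shape)    # one realization with m(t) = 0
```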
The Wiener sheet (Brownian sheet) is a Gaussian random field $W(t)$, $t=(t_1,t_2)\in\mathbb R_+^2$, with $EW(t)=0$ and correlation function $$ B(t,s)=E(W(t)W(s))=\min\{s_1,t_1\}\min\{s_2,t_2\}.$$ Analogously, the $n$-parameter Wiener process is a Gaussian random field $W(t)$, $t\in\mathbb R^n_+$, with $EW(t)=0$ and correlation function $ B(t,s)=\prod_{i=1}^n\min(s_i,t_i)$. The multiparameter Wiener process $W(t)$ has independent homogeneous increments. The generalized derivative of the multiparameter Wiener process $W(t)$ is a Gaussian white noise on $\mathbb R^n_+$ ([Ch, Chung and Walsh 2005]; [Khos, Khoshnevisan 2002]).
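A minimal sketch of simulating the Wiener sheet on $[0,1]^2$ (the grid size is an arbitrary choice): independent Gaussian cell increments with variance equal to the cell area are accumulated by partial summation, which reproduces the covariance $\min\{s_1,t_1\}\min\{s_2,t_2\}$ at the grid points.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 200
dt = 1.0 / n
# Each cell increment has variance dt * dt (the area of the cell)
increments = dt * rng.standard_normal((n, n))
# W[i, j] approximates W((i + 1) / n, (j + 1) / n)
W = increments.cumsum(axis=0).cumsum(axis=1)
```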
Poisson random fields are also reasonable models for many natural phenomena. A Poisson random field is an integer-valued (point) random field in which the (random) number of points falling in a bounded set from the range of values of the field has a Poisson distribution, and the numbers of points in nonoverlapping sets are mutually independent ([Ker, Kerstan et al. 1974]). Point random fields (Poisson random fields, Cox random fields, Poisson cluster random fields, Markov point random fields, homogeneous and isotropic point random fields, marked point random fields) are appropriate mathematical models for geostatistical data. A mathematically elegant approach to the analysis of point random fields (spatial point processes) was proposed by Noel A. C. Cressie ([Cr, Cressie 1991]).
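A minimal sketch of a homogeneous Poisson random field on a rectangle (the intensity and the rectangle sides are arbitrary choices): the total number of points is Poisson with mean equal to intensity times area, and given that number the points are independent and uniform.

```python
import numpy as np

rng = np.random.default_rng(3)

lam, a, b = 50.0, 2.0, 1.0                 # intensity and rectangle [0, a] x [0, b]
n_points = rng.poisson(lam * a * b)        # Poisson number of points
points = np.column_stack([rng.uniform(0, a, n_points),
                          rng.uniform(0, b, n_points)])

# Counts in nonoverlapping sets are independent Poisson variables,
# e.g. in the left and right halves of the rectangle:
left = int((points[:, 0] < a / 2).sum())
right = n_points - left
```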
A Markov random field $X(t)$, $t\in D\subset\mathbb R^n$, is a random function which has the Markov property with respect to a fixed system of ordered triples $(S_1,\Gamma,S_2)$ of nonoverlapping subsets of the domain of definition $D$. The Markov property means that for any measurable set $B$ from the range of values of the function $X(t)$ and every $t_0\in S_2$ the following equality holds true: $$ P\{X(t_0)\in B\vert X(t),t\in S_1\cup\Gamma\}= P\{X(t_0)\in B\vert X(t),t\in \Gamma\}. $$ This means that the future $S_2$ does not depend on the past $S_1$ when the present $\Gamma$ is given. Let, for example, $D=\mathbb R^n$, let $\{\Gamma\}$ be the family of all spheres in $\mathbb R^n$, let $S_1$ be the interior of $\Gamma$, and let $S_2$ be the exterior of $\Gamma$. A homogeneous and isotropic Gaussian random field $X(t)$, $t\in\mathbb R^n$, has the Markov property with respect to the ordered triples $(S_1,\Gamma,S_2)$ if and only if $X(t)=\xi$, where $\xi$ is a random variable. Nontrivial examples of homogeneous and isotropic Markov random fields can be constructed by considering generalized random fields. Markov random fields are completely described in the class of homogeneous Gaussian random fields on $\mathbb Z^n$, in the class of multidimensional homogeneous generalized Gaussian random fields on the space $\mathbb C_0^{\infty}(\mathbb R^m)$, and in the class of multidimensional homogeneous and isotropic generalized Gaussian random fields ([Gl, Glimm and Jaffe 1981]; [Roz, Rozanov 1982]; [Yadr, Yadrenko 1983]).
Gibbs random fields form a class of random fields that have extensive applications in the solution of problems of statistical physics. The distribution functions of these fields are determined by the Gibbs distribution ([Mal, Malyshev and Minlos 1985]).
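As an illustration of both the Markov property and the Gibbs specification, here is a minimal Gibbs sampler for the two-dimensional Ising model (the lattice size, inverse temperature $\beta$ and number of sweeps are arbitrary choices): each spin is resampled from its conditional distribution given only its four nearest neighbours.

```python
import numpy as np

rng = np.random.default_rng(4)

n, beta, sweeps = 32, 0.6, 100
spins = rng.choice([-1, 1], size=(n, n))

for _ in range(sweeps):
    for i in range(n):
        for j in range(n):
            # Sum of the four nearest neighbours (periodic boundary conditions)
            s = (spins[(i - 1) % n, j] + spins[(i + 1) % n, j]
                 + spins[i, (j - 1) % n] + spins[i, (j + 1) % n])
            # Conditional (Gibbs) probability of spin +1 given the neighbours
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
            spins[i, j] = 1 if rng.random() < p_plus else -1
```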
A homogeneous random field in the strict sense is a real-valued random function $X(t)$, $t\in\mathbb R^n$ (or $t\in\mathbb Z^n$), all of whose finite-dimensional distributions are invariant under arbitrary translations, i.e. $$F_{t_1+s,\dots,t_k+s}(x_1,\dots,x_k)=F_{t_1,\dots,t_k}(x_1,\dots,x_k)\quad\forall s\in\mathbb R^n.$$
A homogeneous random field in the wide sense is a real-valued random function $X(t)$, $t\in\mathbb R^n$ (or $t\in\mathbb Z^n$), with $E|X(t)|^2<+\infty$, such that $EX(t)=m=\text{const}$ and the correlation function $EX(t)X(s)=B(t-s)$ depends only on the difference $t-s$ of the coordinates of the points $t$ and $s$.
A homogeneous random field $X(t)$, $t\in\mathbb R^n$, with $EX(t)=0$ and $E|X(t)|^2<+\infty$, and its correlation function $ B(t)=EX(t+s)X(s)$ admit the spectral representations $$ X(t)=\int\cdots\int\exp\left\{i\sum_{k=1}^nt_k\lambda_k\right\}Z(d\lambda), $$ $$ B(t)=\int\cdots\int\exp\left\{i\sum_{k=1}^nt_k\lambda_k\right\}F(d\lambda), $$ where $F(d\lambda)$ is a measure on the Borel $\sigma$-algebra $B_n$ of sets from $\mathbb R^n$ and $Z(d\lambda)$ is an orthogonal random measure on $B_n$ such that $E(Z(S_1)\overline{Z(S_2)})=F(S_1\cap S_2)$. The integration range is $\mathbb R^n$ for a continuous-parameter random field $X(t)$, $t\in\mathbb R^n$, and $[-\pi,\pi]^n$ for a discrete-parameter random field $X(t)$, $t\in\mathbb Z^n$. If the spectral representation of the correlation function is of the form $$ B(t)= \int\cdots\int\exp\left\{i\sum_{k=1}^nt_k\lambda_k\right\}f(\lambda)\,d\lambda, $$ the function $f(\lambda)$ is called the spectral density of the field $X(t)$. Based on these spectral representations one can prove, for example, the law of large numbers for the random field $X(t)$:
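A minimal sketch of how the spectral representation is used in simulation: filtering white noise in the frequency domain yields (up to normalization) a homogeneous Gaussian field on a discrete torus whose spectral density is the chosen $f(\lambda)$; the particular density below is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 256
lam = 2 * np.pi * np.fft.fftfreq(n)           # frequencies in [-pi, pi)
L1, L2 = np.meshgrid(lam, lam, indexing="ij")

f = 1.0 / (0.5 + L1 ** 2 + L2 ** 2)           # hypothetical spectral density

noise = rng.standard_normal((n, n))
# Homogeneous field on the torus with spectral density proportional to f
X = np.fft.ifft2(np.sqrt(f) * np.fft.fft2(noise)).real
```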
In the mean-square sense, $$ \lim_{N\to\infty}\frac{1}{(2N+1)^n} \sum_{|t_i|\leq N,\,i=1,\dots,n}X(t)=Z\{0\}. $$ This limit is equal to $EX(t)=0$ if and only if $F\{0\}=E|Z\{0\}|^2=0$. In the case where $F\{0\}=0$ and $$ \int_{-\pi}^{\pi}\cdots\int_{-\pi}^{\pi} \prod_{i=1}^n \log\left|\log\frac{1}{|\lambda_i|}\right|F(d\lambda)<+\infty, $$ the strong law of large numbers holds true for the random field $X(t)$.
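A quick numerical illustration of the spatial averages converging to $EX(t)=0$; the moving-average field on $\mathbb Z$ below is an illustrative choice with $F\{0\}=0$.

```python
import numpy as np

rng = np.random.default_rng(6)

eps = rng.standard_normal(10_001)
X = 0.5 * (eps[1:] + eps[:-1])     # homogeneous, mean zero, short-range dependent

for N in (10, 100, 1000, 10_000):
    print(N, X[:N].mean())         # tends to 0 as N grows
```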
An isotropic random field is a real-valued random function $X(t)$, $t\in\mathbb R^n$, with $E|X(t)|^2<+\infty$, whose expectation and correlation function satisfy $EX(t)=EX(gt)$ and $EX(t)X(s)=EX(gt)X(gs)$ for all rotations $g$ about the origin of coordinates. An isotropic random field $X(t)$ admits the decomposition $$ X(t)=\sum_{m=0}^{\infty}\sum_{l=1}^{h(m,n)} X_m^l(r)S_m^l(\theta_1,\theta_2,\dots,\theta_{n-2},\varphi), $$ where $(r,\theta_1,\theta_2,\dots,\theta_{n-2},\varphi)$ are the spherical coordinates of the point $t\in\mathbb R^n$, $S_m^l(\theta_1,\theta_2,\dots,\theta_{n-2},\varphi)$ are spherical harmonics of degree $m$, $h(m,n)$ is the number of such harmonics, and $X_m^l(r)$ are uncorrelated stochastic processes such that $E(X_m^l(r) X_{m_1}^{l_1}(s))=b_m(r,s)\delta_{m}^{m_1}\delta_l^{l_1}$, where $\delta_i^j$ is the Kronecker symbol and $b_m(r,s)$ is a sequence of positive definite kernels such that $\sum_{m=0}^{\infty}h(m,n)b_m(r,s)<+\infty$, $b_m(0,s)=0$, $m\not=0$.
An isotropic random field $X(t)$, $t\in\mathbb R^2$, on the plane admits the decomposition $$ X(r,\varphi)=\sum_{m=0}^{\infty} \left\{X_m^1(r)\cos(m\varphi)+X_m^2(r)\sin(m\varphi)\right\}. $$ The class of isotropic random fields includes homogeneous and isotropic random fields as well as multiparameter Brownian motion processes.
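A minimal sketch of the planar decomposition: on a single circle of radius $r$ the coefficients $X_m^1(r)$ and $X_m^2(r)$ are the Fourier cosine and sine coefficients of $X(r,\varphi)$ in the angle and can be recovered with an FFT (the field values on the circle below are synthetic stand-ins).

```python
import numpy as np

rng = np.random.default_rng(7)

n_phi = 256
phi = 2 * np.pi * np.arange(n_phi) / n_phi
X_circle = rng.standard_normal(n_phi)   # stand-in for X(r, phi) on one circle

c = np.fft.rfft(X_circle) / n_phi
X1 = 2 * c.real                         # coefficients of cos(m * phi)
X2 = -2 * c.imag                        # coefficients of sin(m * phi)
X1[0] /= 2                              # the m = 0 term enters without the factor 2
X1[-1] /= 2                             # so does the Nyquist term (n_phi even)

# Check: the trigonometric series reproduces the sampled values
m = np.arange(len(X1))
recon = (X1 * np.cos(np.outer(phi, m)) + X2 * np.sin(np.outer(phi, m))).sum(axis=1)
assert np.allclose(recon, X_circle)
```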
A homogeneous and isotropic random field is a real-valued random function $X(t)$, $t\in\mathbb R^n$, with $E|X(t)|^2<+\infty$, whose expectation $EX(t)=c=\text{const}$ and whose correlation function $EX(t)X(s)=B(|t-s|)$ depends only on the distance $|t-s|$ between the points $t$ and $s$. A homogeneous and isotropic random field $X(t)$ and its correlation function $B(r)$ admit the spectral representations ([Yadr, Yadrenko 1983]) $$ X(t)=c_n\sum_{m=0}^{\infty}\sum_{l=1}^{h(m,n)} S_m^l(\theta_1,\theta_2,\dots,\theta_{n-2},\varphi) \int_0^{\infty}\frac{J_{m+(n-2)/2}(r\lambda)}{(r\lambda)^{(n-2)/2}} Z_m^l(d\lambda), $$ $$ B(r)= \int_0^{\infty}Y_{n}(r\lambda)d\Phi(\lambda), $$ where $$ Y_n(x)=2^{(n-2)/2}\Gamma\left(\frac{n}{2}\right)\frac{J_{(n-2)/2}(x)}{x^{(n-2)/2}} $$ is a spherical Bessel function, $\Phi(\lambda)$ is a bounded nondecreasing function called the spectral function of the field $X(t)$, $Z_m^l(d\lambda)$ are random measures with orthogonal values such that $E(Z_m^l(S_1)Z_{m_1}^{l_1}(S_2))=\delta_m^{m_1}\delta_l^{l_1}\Phi(S_1\cap S_2)$, and $c_n^2=2^{n-1}\Gamma(n/2)\pi^{n/2}$.
Homogeneous and isotropic random field $X(t)$, $t\in\mathbb R^2$, on the plane admits the spectral representation
$$ X(r,\varphi)=
\sum_{m=0}^{\infty}
\cos(m\varphi)\int_0^{\infty}J_m(r\lambda)Z_m^1(d\lambda)+
\sum_{m=1}^{\infty}
\sin(m\varphi)\int_0^{\infty}J_m(r\lambda)Z_m^2(d\lambda).
$$
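As a numerical illustration of the correlation representation $B(r)=\int_0^{\infty}Y_{n}(r\lambda)d\Phi(\lambda)$: for $n=2$ one has $Y_2(x)=J_0(x)$, and for a hypothetical discrete spectral measure $\Phi$ the integral reduces to a finite sum (the frequencies and masses below are illustrative).

```python
import numpy as np
from scipy.special import j0   # Bessel function J_0

lams = np.array([0.5, 1.0, 2.5])     # frequencies carrying the spectral mass
mass = np.array([1.0, 0.5, 0.25])    # Phi({lam}) at those frequencies

def B(r):
    """Correlation function B(r) of the isotropic field with spectral measure Phi."""
    return float(np.sum(mass * j0(r * lams)))

print([B(r) for r in np.linspace(0.0, 10.0, 5)])
```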
These spectral decompositions of random fields form a powerful tool for the solution of statistical problems for random fields, such as extrapolation, interpolation, filtering, and estimation of parameters of the distribution.
Estimation problems for random fields $X(t)$, $t\in\mathbb R^n$ (estimation of the unknown mathematical expectation, estimation of the correlation function, estimation of regression parameters, extrapolation, interpolation, filtering, etc.) are similar to the corresponding problems for stochastic processes (random fields of dimension one). Complications are usually caused by the form of the domain $D\subset\mathbb R^n$ of points $\{t_j\}$ at which the observations $\{X(t_j)\}$ are given, and by the dimension of the field. These complications are overcome by considering specific domains of observations and particular classes of random fields.
Suppose that observations of the random field $$ X(t)=\sum_{i=1}^q\theta_ig_i(t)+Y(t) $$ are given in a domain $D\subset\mathbb R^n$, where $g_i(t)$, $i=1,\dots,q$, are known non-random functions, $\theta_i$, $i=1,\dots,q$, are unknown parameters, and $Y(t)$ is a random field with $EY(t)=0$. The problem is to estimate the regression parameters $\theta_i$, $i=1,\dots,q$. This problem includes as a particular case ($q=1$, $g_1(t)=1$) the problem of estimating the unknown mathematical expectation. Linear unbiased least squares estimates of the regression parameters can be found by solving the corresponding linear algebraic equations or linear integral equations determined with the help of the correlation function. For the class of isotropic random fields, formulas for estimates of the regression parameters were proposed by M. I. Yadrenko ([Yadr, Yadrenko 1983]). For example, the estimate $\hat{\theta}$ of the unknown mathematical expectation ${\theta}$ of an isotropic random field $X(t)=X(r,u)$ from observations on the sphere $S_n(r)=\{x\in\mathbb R^n:\|x\|=r\}$ is of the form $$ \hat{\theta}=\frac{1}{\omega_n}\int_{S_n(r)}X(r,u)m_n(du),\quad n\geq2, $$ where $m_n(du)$ is the Lebesgue measure on the sphere $S_n(r)$, $\omega_n$ is the surface area of the sphere, and $(r,u)$ are the spherical coordinates of the point $t\in\mathbb R^n$.
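A minimal sketch of this estimate for $n=2$ (the true mean, the noise model and the number of observation points are hypothetical): the surface integral over the circle $S_2(r)$ is approximated by an average over equally spaced points.

```python
import numpy as np

rng = np.random.default_rng(8)

theta_true = 3.0
n_obs = 500
u = 2 * np.pi * np.arange(n_obs) / n_obs          # equally spaced directions on S_2(r)
X_obs = theta_true + rng.standard_normal(n_obs)   # stand-in for observations X(r, u)

# (1 / omega_2) * integral of X(r, u) over the circle, approximated by the average
theta_hat = X_obs.mean()
print(theta_hat)
```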
Consider the extrapolation problem. 1. Let observations of a
mean-square continuous homogeneous and isotropic random field
$X(t)$, $t\in\mathbb R^n$, be
given on the sphere $S_n(r)=\{x\in\mathbb R^n:\|x\|=r\}$.
The problem is to determine the optimal mean-square linear estimate
$\hat{X}(s)$ of the unknown value $X(s)$, $s\not\in S_n(r)$, of the
random field.
It follows from the spectral representation of the field that this
estimate is of the form
$$
\hat{X}(s)=\sum_{m=0}^{\infty}\sum_{l=1}^{h(m,n)}
c_m^l(s)
\int_0^{\infty}\frac{J_{m+(n-2)/2}(r\lambda)}{(r\lambda)^{(n-2)/2}}
Z_m^l(d\lambda),
$$
where
coefficients $c_m^l(s)$ are determined by a special algorithm ([Yadr, Yadrenko 1983]).
For practical purposes it is more convenient to have a formula in which
the observations $X(t)$, $t\in S_n(r)$, are used directly.
The addition theorem for spherical harmonics makes this possible.
We can write
$$
\hat{X}(s)=\int_{S_n(r)}c(s,t)X(t)dm_n(t),
$$
where the function $c(s,t)$ is determined by the spectral function
$\Phi(\lambda)$ of the field $X(t)$ ([Yadr, Yadrenko 1983]).
2. Let an isotropic random field $X(t)$, $t=(r,u)\in\mathbb R^n$, be
observed in the ball $V_R=\{x\in\mathbb R^n:\|x\|\leq R\}$.
The optimal linear estimate
$\hat{X}(s)$ of the unknown value $X(s)$, $s=(\rho,v)\not\in V_R$, of the
field has the form
$$
\hat{X}(s)=\int_{V_R}C(s,t)X(t)dm_n(t),
$$
$$
C(s,t)=\sum_{m=0}^{\infty}\sum_{l=1}^{h(m,n)}
c_m^l(r)S_m^l(u),
$$
where coefficients $c_m^l(r)$ are determined via special integral equations
$$
b_m(\rho,q)S_m^l(v)=\int_0^Rb_m(r,q)c_m^l(r)r^{n-1}dr,\quad\, m=0,1,\dots;\,\,l=1,2,\dots,h(m,n),\,
q\in[0,R].
$$
If, for example, $X(t),t=(r,u),$ is an isotropic random field where
$b_m(r,q)=a^{|m|}\exp\{-\beta|r-q|\}$,
then it is easy to see that
$\hat{X}(\rho,v)=\exp\{-\beta|\rho-R|\}X(R,v),\,v\in S_n$.
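A minimal sketch of this closed-form extrapolation (the value of $\beta$, the radius $R$ and the boundary observations are hypothetical): for $\rho>R$ the predicted value in direction $v$ is the observed boundary value in the same direction, damped by the factor $\exp\{-\beta(\rho-R)\}$.

```python
import numpy as np

beta, R = 0.7, 1.0

def extrapolate(X_on_sphere, rho):
    """Predict X(rho, v) for all directions v from the observations X(R, v)."""
    return np.exp(-beta * abs(rho - R)) * X_on_sphere

X_R = np.array([0.3, -1.2, 0.8])     # hypothetical observations X(R, v_i)
print(extrapolate(X_R, rho=1.5))
```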
For methods of solutions of other estimation problems for random fields (extrapolation, interpolation, filtering, etc) see [Cr, Cressie (1991)], [Gren, Grenander (1981)], [Mok, Moklyachuk (2008)], [Ramm, Ramm (2005)], [Ripl, Ripley (1981)], [Roz, Rozanov (1982)], [Yadr, Yadrenko (1983)], and [Yagla, Yaglom (1987)].
References
[Ad] Adler, Robert J. and Taylor, Jonathan E. (2007). Random fields and geometry. Springer Monographs in Mathematics. New York, NY: Springer.
[Ch] Chung, Kai Lai and Walsh, John B. (2005). Markov processes, Brownian motion, and time symmetry. 2nd ed. Grundlehren der Mathematischen Wissenschaften 249. New York: Springer.
[Cr] Cressie, Noel A. C. (1991). Statistics for spatial data. Wiley Series in Probability and Mathematical Statistics. New York etc.: John Wiley & Sons, Inc.
[Gl] Glimm, James and Jaffe, Arthur (1981). Quantum physics. A functional integral point of view. New York - Heidelberg - Berlin: Springer-Verlag.
[Gren] Grenander, Ulf (1981). Abstract inference. Wiley Series in Probability and Mathematical Statistics. New York etc.: John Wiley & Sons.
[Ker] Kerstan, Johannes; Matthes, Klaus and Mecke, Joseph (1974). Unbegrenzt teilbare Punktprozesse. Mathematische Lehrbücher und Monographien. II. Abt. Mathematische Monographien. Band XXVII. Berlin: Akademie-Verlag.
[Khos] Khoshnevisan, Davar (2002). Multiparameter processes. An introduction to random fields. Springer Monographs in Mathematics. New York, NY: Springer.
[Mal] Malyshev, V. A. and Minlos, R. A. (1985). Stochastic Gibbs fields. The method of cluster expansions. Moskva: "Nauka".
[Mok] Moklyachuk, M. P. (2008). Robust estimates of functionals of stochastic processes. Vydavnycho-Poligrafichnyĭ Tsentr, Kyïvskyĭ Universytet, Kyïv.
[Mona] Monin, A. S. and Yaglom, A. M. (2007a). Statistical fluid mechanics: mechanics of turbulence. Vol. I. Edited and with a preface by John L. Lumley. Mineola, NY: Dover Publications.
[Monb] Monin, A. S. and Yaglom, A. M. (2007b). Statistical fluid mechanics: mechanics of turbulence. Vol. II. Edited and with a preface by John L. Lumley. Mineola, NY: Dover Publications.
[Pit] Piterbarg, V. I. (1996). Asymptotic methods in the theory of Gaussian processes and fields. Translations of Mathematical Monographs 148. Providence, RI: AMS.
[Ramm] Ramm, A. G. (2005). Random fields estimation. Hackensack, NJ: World Scientific.
[Ripl] Ripley, B. D. (1981). Spatial statistics. Wiley Series in Probability and Mathematical Statistics. New York etc.: John Wiley & Sons, Inc.
[Roz] Rozanov, Yu. A. (1982). Markov random fields. New York - Heidelberg - Berlin: Springer-Verlag.
[Yadr] Yadrenko, M. I. (1983). Spectral theory of random fields. Translation Series in Mathematics and Engineering. New York: Optimization Software, Inc., Publications Division; New York - Heidelberg - Berlin: Springer-Verlag.
[Yagla] Yaglom, A. M. (1987a). Correlation theory of stationary and related random functions. Volume I: Basic results. Springer Series in Statistics. New York etc.: Springer-Verlag.
[Yaglb] Yaglom, A. M. (1987b). Correlation theory of stationary and related random functions. Volume II: Supplementary notes and references. Springer Series in Statistics. New York etc.: Springer-Verlag.
Reprinted with permission from Lovric, Miodrag (2011), International Encyclopedia of Statistical Science. Heidelberg: Springer Science +Business Media, LLC
Random field (Statprob). Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Random_field_(Statprob)&oldid=54179