{{TEX|done}}
A function of an arbitrary argument $t$ (defined on the set $T$ of its values, and taking numerical values or, more generally, values in a vector space) whose values are defined in terms of a certain experiment and may vary with the outcome of this experiment according to a given probability distribution. In [[Probability theory|probability theory]], attention centres on numerical (that is, scalar) random functions $X(t)$; a random vector function $\mathbf X (t)$ can be regarded as the aggregate of the scalar functions $X_\alpha (t)$, where $\alpha$ ranges over the finite or countable set $A$ of components of $\mathbf X$, that is, as a numerical random function on the set $T_1 = T \times A$ of pairs $(t, \alpha)$, $t \in T$, $\alpha \in A$.

When $T$ is finite, $X(t)$ is a finite set of random variables, and can be regarded as a multi-dimensional (vector) random variable characterized by a multi-dimensional distribution function. When $T$ is infinite, the case mostly studied is that in which $t$ takes numerical (real) values; in this case, $t$ usually denotes time, and $X(t)$ is called a [[Stochastic process|stochastic process]] or, if $t$ takes only integral values, a [[Random sequence|random sequence]] (or time series). If the values of $t$ are the points of a manifold (such as a $k$-dimensional Euclidean space $\mathbf R^k$), then $X(t)$ is called a [[Random field|random field]].

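As a minimal illustrative sketch of the finite case (assuming NumPy, with the Brownian-motion kernel $\min(s,t)$ chosen purely as an example), a random function on a finite set $T = \{t_1, \dots, t_n\}$ can be simulated by drawing a single multivariate normal vector:

<pre>
import numpy as np

# Illustrative sketch: a random function on a finite set T is a
# multi-dimensional (vector) random variable; here it is Gaussian.
rng = np.random.default_rng(0)

T = np.linspace(0.1, 1.0, 10)            # finite index set t_1, ..., t_n
cov = np.minimum.outer(T, T)             # example kernel min(s, t) (Brownian-motion type)
x = rng.multivariate_normal(np.zeros(len(T)), cov)   # one outcome of the experiment

print(dict(zip(T.round(2), x.round(3)))) # the values X(t_1), ..., X(t_n)
</pre>
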
The probability distribution of the values of a random function $X(t)$ defined on an infinite set $T$ is characterized by the aggregate of finite-dimensional probability distributions of sets of random variables $X(t_1), \dots, X(t_n)$ corresponding to all finite subsets $\{t_1, \dots, t_n\}$ of $T$, that is, the aggregate of the corresponding finite-dimensional distribution functions $F_{t_1, \dots, t_n}(x_1, \dots, x_n)$, satisfying the consistency conditions:

$$ \tag{1}
F_{t_1, \dots, t_n, t_{n+1}, \dots, t_{n+m}} (x_1, \dots, x_n, \infty, \dots, \infty) = F_{t_1, \dots, t_n}(x_1, \dots, x_n),
$$

$$ \tag{2}
F_{t_{i_1}, \dots, t_{i_n}}(x_{i_1}, \dots, x_{i_n}) = F_{t_1, \dots, t_n}(x_1, \dots, x_n),
$$

where $i_1, \dots, i_n$ is an arbitrary permutation of the subscripts $1, \dots, n$. This characterization of the probability distribution of $X(t)$ is sufficient in all cases when one is only interested in events depending on the values of $X$ on countable subsets of $T$. But it does not enable one to determine the probability of properties of $X$ that depend on its values on a continuous subset of $T$, such as the probability of continuity or differentiability, or the probability that $X(t) < a$ on a continuous subset of $T$ (see [[Separable process|Separable process]]).

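For a Gaussian family of finite-dimensional distribution functions, conditions (1) and (2) can be checked numerically. The sketch below is only an illustration: it assumes SciPy's multivariate normal CDF and an exponential covariance kernel chosen for the example; the helper name F is hypothetical.

<pre>
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical check of the consistency conditions for a Gaussian family:
# F_{t_1,...,t_n} is the CDF of a normal vector with covariance K(t_i, t_j).
def F(ts, xs, kernel=lambda s, t: np.exp(-abs(s - t))):
    ts = np.asarray(ts, dtype=float)
    cov = kernel(ts[:, None], ts[None, :])
    return multivariate_normal(mean=np.zeros(len(ts)), cov=cov).cdf(np.asarray(xs))

# Condition (1): sending the last argument to +infinity (here a large number)
# recovers the lower-dimensional distribution function.
print(F([0.2, 0.5, 0.9], [0.3, -0.1, 50.0]))   # ~ F_{t_1, t_2}(x_1, x_2)
print(F([0.2, 0.5], [0.3, -0.1]))

# Condition (2): permuting the pairs (t_i, x_i) leaves the value unchanged.
print(F([0.5, 0.2], [-0.1, 0.3]))
</pre>
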
Random functions can be described more generally in terms of aggregates of random variables $X = X(\omega)$ defined on a fixed [[Probability space|probability space]] $(\Omega, {\mathcal A}, {\mathsf P})$ (where $\Omega$ is a set of points $\omega$, ${\mathcal A}$ is a $\sigma$-algebra of subsets of $\Omega$ and ${\mathsf P}$ is a given probability measure on ${\mathcal A}$), one for each point $t$ of $T$. In this approach, a random function on $T$ is regarded as a function $X(t, \omega)$ of two variables $t \in T$ and $\omega \in \Omega$ which is ${\mathcal A}$-measurable for every $t$ (that is, for fixed $t$ it reduces to a random variable defined on the probability space $(\Omega, {\mathcal A}, {\mathsf P})$). By taking a fixed value $\omega_0$ of $\omega$, one obtains a numerical function $X(t, \omega_0) = x(t)$ on $T$, called a realization (or sample function or, when $t$ denotes time, a trajectory) of $X(t)$; ${\mathcal A}$ and ${\mathsf P}$ induce a $\sigma$-algebra of subsets and a probability measure defined on it in the function space $\mathbf R^T = \{x(t) : t \in T\}$ of realizations $x(t)$, whose specification can also be regarded as equivalent to that of the random function.

The specification of a random function as a probability measure on a $\sigma$-algebra of subsets of the function space $\mathbf R^T$ of all possible realizations $x(t)$ can be regarded as a special case of its general specification as a function of two variables $X(t, \omega)$ (where $\omega$ belongs to the probability space $(\Omega, {\mathcal A}, {\mathsf P})$ in which $\Omega = \mathbf R^T$), that is, elementary events (points $\omega$ in the given probability space) are identified at the outset with the realizations $x(t)$ of $X(t)$. On the other hand, it is also possible to show that any other way of specifying $X(t)$ can be reduced to this form using a special determination of a probability measure on $\mathbf R^T$. In particular, Kolmogorov's fundamental theorem on consistent distributions (see [[Probability space|Probability space]]) shows that the specification of the aggregate of all possible finite-dimensional distribution functions $F_{t_1, \dots, t_n}(x_1, \dots, x_n)$ satisfying the above consistency conditions (1) and (2) defines a probability measure on the $\sigma$-algebra of subsets of the function space $\mathbf R^T = \{x(t) : t \in T\}$ generated by the aggregate of cylindrical sets (cf. [[Cylinder set|Cylinder set]]) of the form $\{x(t) : [x(t_1), \dots, x(t_n)] \in B^n\}$, where $n$ is an arbitrary positive integer and $B^n$ is an arbitrary Borel set of the $n$-dimensional space $\mathbf R^n$ of vectors $[x(t_1), \dots, x(t_n)]$.

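An informal illustration of this construction (a hedged sketch, not a definitive implementation): specifying only a covariance kernel, here a squared-exponential kernel chosen arbitrarily, determines all finite-dimensional Gaussian distributions consistently; sampling them on finite subsets of $T$ yields approximate realizations $x(t)$ and Monte Carlo estimates of cylinder-set probabilities. The helper names K and sample_on below are hypothetical.

<pre>
import numpy as np

# Hypothetical sketch: a Gaussian random function specified only through its
# consistent finite-dimensional distributions (mean 0, covariance kernel K).
def K(s, t):
    return np.exp(-0.5 * (s - t) ** 2)       # example: squared-exponential kernel

def sample_on(ts, rng):
    """Draw [X(t_1), ..., X(t_n)] from the finite-dimensional distribution."""
    ts = np.asarray(ts, dtype=float)
    cov = K(ts[:, None], ts[None, :]) + 1e-8 * np.eye(len(ts))  # jitter for stability
    return rng.multivariate_normal(np.zeros(len(ts)), cov)

rng = np.random.default_rng(1)
grid = np.linspace(0.0, 5.0, 200)
x = sample_on(grid, rng)                     # one (approximate) realization x(t)

# A cylinder-set event: {x : (x(t_1), x(t_2)) in B} with B = (-1, 1) x (-1, 1),
# estimated by Monte Carlo over many realizations restricted to {t_1, t_2}.
hits = sum(np.all(np.abs(sample_on([1.0, 3.0], rng)) < 1.0) for _ in range(2000))
print(hits / 2000)
</pre>
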
For references see [[Stochastic process|Stochastic process]].

====Comments====

====References====
<table><TR><TD valign="top">[a1]</TD> <TD valign="top"> J.L. Doob, "Stochastic processes", Wiley (1953)</TD></TR><TR><TD valign="top">[a2]</TD> <TD valign="top"> M. Loève, "Probability theory", Springer (1977)</TD></TR><TR><TD valign="top">[a3]</TD> <TD valign="top"> I.I. [I.I. Gikhman] Gihman, A.V. [A.V. Skorokhod] Skorohod, "The theory of stochastic processes", '''1''', Springer (1974) (Translated from Russian)</TD></TR><TR><TD valign="top">[a4]</TD> <TD valign="top"> A. Blanc-Lapierre, R. Fortet, "Theory of random functions", '''1–2''', Gordon &amp; Breach (1965) (Translated from French)</TD></TR></table>