# Martingale

2010 Mathematics Subject Classification: Primary: 60G42 Secondary: 60G44 [MSN][ZBL]

A stochastic process $X = (X_t, {\mathcal F}_t)$, $t \in T \subseteq [0, \infty)$, defined on a probability space $(\Omega, {\mathcal F}, {\mathsf P})$ with a non-decreasing family of $\sigma$-algebras $({\mathcal F}_t)_{t \in T}$, ${\mathcal F}_s \subseteq {\mathcal F}_t \subseteq {\mathcal F}$, $s \leq t$, such that ${\mathsf E}|X_t| < \infty$, $X_t$ is ${\mathcal F}_t$-measurable and

$$\tag{1 } {\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) = X _ {s}$$

(with probability 1). In the case of discrete time, $T = \{1, 2, \dots\}$; in the case of continuous time, $T = [0, \infty)$. Related notions are stochastic processes which form a submartingale, if

$${\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) \geq X _ {s} ,$$

or a supermartingale, if

$${\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) \leq X _ {s} .$$

Example 1. If $\xi_1, \xi_2, \dots$ is a sequence of independent random variables with ${\mathsf E}\xi_j = 0$, then $X = (X_n, {\mathcal F}_n)$, $n \geq 1$, with $X_n = \xi_1 + \dots + \xi_n$ and ${\mathcal F}_n = \sigma\{\xi_1, \dots, \xi_n\}$ the $\sigma$-algebra generated by $\xi_1, \dots, \xi_n$, is a martingale.
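The defining property (1) can be illustrated by simulation: for the partial sums of Example 1, the increment $X_{n+1} - X_n$ should average to zero on any event determined by the first $n$ steps. A minimal Python sketch (the sample size and the choice of event are assumptions for illustration, not from the article):

```python
import random

random.seed(0)

# Monte Carlo check that the partial sums X_n = xi_1 + ... + xi_n of
# independent mean-zero steps satisfy the martingale property:
# E[(X_{n+1} - X_n) * 1_A] = 0 for any event A depending only on
# xi_1, ..., xi_n.  Here A = {X_3 > 0}, an F_3-measurable event.
n_paths = 200_000
increment_sum = 0.0   # accumulates (X_4 - X_3) over paths in A
count_A = 0
for _ in range(n_paths):
    xi = [random.choice([-1.0, 1.0]) for _ in range(4)]
    x3 = sum(xi[:3])
    if x3 > 0:                  # condition on the F_3-event A
        increment_sum += xi[3]  # increment X_4 - X_3 = xi_4
        count_A += 1

avg = increment_sum / count_A
print(f"E[X_4 - X_3 on A] ~ {avg:.4f}")  # should be close to 0
```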

Example 2. Let $Y = (Y_n, {\mathcal F}_n)$ be a martingale (submartingale), $V = (V_n, {\mathcal F}_n)$ a predictable sequence (that is, $V_n$ is not only ${\mathcal F}_n$-measurable but also ${\mathcal F}_{n-1}$-measurable, $n \geq 1$), ${\mathcal F}_0 = \{\emptyset, \Omega\}$, and let

$$(V \cdot Y)_n = V_1 Y_1 + \sum_{k=2}^{n} V_k \Delta Y_k, \qquad \Delta Y_k = Y_k - Y_{k-1}.$$

Then, if the variables $(V \cdot Y)_n$ are integrable, the stochastic process $((V \cdot Y)_n, {\mathcal F}_n)$ forms a martingale (submartingale). In particular, if $\xi_1, \xi_2, \dots$ is a sequence of independent random variables corresponding to a Bernoulli scheme

$${\mathsf P} \{ \xi _ {i} = \pm 1 \} = \frac{1}{2} ,\ \ Y _ {k} = \xi _ {1} + \dots + \xi _ {k} ,$$

$${\mathcal F}_k = \sigma\{\xi_1, \dots, \xi_k\},$$

and

$$\tag{2} V_k = \begin{cases} 2^{k-1} & \text{if } \xi_1 = \dots = \xi_{k-1} = -1, \\ 0 & \text{otherwise}, \end{cases}$$

then $((V \cdot Y)_n, {\mathcal F}_n)$ is a martingale. This stochastic process is a mathematical model of a game in which a player wins one unit of capital if $\xi_k = +1$ and loses one unit of capital if $\xi_k = -1$, and $V_k$ is the stake at the $k$-th game. The game-theoretic sense of the function $V_k$ defined by (2) is that the player doubles his stake after each loss and stops playing on his first win. In the gambling world such a system is called a martingale, which explains the origin of the mathematical term "martingale".
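The doubling system of Example 2 is easy to simulate. The sketch below (the cap on the number of rounds is an assumption needed to keep each run finite) shows that on almost every path the gambler's cumulative gain $(V \cdot Y)_n$ ends at exactly $+1$, while the rare all-loss paths carry a huge negative balance, which is why the transformed process is still a fair game:

```python
import random

random.seed(1)

# Doubling ("martingale") betting system: fair coin xi_k = +-1, stake
# V_k = 2^(k-1) while every previous toss was a loss, stake 0 after the
# first win.  play() returns the gambler's final gain (V.Y)_n.
def play(max_rounds):
    gain, stake = 0.0, 1.0
    for _ in range(max_rounds):
        xi = random.choice([-1, 1])
        gain += stake * xi
        if xi == 1:          # first win: stop betting
            return gain
        stake *= 2           # double the stake after each loss
    return gain              # still losing after max_rounds

# After k-1 losses and then a win the gain is exactly
# 2^(k-1) - (1 + 2 + ... + 2^(k-2)) = 1.
results = [play(20) for _ in range(10_000)]
wins = [g for g in results if g == 1.0]
print(len(wins), "of 10000 runs end with gain exactly +1")
```

A run that never wins within 20 rounds (probability $2^{-20}$) ends with gain $-(2^{20}-1)$, so the expected gain remains $0$.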

One of the basic facts of the theory of martingales is that the structure of a martingale (submartingale) $X = (X_t, {\mathcal F}_t)$ is preserved under a random change of time. A precise statement of this (called the optional sampling theorem) is the following: If $\tau_1$ and $\tau_2$ are two finite stopping times (cf. Markov moment), if ${\mathsf P}\{\tau_1 \leq \tau_2\} = 1$ and if

$$\tag{3} {\mathsf E}|X_{\tau_i}| < \infty, \qquad \liminf_{t \to \infty} \int\limits_{\{\tau_i > t\}} |X_t| \, d{\mathsf P} = 0,$$

then ${\mathsf E}(X_{\tau_2} \mid {\mathcal F}_{\tau_1}) = X_{\tau_1}$ with probability 1 (with $\geq$ in place of $=$ for a submartingale), where

$${\mathcal F} _ {\tau _ {1} } = \ \{ {A \in {\mathcal F} } : {A \cap \{ \tau _ {1} \leq t \} \in {\mathcal F} _ {t} \textrm{ for all } t \in T } \} .$$

As a particular case of this, for independent identically-distributed $\xi_1, \xi_2, \dots$ with ${\mathsf E}|\xi_1| < \infty$ and a stopping time $\tau$ with ${\mathsf E}\tau < \infty$, the Wald identity follows:

$${\mathsf E} ( \xi _ {1} + \dots + \xi _ \tau ) = {\mathsf E} \xi _ {1} {\mathsf E} \tau .$$
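The identity can be illustrated numerically. The sketch below uses assumed parameters (steps $\xi = +1$ with probability $0.6$, $-1$ with probability $0.4$, so ${\mathsf E}\xi = 0.2$, and $\tau$ the exit time of the walk from $(-5, 5)$, which has finite expectation):

```python
import random

random.seed(2)

# Monte Carlo illustration of the Wald identity
#   E[xi_1 + ... + xi_tau] = E[xi_1] * E[tau].
def run():
    s, t = 0, 0
    while -5 < s < 5:                 # tau = first exit time of (-5, 5)
        s += 1 if random.random() < 0.6 else -1
        t += 1
    return s, t                       # (S_tau, tau)

n = 50_000
samples = [run() for _ in range(n)]
mean_s_tau = sum(s for s, _ in samples) / n
mean_tau = sum(t for _, t in samples) / n
print(f"E[S_tau] ~ {mean_s_tau:.3f},  E[xi]*E[tau] ~ {0.2 * mean_tau:.3f}")
```

The two printed values agree up to Monte Carlo error, as the identity predicts.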

Among the basic results of the theory of martingales is Doob's inequality: If $X = ( X _ {n} , {\mathcal F} _ {n} )$ is a non-negative submartingale,

$$X _ {n} ^ {*} = \max _ {1 \leq j \leq n } X _ {j} ,$$

$$\| X _ {n} \| _ {p} = ( {\mathsf E} | X _ {n} | ^ {p} ) ^ {1/p} ,\ p \geq 1 ,\ n \geq 1 ,$$

then

$$\tag{4} {\mathsf P}\{X_n^* \geq \epsilon\} \leq \frac{{\mathsf E} X_n}{\epsilon},$$

$$\tag{5} \|X_n\|_p \leq \|X_n^*\|_p \leq \frac{p}{p-1} \|X_n\|_p, \qquad p > 1,$$

$$\tag{6} \|X_n^*\|_p \leq \frac{e}{e-1} \left[ 1 + \|X_n \ln^+ X_n\|_p \right], \qquad p = 1.$$
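The maximal inequality (4) can be checked empirically for a concrete non-negative submartingale, namely $X_n = |S_n|$ with $S_n$ a symmetric simple random walk. The parameters below ($n = 25$ steps, $\epsilon = 8$) are assumptions for illustration:

```python
import random

random.seed(3)

# Empirical check of Doob's inequality (4):
#   P{ max_{1<=j<=n} |S_j| >= eps }  <=  E|S_n| / eps
# for the non-negative submartingale X_n = |S_n|.
n_steps, eps, n_paths = 25, 8, 40_000
exceed = 0
abs_end_sum = 0.0
for _ in range(n_paths):
    s, running_max = 0, 0
    for _ in range(n_steps):
        s += random.choice([-1, 1])
        running_max = max(running_max, abs(s))
    if running_max >= eps:
        exceed += 1
    abs_end_sum += abs(s)

lhs = exceed / n_paths               # P{ X_n^* >= eps }
rhs = (abs_end_sum / n_paths) / eps  # E X_n / eps
print(f"P(X_n^* >= {eps}) ~ {lhs:.4f}  <=  E X_n / eps ~ {rhs:.4f}")
```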

If $X = ( X _ {n} , {\mathcal F} _ {n} )$ is a martingale, then for $p > 1$ the Burkholder inequalities hold (generalizations of the inequalities of Khinchin and Marcinkiewicz–Zygmund for sums of independent random variables):

$$\tag{7 } A _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} \leq \| X _ {n} \| _ {p} \leq B _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} ,$$

where $A _ {p}$ and $B _ {p}$ are certain universal constants (not depending on $X$ or $n$), for which one can take

$$A_p = \left( \frac{18 p^{3/2}}{p - 1} \right)^{-1}, \qquad B_p = \frac{18 p^{3/2}}{(p - 1)^{1/2}},$$

and

$$[X]_n = \sum_{i=1}^{n} (\Delta X_i)^2, \qquad X_0 = 0.$$

Taking (5) and (7) into account, it follows that

$$\tag{8 } A _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} \ \leq \| X _ {n} ^ {*} \| _ {p} \ \leq \widetilde{B} _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} ,$$

where

$$\widetilde{B} _ {p} = \ \frac{18 p ^ {5/2} }{( p - 1 ) ^ {3/2} } .$$

When $p = 1$ inequality (8) can be generalized. Namely, Davis' inequality holds: There are universal constants $A$ and $B$ such that

$$A \| \sqrt {[ X ] _ {n} } \| _ {1} \ \leq \| X _ {n} ^ {*} \| _ {1} \ \leq B \| \sqrt {[ X ] _ {n} } \| _ {1} .$$
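For the $\pm 1$ random walk the quadratic variation is $[X]_n = n$ exactly, so $\|\sqrt{[X]_n}\|_1 = \sqrt{n}$ and the Davis inequality predicts that ${\mathsf E} X_n^*$ is within universal constant factors of $\sqrt{n}$. A numerical illustration (the path count and $n$ are assumed parameters):

```python
import random

random.seed(7)

# Davis' inequality for the symmetric +-1 walk: [X]_n = n, so
# A * sqrt(n) <= E[max_{j<=n} |S_j|] <= B * sqrt(n) for universal A, B.
n_steps, n_paths = 400, 5_000
max_sum = 0.0
for _ in range(n_paths):
    s, m = 0, 0
    for _ in range(n_steps):
        s += random.choice([-1, 1])
        m = max(m, abs(s))
    max_sum += m

e_max = max_sum / n_paths            # estimate of ||X_n^*||_1
sqrt_n = n_steps ** 0.5              # ||sqrt([X]_n)||_1 exactly
print(f"E[X_n^*] ~ {e_max:.2f},  sqrt(n) = {sqrt_n:.2f}, "
      f"ratio ~ {e_max / sqrt_n:.3f}")
```

The observed ratio is a moderate constant (around $1.25$, the Brownian value $\sqrt{\pi/2}$), well inside the universal bounds.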

A key role in proofs of theorems on the convergence of submartingales with probability 1 is played by Doob's inequality for the mathematical expectation ${\mathsf E}\beta_n(a, b)$ of the number of upcrossings, $\beta_n(a, b)$, of the interval $[a, b]$ by the submartingale $X = (X_n, {\mathcal F}_n)$ in $n$ steps; namely

$$\tag{9 } {\mathsf E} \beta _ {n} ( a , b ) \leq \ \frac{ {\mathsf E} | X _ {n} | + | a | }{b - a } .$$
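The upcrossing bound (9) can be illustrated by counting upcrossings along simulated paths; the walk, interval, and sample sizes below are assumed parameters:

```python
import random

random.seed(4)

# Empirical check of the upcrossing inequality (9) for the symmetric
# random walk X_n = S_n and the interval [a, b] = [0, 2].
a, b, n_steps, n_paths = 0, 2, 50, 20_000

def upcrossings(path, a, b):
    """Number of times the path passes from a value <= a to a value >= b."""
    count, below = 0, path[0] <= a
    for x in path:
        if below and x >= b:
            count += 1
            below = False
        elif not below and x <= a:
            below = True
    return count

total_up, total_abs_end = 0, 0.0
for _ in range(n_paths):
    s, path = 0, [0]
    for _ in range(n_steps):
        s += random.choice([-1, 1])
        path.append(s)
    total_up += upcrossings(path, a, b)
    total_abs_end += abs(s)

lhs = total_up / n_paths                            # E[beta_n(a, b)]
rhs = (total_abs_end / n_paths + abs(a)) / (b - a)  # (E|X_n| + |a|)/(b - a)
print(f"E[beta_n] ~ {lhs:.3f}  <=  (E|X_n| + |a|)/(b - a) ~ {rhs:.3f}")
```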

The basic result on the convergence of submartingales is Doob's theorem: If $X = (X_n, {\mathcal F}_n)$ is a submartingale and $\sup_n {\mathsf E}|X_n| < \infty$, then with probability 1 the limit $\lim_{n \rightarrow \infty} X_n$ ($= X_\infty$) exists and ${\mathsf E}|X_\infty| < \infty$. If the submartingale $X$ is uniformly integrable, then, in addition to convergence with probability 1, convergence in $L_1$ holds, that is,

$${\mathsf E} | X _ {n} - X _ \infty | \rightarrow 0 ,\ n \rightarrow \infty .$$
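Doob's theorem can be watched in action on a standard example not drawn from the article: the Polya urn. Starting from one red and one blue ball, each drawn ball is returned together with one more of the same colour; the proportion of red balls is a martingale bounded in $[0, 1]$, so by Doob's theorem it converges with probability 1 (the limit is in fact uniformly distributed):

```python
import random

random.seed(5)

# Polya urn: the proportion X_n of red balls is a bounded martingale,
# hence converges with probability 1 by Doob's theorem.
def urn_path(n_steps):
    red, total = 1, 2
    path = [red / total]
    for _ in range(n_steps):
        if random.random() < red / total:  # draw a red ball
            red += 1
        total += 1                         # one ball is always added
        path.append(red / total)
    return path

path = urn_path(20_000)
tail = path[-1000:]   # late fluctuations should be tiny
print(f"X_20000 ~ {path[-1]:.4f}, "
      f"tail oscillation ~ {max(tail) - min(tail):.5f}")
```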

A corollary of this result is Lévy's theorem on the continuity of conditional mathematical expectations: If ${\mathsf E} | \xi | < \infty$, then

$${\mathsf E}(\xi \mid {\mathcal F}_n) \rightarrow {\mathsf E}(\xi \mid {\mathcal F}_\infty) \quad \text{(with probability 1 and in } L_1),$$

where ${\mathcal F} _ {1} \subseteq {\mathcal F} _ {2} \subseteq \dots$ and ${\mathcal F} _ \infty = \sigma ( \cup _ {n} {\mathcal F} _ {n} )$.

A natural generalization of a martingale is the concept of a local martingale, that is, a stochastic process $X = (X_t, {\mathcal F}_t)$ for which there is a sequence $(\tau_m)_{m \geq 1}$ of finite stopping times with $\tau_m \uparrow \infty$ (with probability 1) such that for each $m \geq 1$ the "stopped" processes

$$X ^ {\tau _ {m} } = \ ( X _ {t \wedge \tau _ {m} } I ( \tau _ {m} > 0 ) , {\mathcal F} _ {t} )$$

are martingales. In the case of discrete time each local martingale $X = ( X _ {n} , {\mathcal F} _ {n} )$ is a martingale transform, that is, can be represented in the form $X _ {n} = ( V \cdot Y ) _ {n}$, where $V$ is a predictable process and $Y$ is a martingale.

Each submartingale $X = (X_t, {\mathcal F}_t)$ has, moreover, a unique Doob–Meyer decomposition $X_t = M_t + A_t$, where $M = (M_t, {\mathcal F}_t)$ is a local martingale and $A = (A_t, {\mathcal F}_t)$ is a predictable non-decreasing process. In particular, if $m = (m_t, {\mathcal F}_t)$ is a square-integrable martingale, then its square $m^2 = (m_t^2, {\mathcal F}_t)$ is a submartingale, in whose Doob–Meyer decomposition $m_t^2 = M_t + \langle m \rangle_t$ the process $\langle m \rangle = (\langle m \rangle_t, {\mathcal F}_t)$ is called the quadratic characteristic of the martingale $m$. For each square-integrable martingale $m$ and predictable process $V = (V_t, {\mathcal F}_t)$ such that $\int_0^t V_s^2 \, d\langle m \rangle_s < \infty$ (with probability 1), $t > 0$, it is possible to define the stochastic integral

$$( V \cdot m ) _ {t} = \int\limits _ { 0 } ^ { t } V _ {s} d m _ {s} ,$$

which is a local martingale. In the case of a Wiener process $W = (W_t, {\mathcal F}_t)$, which is a square-integrable martingale with $\langle W \rangle_t = t$, the stochastic integral $(V \cdot W)_t$ is none other than the Itô stochastic integral with respect to the Wiener process.
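The Itô integral $(V \cdot W)_t$ can be sketched as a limit of left-endpoint Riemann sums $\sum V_{t_i}(W_{t_{i+1}} - W_{t_i})$; the left endpoint is what makes the integrand predictable and the integral a (local) martingale. The discretisation below takes $V_s = W_s$ as a concrete integrand (an assumption for illustration), for which Itô's formula gives $(V \cdot W)_t = (W_t^2 - t)/2$:

```python
import math
import random

random.seed(6)

# Left-endpoint Riemann-sum approximation of the Ito integral
# int_0^t W_s dW_s, compared with the Ito-formula value (W_t^2 - t)/2.
def ito_integral_of_W(t, n):
    dt = t / n
    w, integral = 0.0, 0.0
    for _ in range(n):
        dw = random.gauss(0.0, math.sqrt(dt))
        integral += w * dw   # left endpoint: V uses W *before* the step
        w += dw
    return integral, w       # (approximate integral, W_t)

t = 1.0
approx, w_t = ito_integral_of_W(t, 100_000)
exact = (w_t ** 2 - t) / 2   # Ito formula: d(W^2) = 2 W dW + dt
print(f"Riemann sum ~ {approx:.4f},  (W_t^2 - t)/2 = {exact:.4f}")
```

The extra $-t/2$ term, absent from ordinary calculus, comes from the quadratic variation $\langle W \rangle_t = t$.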

In the case of continuous time the Doob, Burkholder and Davis inequalities are still true (for right-continuous processes having left limits).

How to Cite This Entry:
Martingale. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Martingale&oldid=49256
This article was adapted from an original article by A.N. Shiryaev (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.