# Renewal processes

This article, *Renewal Processes*, was adapted from an original article by Kosto Valov Mitov, which appeared in StatProb: The Encyclopedia Sponsored by Statistics and Probability Societies. The original article ([http://statprob.com/encyclopedia/RENEWALPROCESSES.html StatProb source]) is copyrighted by the author(s); the article has been donated to the Encyclopedia of Mathematics, and its further issues are under the Creative Commons Attribution Share-Alike License. All pages from StatProb are contained in the Category StatProb.

2010 Mathematics Subject Classification: Primary: 60K05 [MSN][ZBL]

Renewal Processes

*Kosto V. Mitov*

Professor, Faculty of Aviation, National Military University, D. Mitropolia, Pleven, Bulgaria

*Michael A. Zazanis*

Professor, Department of Statistics, Athens University of Economics and Business, Athens, Greece

Let $\{X_n;n\in \mathbb{N}\}$ be a sequence of independent, identically distributed random variables with values in $\mathbb{R}^+$ and distribution function $F$. The process $\{S_n;n \in \mathbb{N}_0\}$ defined by means of $S_0:=0$, $S_n:=S_{n-1}+X_n$, $n=1,2,\ldots$ is called an ordinary renewal process. The non--negative random variables $X_n$ are called increments or, in many applications, inter--event times. In connection with the sequence of random points in time, $\{S_n\}$, one can define the counting process $N_t = \sum_{n=0}^\infty 1(S_n \leq t)$, $t \in \mathbb{R}^+$, where $1(A)$ designates the indicator function of the event $A$ (which is 1 if $A$ occurs and 0 otherwise). The renewal function associated with a renewal process is the increasing, right--continuous function $U(t) :=EN_t= \sum_{n=0}^\infty F^{*n}(t)$ where $F^{*n}(t)$ denotes the $n$--fold convolution of the distribution function $F$ with itself (hence $F^{*n}(t) =P(S_n \leq t)$).
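As a concrete illustration, the counting process $N_t$ and the renewal function $U(t)=EN_t$ can be simulated directly. The sketch below (the function names are ours, not from any library) uses exponential increments with rate $\lambda$, for which $U(t) = 1 + \lambda t$ under the convention above (the $n=0$ term of the sum contributes 1, since $S_0=0\leq t$ always).

```python
import random

def renewal_count(t, sample_increment):
    """One sample of N_t = #{n >= 0 : S_n <= t}, with S_0 = 0 always counted."""
    n, s = 1, 0.0
    while True:
        s += sample_increment()
        if s > t:
            return n
        n += 1

random.seed(0)
lam, t, runs = 2.0, 10.0, 20000
est = sum(renewal_count(t, lambda: random.expovariate(lam)) for _ in range(runs)) / runs
# For exponential increments U(t) = 1 + lam*t, so est should be close to 21.
print(est)
```

For other increment distributions $U(t)$ rarely has a closed form, which is what makes the asymptotic theorems below valuable.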

Renewal processes are intimately related to the theory of the so--called renewal equation which is a linear integral equation of the form $$\label{renewal_equation} Z(x) = z(x) + \int_0^x Z(x-y) F(dy) \tag{1}$$

where $z: \mathbb{R}^+ \rightarrow \mathbb{R}$ is a Borel function, bounded on finite intervals, and $F$ a probability distribution on $\mathbb{R}^+$. $F$ and $z$ are assumed to be given and the objective is to determine the (unique) solution $Z$ which is bounded on finite intervals, and to study its asymptotic behavior as $x\rightarrow \infty$. Its solution is given, in terms of the renewal function, by the convolution $Z(x) = \int_0^xz(x-y) U(dy)$.
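The renewal equation can also be solved numerically by discretizing the convolution on a grid. A minimal sketch (the scheme and helper names are ours) takes $z \equiv 1$, in which case the solution is the renewal function itself, so for exponential $F$ with rate $\lambda$ the exact answer $Z(x) = 1 + \lambda x$ is available as a check.

```python
import math

def solve_renewal(z, f_density, x_max, h=0.01):
    """Grid values Z[i] ~ Z(i*h) for Z = z + Z * dF, via a midpoint quadrature."""
    n = int(x_max / h) + 1
    Z = [0.0] * n
    for i in range(n):
        # Z(ih) = z(ih) + sum_k Z((i-k)h) f((k-1/2)h) h  approximates the convolution
        conv = sum(Z[i - k] * f_density((k - 0.5) * h) * h for k in range(1, i + 1))
        Z[i] = z(i * h) + conv
    return Z

lam = 1.5
Z = solve_renewal(z=lambda x: 1.0,
                  f_density=lambda y: lam * math.exp(-lam * y),
                  x_max=5.0)
# Exact value is U(5) = 1 + 1.5*5 = 8.5; the scheme is accurate up to O(h).
print(Z[-1])
```

The recursion is well defined because $Z(ih)$ only involves earlier grid values $Z((i-k)h)$, $k\geq 1$.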

Renewal processes are important as special cases of random point processes. In this respect the Poisson process on the real line is the simplest and most important renewal process. They occur naturally in the theory of replacement of industrial equipment, the theory of queues, in branching processes, and in many other applications. In the framework of perpetual replacement of a single item, $X_n$ is the life of the $n$th such item which, as soon as it fails, is replaced by a new one with independent duration distributed according to $F$. Then $N_t$ is the number of items used in the time interval $[0,t]$ and $S_{N_t}$ is the time of the last replacement before $t$. We define three additional processes $\{A_t;t\geq 0\}$, $\{B_t;t \geq 0\}$, and $\{C_t;t\geq 0\}$ as follows: $A_t := t - S_{N_t-1}$ is the age, $B_t:=S_{N_t}-t$ is the remaining life, and $C_t :=A_t+B_t =X_{N_t}$ is the total life duration of the item currently in use. (The age and remaining life are also known as the backward and forward recurrence times.) The statistics of these processes can be described by means of appropriate renewal equations. For instance, if $W_x(t):=P(A_t \leq x)$ then conditioning on $S_1$ (using the so--called "renewal argument") we obtain $$\label{age} W_x(t) = (1-F(t))1(t \leq x) + \int_0^t W_{x}(t-s)dF(s). \tag{2}$$
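The age, remaining life, and total life at a fixed time $t$ are easy to read off a simulated path; the short sketch below (our own helper, with exponential increments chosen for illustration) tracks the last renewal epoch $S_{N_t-1}\leq t$ and the next one $S_{N_t}>t$.

```python
import random

def life_processes(t, sample_increment):
    """Return (age A_t, remaining life B_t, total life C_t) for one path."""
    last, nxt = 0.0, sample_increment()   # S_{N_t - 1} and S_{N_t}
    while nxt <= t:
        last, nxt = nxt, nxt + sample_increment()
    return t - last, nxt - t, nxt - last

random.seed(1)
a, b, c = life_processes(10.0, lambda: random.expovariate(1.0))
print(a, b, c)   # A_t + B_t = C_t holds by construction
```

For exponential increments the memoryless property makes $B_t$ again exponential, for every $t$; for general $F$ only the limiting distributions discussed below are available.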

If we allow the first increment to have a different distribution from all the others, i.e. if we set $S_0 = X_0$ and $S_n =S_{n-1}+X_n$, $n=1,2,\ldots$ where $X_0$ is independent of the $\{X_n\}$ and, unlike them, has distribution $F_0$, different from $F$, we obtain a delayed renewal process. This type of process is important because it provides additional flexibility in accommodating different initial conditions. Of course, its limiting properties are not affected by this modification. Of particular importance, assuming the mean $m$ to be finite, is the choice $F_0=F_I$, given by $$\label{integrated_tail} F_I(x):= \frac{1}{m} \int_0^x (1-F(y))dy. \tag{3}$$

With this choice, $\{S_n\}$ becomes a stationary point process. $F_I$ is called the integrated tail distribution associated with the distribution $F$.
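Stationarity of the delayed process with $F_0=F_I$ can be checked numerically: the expected number of points in $(0,t]$ is then exactly $t/m$ for every $t$. The sketch below (our own construction) takes uniform$[0,1]$ increments, for which $m=1/2$ and $F_I(x)=2x-x^2$ on $[0,1]$, sampled by inverting the CDF.

```python
import random, math

def stationary_delay():
    # Inverse-CDF sample from F_I(x) = 2x - x^2 on [0,1], the integrated
    # tail of the uniform[0,1] increment distribution (m = 1/2).
    return 1.0 - math.sqrt(1.0 - random.random())

def count_points(t, first, inc):
    n, s = 0, first()
    while s <= t:
        n += 1
        s += inc()
    return n

random.seed(2)
runs, t = 40000, 3.0
est = sum(count_points(t, stationary_delay, random.random) for _ in range(runs)) / runs
# Stationarity gives E[# points in (0,t]] = t/m = 6 exactly, for every t.
print(est)
```

With any other delay distribution the expected count equals $t/m$ only asymptotically, not exactly.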

Of fundamental importance are the limit theorems related to renewal processes. If $\displaystyle m:=\int_0^\infty x dF(x)$ denotes the mean of the increments, then the Elementary Renewal Theorem states that $\displaystyle \lim_{t \rightarrow \infty} t^{-1} U(t) = m^{-1}$. (The result holds also in the case $m=\infty$ provided that we interpret $m^{-1}$ as 0.) A refinement is possible if the increments have finite second moment, in which case $\displaystyle \lim_{t \rightarrow \infty} \left( U(t) - t/m \right) = EX_1^2/(2m^2)$. An analogous bound, due to Lorden (1970), also holds for all $t \geq 0$: $\displaystyle U(t) \leq t/m + EX_1^2/m^2$. When the second moment exists we also have a Central Limit Theorem for the number of events up to time $t$: As $t \rightarrow \infty$, $\displaystyle \frac{N_t - t/m}{\sigma \sqrt{t/m^3}} \stackrel{d}{\rightarrow} Z$ where $Z$ is a standard Normal random variable and $\sigma^2= Var(X_1)$.
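The refinement $U(t)\approx t/m + EX_1^2/(2m^2)$ can be checked by Monte Carlo. In the sketch below (our own test case) the increments are uniform on $[0,1]$, so $m=1/2$, $EX_1^2=1/3$, and the refined expansion predicts $U(t)\approx 2t + 2/3$.

```python
import random

def N_t(t):
    """N_t = #{n >= 0 : S_n <= t} with uniform[0,1] increments (S_0 = 0 counted)."""
    n, s = 1, 0.0
    while True:
        s += random.random()
        if s > t:
            return n
        n += 1

random.seed(3)
t, runs = 20.0, 50000
u_est = sum(N_t(t) for _ in range(runs)) / runs
# Refined expansion: U(t) ~ t/m + E X^2/(2 m^2) = 2t + 2/3, about 40.67 here.
print(u_est)
```

Note that the estimate also respects Lorden's bound $U(t) \leq t/m + EX_1^2/m^2 = 2t + 4/3$.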

Much deeper is Blackwell's Theorem which states that, if $F$ is non--lattice and the mean $m$ is finite, then $$\label{Blackwell} \lim_{t \rightarrow \infty} \left( U(t+h) - U(t) \right) = h/m \hspace{0.3in} \mbox{ for all } h >0. \tag{4}$$

(A distribution $F$ on $\mathbb{R}^+$ is lattice with lattice size $\delta$ if there exists $\delta>0$ such that the support of $F$ is a subset of $\{n\delta; n=0,1,2,\ldots\}$ and $\delta$ is the largest such number.) If $F$ is lattice ($\delta$) then (4) still holds, provided that $h$ is an integer multiple of $\delta$. Also, if $m=\infty$ the theorem still holds with $m^{-1}=0$. Blackwell's original proof (1948) of (4) depended on harmonic analysis techniques. In the 1970s, with the widespread use of coupling techniques, simpler probabilistic proofs of the renewal theorem became available. (See [Lindvall] for a complete account.) An integral version of Blackwell's theorem, the Key Renewal Theorem, states that if $z$ is directly Riemann integrable then the limit $\displaystyle \lim_{x \rightarrow \infty} \int_0^x z(x-y)dU(y)$ exists and equals $\displaystyle m^{-1} \int_0^\infty z(x)dx$. This then gives the limiting behavior of any function which satisfies a renewal equation (1). (Direct Riemann integrability is a direct extension of the Riemann integral from bounded intervals to unbounded ones: Fix $h>0$ and let $\overline{\gamma}_n(h)=\sup_{nh\leq x< (n+1)h}z(x)$, $\underline{\gamma}_n(h)=\inf_{nh\leq x< (n+1)h}z(x)$. Set $\overline{I}(h):=\sum_{n=0}^\infty h \overline{\gamma}_n(h)$ and $\underline{I}(h):=\sum_{n=0}^\infty h \underline{\gamma}_n(h)$. Clearly, if $h_1>h_2>0$ then $\underline{I}(h_1) \leq \underline{I}(h_2) \leq \overline{I}(h_2) \leq \overline{I}(h_1)$, though these quantities may not necessarily be finite. If $\lim_{h\rightarrow 0} \underline{I}(h)$ and $\lim_{h\rightarrow 0}\overline{I}(h)$ exist and are equal then $z$ is directly Riemann integrable. It should be noted that the direct Riemann integral is more restrictive than either the improper Riemann integral or the Lebesgue integral.)
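Blackwell's theorem lends itself to a direct numerical check: the expected number of renewal points in a window $(t, t+h]$ should approach $h/m$ for large $t$. The sketch below (our own test case) uses uniform$[0,1]$ increments, so $h/m = 2$ for $h=1$.

```python
import random

def count_in_window(t, h):
    """# of renewal points falling in (t, t+h], uniform[0,1] increments."""
    s = 0.0
    while s <= t:          # advance the walk past t
        s += random.random()
    n = 0
    while s <= t + h:      # count points in (t, t+h]
        n += 1
        s += random.random()
    return n

random.seed(4)
runs = 50000
est = sum(count_in_window(50.0, 1.0) for _ in range(runs)) / runs
# Blackwell: U(t+h) - U(t) -> h/m = 1 / 0.5 = 2 as t grows.
print(est)
```

The uniform distribution is non-lattice, so the theorem applies for every $h>0$; for a lattice distribution the same experiment would only stabilize when $h$ is a multiple of the lattice size.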

The discrete version of the renewal theorem is simpler but not elementary. Suppose we are given a probability distribution $\{f_n;n=1,2,\ldots\}$ which is non--arithmetic, i.e. $\text{g.c.d.}\{n:f_n>0\}=1$, and has mean $m= \sum_{n=1}^\infty n f_n$, and define the renewal sequence $\{u_n;n=0,1,2,\ldots\}$ via $u_0=1$, $u_n =f_n+f_{n-1}u_1+\cdots+f_1 u_{n-1}$. Then $\lim_{n\rightarrow \infty}u_n = m^{-1}$ (interpreted as 0 when $m=\infty$). This is the celebrated Erdős--Feller--Pollard (1948) renewal theorem (see [Feller], vol. 1, ch. 13), which marks the beginning of modern renewal theory and played a central role in the treatment of Markov chains with countable state space. Interesting behavior arises if the non--arithmetic distribution $\{f_n\}$ has infinite mean: Suppose that $\sum_{k=n+1}^\infty f_k = L(n)n^{-\alpha}$ where $0<\alpha<1$ and $L(n)$ is a slowly varying function. (A real function $L$ is said to be slowly varying if it is positive, measurable, and for every $\lambda>0$, $L(\lambda x)/L(x) \rightarrow 1$ as $x \rightarrow \infty$.) Then (Garsia and Lamperti, 1962) $\liminf_{n\rightarrow \infty} n^{1-\alpha}L(n) u_n = \pi^{-1}\sin \pi \alpha$. If $1/2 < \alpha <1$, this can be sharpened to $\lim_{n\rightarrow \infty} n^{1-\alpha}L(n)u_n =\pi^{-1}\sin \pi \alpha$. Analogous results have also been proved in continuous time. Suppose that $F(\cdot)$ is continuous, $F(0+)=0$, $F(\infty)=1$, $m=\infty$, and \begin{eqnarray} &&\label{regvar}\tag{5} \displaystyle 1-F(t) \sim \frac{t^{-\alpha}L(t)}{\Gamma(1-\alpha)} \Leftrightarrow m(t):=\int_0^t (1-F(u))du \sim \frac{t^{1-\alpha}L(t)}{\Gamma(2-\alpha)}, \quad t \to \infty,\end{eqnarray} where $\alpha \in [0,1)$ and $L(\cdot)$ is a slowly varying function at infinity. Under these conditions the growth rate of $U(t)$ is given by (see e.g. [bgt], Ch. 8) \begin{eqnarray*} U(t) \sim C_\alpha t/m(t), \mbox{ as } t \to \infty, \mbox{ where } C_\alpha=[\Gamma(\alpha+1)\Gamma(2-\alpha)]^{-1}.
\end{eqnarray*} Erickson (1970) proved a version of Blackwell's theorem in the infinite mean cycle case. It states that if in (\ref{regvar}) $\alpha \in (\frac{1}{2}, 1]$, then for any fixed $h>0$ \begin{eqnarray*} && \lim_{t \to \infty}m(t)[U(t)-U(t-h)] = C_\alpha h. \end{eqnarray*} If $\alpha \in (0,\frac{1}{2}]$, then $\lim$ has to be replaced by $\liminf$. Several versions of the Key Renewal Theorem in the infinite mean cycle case are also proved in Teugels (1968), Erickson (1970), and Anderson and Athreya (1987).
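The Erdős–Feller–Pollard theorem is easy to observe numerically, since the renewal sequence $u_n$ is computable by the recursion above. The sketch below (our own example distribution) uses $f$ supported on $\{1,2,3\}$, which has g.c.d. 1 and finite mean, so $u_n \rightarrow 1/m$ geometrically fast.

```python
# Example distribution f_n on {1, 2, 3}: gcd of the support is 1,
# mean m = 1*0.2 + 2*0.5 + 3*0.3 = 2.1.
f = {1: 0.2, 2: 0.5, 3: 0.3}
m = sum(k * p for k, p in f.items())

u = [1.0]                               # u_0 = 1
for n in range(1, 200):
    # u_n = f_n + f_{n-1} u_1 + ... + f_1 u_{n-1}  (note u_0 = 1 absorbs f_n)
    u.append(sum(f.get(k, 0.0) * u[n - k] for k in range(1, n + 1)))

print(u[-1], 1 / m)   # u_n -> 1/m = 0.47619...
```

For finitely supported aperiodic $f$ the convergence rate is governed by the roots of $1-\sum_n f_n z^n$ outside the unit circle, so 200 terms are far more than enough here.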

Using the Key Renewal Theorem one can obtain the asymptotic behavior of the age and the current and residual life. If $Y$ is a random variable with distribution $\displaystyle P(Y \leq y) = \frac{1}{m}\int_0^y x dF(x)$ and $V$ is uniform on $[0,1]$ and independent of $Y$, then $$(A_t,B_t,C_t) \stackrel{d}{\rightarrow} (VY,(1-V)Y,Y) \mbox{ as } t \rightarrow \infty.$$ In particular the limiting marginal distribution of the age (which is the same as that of the residual life) is $$\lim_{t\rightarrow \infty} P(A_t \leq x) = F_I(x),$$ the integrated tail distribution given in (3). The limiting behavior of these processes gives rise to the so-called "renewal paradox". For instance, the limiting distribution of the total life of the item currently in use is $$\lim_{t\rightarrow \infty} P(C_t \leq x) = \frac{1}{m} \int_0^x ydF(y)$$ with corresponding mean, provided that the second moment of $F$ exists, given by $m+ \sigma^2/m$. Hence if we inspect such a process a long time after it has started operating (and is therefore in equilibrium), the part we see will have a longer life duration than average. Of course this is simply an instance of length--biased sampling, and its effects are more pronounced when the variability of the distribution $F$ around its mean is large.
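The renewal paradox is visible in a short simulation: sampling the total life $C_t$ at a large fixed $t$ yields a mean close to $m+\sigma^2/m$, not $m$. In the sketch below (our own test case) the increments are uniform on $[0,1]$, so $m=1/2$, $\sigma^2=1/12$, and the length-biased limit is $1/2 + 1/6 = 2/3$.

```python
import random

def total_life_at(t):
    """C_t: the length of the increment straddling time t (uniform[0,1] steps)."""
    s = 0.0
    while True:
        x = random.random()
        if s + x > t:      # this increment covers t, i.e. S_{N_t-1} <= t < S_{N_t}
            return x
        s += x

random.seed(5)
runs = 40000
est = sum(total_life_at(50.0) for _ in range(runs)) / runs
# Length-biased limit: E C_t -> m + sigma^2/m = 1/2 + (1/12)/(1/2) = 2/3,
# strictly larger than the mean increment m = 1/2.
print(est)
```

The bias appears because a long increment is more likely to straddle the inspection time $t$ than a short one, in proportion to its length.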

In the infinite mean cycle case the lifetime processes $A_t$ and $B_t$ grow linearly in $t$, i.e. the normalized processes $A_t/t$ and $B_t/t$ have non-degenerate limit laws, jointly or separately. This result is usually called the Dynkin--Lamperti theorem (Dynkin, 1955; Lamperti, 1962). (See also [bgt], Ch. 8.) The theorem states that condition (\ref{regvar}) with $\alpha \in (0,1)$ is necessary and sufficient for the existence of non-degenerate limit laws for $A_t/t$, $B_t/t$: \begin{eqnarray*} && \displaystyle \lim_{t \to \infty}P\left(A_t/t \le x\right) = \pi^{-1}\sin \pi \alpha \int_0^x u^{-\alpha}(1-u)^{\alpha-1}du, \ 0 <x< 1,\\ && \displaystyle \lim_{t \to \infty}P\left(B_t/t \le x\right)=\pi^{-1} \sin \pi \alpha \int_0^x u^{-\alpha}(1+u)^{-1}du, \ x > 0. \end{eqnarray*} An important and immediate generalization of the renewal equation (1) is to allow $F$ to be a general positive finite measure on $\mathbb{R}^+$. Setting $\Vert F \Vert:=F(\mathbb{R}^+)$ one distinguishes the excessive case where $\Vert F\Vert > 1$, the defective case where $\Vert F \Vert < 1$, and the proper case we have already discussed, where $\Vert F \Vert =1$. In the excessive case one can always find a (unique) $\beta >0$ such that $\displaystyle \int_0^\infty e^{-\beta x}dF(x) =1$. One can then define a probability distribution function $F^{\#}$ via the relationship $\displaystyle dF^{\#}(x) = e^{-\beta x} dF(x)$, $x \geq 0$. Multiplying both sides of (1) by $e^{-\beta x}$ and setting $\displaystyle z^\#(x)= e^{-\beta x}z(x)$, $Z^\#(x)= e^{-\beta x}Z(x)$, the proper renewal equation $\displaystyle Z^{\#}(x) = z^{\#}(x)+ \int_0^x Z^{\#}(x-y) dF^\#(y)$ is obtained. The Key Renewal Theorem then yields $$\lim_{x\rightarrow \infty}e^{-\beta x}Z(x) = \frac{1}{m^\#}\int_0^\infty z^\#(y)dy,$$ where $m^\#$ denotes the mean of $F^\#$, which establishes that, asymptotically, $Z$ grows exponentially with rate $\beta$. We should point out that the defective case is not entirely similar.
While formally one again tries to identify $\beta >0$ so that $\displaystyle \int_0^\infty e^{\beta x} dF(x) =1$, this may or may not be possible according to whether the distribution function $\displaystyle \frac{1}{\Vert F\Vert}F(x)$ is light--tailed or heavy--tailed. In the former case one proceeds just as in the excessive case. (For more details see [Feller], vol. 2, ch. 11.) This type of analysis is characteristic of the applications of renewal theory to areas such as population dynamics, the theory of collective insurance risk, and the economic theory of replacement and depreciation (Jorgenson (1974), Feldstein and Rothschild (1974)).
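In practice the tilting parameter $\beta$ rarely has a closed form and is found numerically. The sketch below (our own helper and test kernel) uses bisection, exploiting that $\beta \mapsto \int_0^\infty e^{-\beta x}dF(x)$ is decreasing and exceeds 1 at $\beta=0$ in the excessive case; the test kernel $dF(x) = 2e^{-x}dx$ has transform $2/(1+\beta)$, so the exact root is $\beta=1$.

```python
def malthusian_beta(transform, lo=0.0, hi=50.0, tol=1e-12):
    """Bisection for the root of transform(beta) = 1, where
    transform(beta) = int_0^inf e^{-beta x} dF(x).  Assumes the excessive
    case ||F|| > 1, so transform(0) > 1 and transform is decreasing."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if transform(mid) > 1.0:
            lo = mid       # transform still above 1: root lies to the right
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Excessive kernel dF(x) = 2 e^{-x} dx (total mass 2): transform is 2/(1+beta).
beta = malthusian_beta(lambda b: 2.0 / (1.0 + b))
print(beta)   # exact root beta = 1
```

In population dynamics this root is precisely the Malthusian growth parameter of the model.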

Alternating renewal processes arise in a natural way in many situations, such as queueing systems and reliability of industrial equipment, where working (busy) periods ($X$) alternate with idle periods ($T$). Consider a sequence of random vectors with non-negative coordinates $(T_i,X_i)$, $i=1,2,\ldots$. It defines an alternating renewal sequence $(S_n, S_{n+1}')$ as follows: $S_0=0$, $S_n'=S_{n-1}+T_n$, $S_{n}=S_n'+X_n=S_{n-1}+(T_n+X_n)$, $n=1,2,\ldots$. An interpretation in terms of reliability theory is the following. There are two types of renewal events: $S_n$ is the moment when the installation of a new element begins (the installation takes time $T_{n+1}$), and $S'_{n+1}$ is the moment when the installation ends and the new element starts working (the working period has length $X_{n+1}$). The renewal process $N(t)=\sup\{n: S_n \le t\}$ counts the pairs of renewal events in the interval $[0,t]$. The spent working time $\sigma_t=\max\{0,t-S'_{N(t)+1}\}$ and the residual working time $\tau_t=\min\{S_{N(t)+1}-t,X_{N(t)+1}\}$ generalize the lifetime processes $A_t$ and $B_t$. Their properties in the infinite mean cycle case are derived in Mitov and Yanev (2001).
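A standard consequence of this structure (via the renewal-reward theorem) is that the long-run fraction of time the system spends working equals $EX/(ET+EX)$. The short simulation below (our own test case, with exponential idle and working periods) checks this: with $ET=0.5$ and $EX=1$ the working fraction tends to $2/3$.

```python
import random

random.seed(6)
horizon, t, work = 10000.0, 0.0, 0.0
while t < horizon:
    t += random.expovariate(2.0)        # idle/installation period T, mean 0.5
    x = random.expovariate(1.0)         # working period X, mean 1.0
    work += min(x, max(0.0, horizon - t))   # clip working time at the horizon
    t += x
frac = work / horizon
# Long-run working fraction: E X / (E T + E X) = 1 / 1.5, about 0.667.
print(frac)
```

The same ratio formula underlies availability calculations in reliability theory, with $T$ the repair time and $X$ the up time.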

The central place that renewal theory holds in the analysis of stochastic systems is due to the concept of regeneration. Let $\{X_t; t \in \mathbb{R}^+\}$ be a process with values in $\cal S$ (e.g. a Euclidean space $\mathbb{R}^d$) and sample paths that are càdlàg (right--continuous with left--hand limits) a.s. Such a process is called regenerative with respect to a (possibly delayed) renewal process $\{S_n\}$, defined on the same probability space, if, for each $n \in \mathbb{N}$, the post-$S_n$ process $(\{X_{S_n+t}\}_{t \geq 0}, \{S_{n+k}-S_n\}_{k\in \mathbb{N}})$ is independent of $\{S_0, S_1, \ldots, S_n\}$ and its distribution does not depend on $n$, i.e.\ $(\{X_{S_n+t} \}_{t \geq 0},\{S_{n+k}-S_n\}_{k\in \mathbb{N}}) \stackrel{d}{=} (\{X_{S_0+t}\}_{t \geq 0}, \{S_k-S_0\}_{k\in \mathbb{N}})$ for all $n$. The existence of an embedded, non--lattice renewal process with respect to which the process $\{X_t\}$ is regenerative, together with the finiteness of the mean $m:=E[S_1-S_0]$, is enough to guarantee the existence of a "stationary version", say $\{\tilde{X}_t\}$, to which $\{X_t\}$ converges as $t$ goes to infinity. The statistics of $\{\tilde{X}_t\}$ can be determined by analyzing the behavior of $\{X_t\}$ over any regenerative cycle, i.e. a random time interval of the form $[S_n,S_{n+1})$. If $k \in \mathbb{N}$, $t_i \in \mathbb{R}^+$, $i=1,2,\ldots, k$, and $f:{\cal S}^k \rightarrow \mathbb{R}$ is any bounded, continuous function then \begin{equation*} \label{regenerative}\nonumber Ef(\tilde{X}_{t_1},\ldots,\tilde{X}_{t_k}) = \frac{1}{m} E \int_{S_0}^{S_1} f(X_{t_1+t},\ldots,X_{t_k+t}) dt. \end{equation*}
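The cycle formula above is also the basis of regenerative simulation: one estimates a stationary expectation by the ratio of cycle integrals to cycle lengths. The sketch below (our own example) applies it to the age process, for which $A_t=t$ on a cycle of length $X$, so $\int_0^X A_t\,dt = X^2/2$ and the stationary mean age is $EX^2/(2m)$; with uniform$[0,1]$ cycles this equals $1/3$, matching the mean of $F_I$.

```python
import random

random.seed(7)
runs, num, den = 100000, 0.0, 0.0
for _ in range(runs):
    x = random.random()          # one uniform[0,1] cycle length (m = 1/2)
    num += x * x / 2.0           # cycle integral of the age: int_0^x t dt
    den += x                     # cycle length
est = num / den                  # ratio estimator for E X^2 / (2 m)
# Stationary mean age E X^2/(2m) = (1/3)/(2 * 1/2) = 1/3.
print(est)
```

The ratio estimator is the standard form here: averaging cycle integrals and cycle lengths separately and dividing avoids the bias of averaging per-cycle ratios.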

Nowadays our view of whole areas of probability, including parts of the theory of Markov processes, is influenced by renewal-theoretic tools and related concepts of regeneration. The analysis of many stochastic models is greatly facilitated if one identifies certain embedded points in time that occur according to a renewal process and with respect to which the process is regenerative. The fact that these regeneration cycles are independent and identically distributed also facilitates the statistical analysis of simulation output from regenerative systems.

A detailed treatment of renewal theory and its applications can be found, for instance, in the books [Asmussen], [bgt], [Feller], and [Resnick].

Reprinted with permission from Lovric, Miodrag (2011), International Encyclopedia of Statistical Science. Heidelberg: Springer Science +Business Media, LLC.

How to Cite This Entry:
Renewal processes. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Renewal_processes&oldid=37792