# Divergent series

A series for which the sequence of partial sums does not have a finite limit. For example, the series

$$1 + 1 + 1 + 1 + 1 + \dots ,$$

$$1 - 1 + 1 - 1 + 1 - \dots ,$$

$$1 - 2 + 3 - 4 + 5 - \dots ,$$

$$1 + x + x ^ {2} + x ^ {3} + x ^ {4} + \dots \ ( | x | \geq 1)$$

are divergent.

Divergent series first appeared in the works of mathematicians of the 17th and 18th centuries. L. Euler was the first to conclude that the right question is not what the sum of a divergent series is equal to, but how the sum of a divergent series should be defined, and he found an approach to this problem close to the modern one. Until the end of the 19th century divergent series found no application and were almost forgotten. Towards the end of the 19th century the accumulation of various facts in mathematical analysis re-awakened interest in divergent series, and the question was raised of whether series could be summed in a sense different from the usual one.

### Examples.

1) If one multiplies two series

$$\sum _ {n = 0 } ^ \infty a _ {n} \ \textrm{ and } \ \ \sum _ {n = 0 } ^ \infty b _ {n}$$

that converge to $A$ and $B$, respectively, then the resulting series

$$\tag{1 } \sum _ {n = 0 } ^ \infty c _ {n} = \ \sum _ {n = 0 } ^ \infty ( a _ {0} b _ {n} + a _ {1} b _ {n - 1 } + \dots + a _ {n} b _ {0} )$$

can be divergent. However, if one defines the sum of the series (1) not as the limit of the partial sums $s _ {n}$, but as

$$\tag{2 } \lim\limits _ {n \rightarrow \infty } \ \frac{s _ {0} + \dots + s _ {n} }{n + 1 } ,$$

then in this sense the series (1) always converges (i.e. the limit (2) exists), and its sum equals $C = AB$.
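A minimal numerical sketch of this phenomenon (the variable names below are illustrative): the classical example, due to Cauchy, takes $a_n = b_n = (-1)^n/\sqrt{n+1}$. Both series converge by the alternating series test, but the terms of their Cauchy product (1) do not tend to zero (in fact $| c_n |$ tends to $\pi$), so the product series diverges; the arithmetic means (2) of its partial sums nevertheless approach $AB$.

```python
import math

N = 2000  # number of terms; the arithmetic means converge slowly
a = [(-1) ** n / math.sqrt(n + 1) for n in range(N)]

# Terms of the Cauchy product (1): c_n = a_0 b_n + ... + a_n b_0, with b = a
c = [sum(a[k] * a[n - k] for k in range(n + 1)) for n in range(N)]

# Partial sums s_n of the product series and their arithmetic mean (2)
s, running = [], 0.0
for c_n in c:
    running += c_n
    s.append(running)
mean = sum(s) / len(s)  # (s_0 + ... + s_{N-1}) / N

A = sum(a)              # approximates the common sum of both factor series
print(abs(c[-1]))       # stays near pi, so the product series diverges
print(mean, A * A)      # the mean approaches A * B = A**2
```

With more terms the mean approaches $A^2 \approx 0.366$ more closely; the slow convergence is typical of the arithmetic-mean method.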

2) The Fourier series of a function $f$ that is continuous at a point $x_0$ (or has a discontinuity of the first kind there) can diverge at this point. If the sum of the series is defined by formula (2), then in this sense the Fourier series of such a function always converges, and its sum equals $f(x_0)$ (or $\{ f(x_0 + 0) + f(x_0 - 0) \} /2$, respectively).

3) The power series

$$\tag{3 } \sum _ {n = 0 } ^ \infty z ^ {n}$$

converges for $| z | < 1$ to the sum $1/( 1 - z)$ and diverges for $| z | \geq 1$. If one defines the sum of the series to be

$$\tag{4 } \lim\limits _ {x \rightarrow \infty } \ e ^ {-x} \sum _ {n = 0 } ^ \infty \ \frac{s _ {n} x ^ {n} }{n! } ,$$

where the $s _ {n}$ are the partial sums of the series (3), then in this sense the series (3) will converge for all $z$ with $\mathop{\rm Re} z < 1$, and its sum is the function $1/( 1 - z)$.
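A numerical sketch of this rule (the function and parameter names are illustrative): for the series (3) the partial sums are $s_n = (1 - z^{n+1})/(1 - z)$, and evaluating the expression in (4) at a moderately large fixed $x$ already approximates $1/(1 - z)$ at points of divergence with $\operatorname{Re} z < 1$.

```python
import math

def borel_value(z, x, terms=150):
    """Evaluate e^{-x} * sum_{n=0}^{terms} s_n x^n / n! for the series (3)."""
    total = 0.0
    p = 1.0        # x^n / n!, updated iteratively to avoid overflow
    zpow = z       # z^{n+1}
    for n in range(terms + 1):
        s_n = (1 - zpow) / (1 - z)   # partial sum of the geometric series
        total += s_n * p
        p *= x / (n + 1)
        zpow *= z
    return math.exp(-x) * total

# z = -1 (|z| = 1) and z = -2 (|z| > 1): both satisfy Re z < 1, so the
# values approach 1/(1 - z) = 1/2 and 1/3 respectively as x grows.
print(borel_value(-1.0, x=10.0))   # close to 0.5
print(borel_value(-2.0, x=10.0))   # close to 1/3
```

Increasing $x$ brings the value closer to $1/(1 - z)$, but in floating point the intermediate terms grow like $|zx|^n/n!$, so very large $x$ eventually causes cancellation error; $x = 10$ is a safe middle ground for these two points.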

To generalize the concept of a sum to divergent series, one takes some operator or rule that assigns a specific number to a divergent series, called its sum (in this definition). Such a rule is called a summation method (cf. Summation methods). Thus, the rule described in Examples 1) and 2) is called summation by arithmetical averages (see Cesàro summation methods), and the rule given in Example 3) is called the Borel summation method.
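As a concrete illustration of the arithmetic-mean rule, formula (2) can be applied to the series $1 - 1 + 1 - 1 + \dots$ from the introduction (a short sketch; the variable names are illustrative): its partial sums alternate between 1 and 0, so their means tend to $1/2$.

```python
N = 100001                 # number of partial sums to average
s_n, total = 0, 0
for n in range(N):
    s_n += (-1) ** n       # partial sum s_n of 1 - 1 + 1 - 1 + ...
    total += s_n
mean = total / N           # (s_0 + ... + s_{N-1}) / N, as in formula (2)
print(mean)                # close to 1/2
```

This value, $1/2$, is also what the formula $1/(1 - z)$ of Example 3) gives at $z = -1$, so the two summation methods agree on this series.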