# series

A *series* is the sum of the terms of an infinite sequence, where the \(n^{th}\) summand is the \(n^{th}\) term of the sequence. A series is typically denoted using *sigma* notation, i.e.,

\[\sum_{n=0}^{\infty} a_n = a_0 + a_1 + a_2 + \cdots + a_n + \cdots \]

where \(a_i\) is the \(i^{th}\) term of the sequence. The index \(n\) may begin at 0, 1, or any natural number \(k\), as a matter of convenience. If the sum exists as a finite number, then we say that the series *converges*. Even when the value of the sum is unknown, it can often be shown that the series converges by showing that its sequence of partial sums is a convergent sequence, where the \(n^{th}\) partial sum is the sum of the first \(n\) summands of the series.

## Types of Series

### Harmonic Series

\[\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}+ \cdots + \frac{1}{n} + \cdots \]

This series does not converge because its sequence of partial sums is unbounded. To see this, notice that we have the following:

\[\begin{eqnarray*} & & 1 + \frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}+\frac{1}{6}+\frac{1}{7}+\frac{1}{8}+\cdots \\ & = & 1 + \frac{1}{2}+\left(\frac{1}{3}+\frac{1}{4}\right) +\left(\frac{1}{5}+\frac{1}{6}+\frac{1}{7}+\frac{1}{8}\right)+\cdots \\ & > & 1 + \frac{1}{2}+\left(\frac{1}{4}+\frac{1}{4}\right) +\left(\frac{1}{8}+\frac{1}{8}+\frac{1}{8}+\frac{1}{8}\right)+\cdots \\ & = & 1 + \frac{1}{2}+\frac{1}{2}+\frac{1}{2}+\frac{1}{2}+\frac{1}{2}+\frac{1}{2}+\cdots \end{eqnarray*} \]

and this last sum is clearly infinite.

### Geometric Series

\[\sum_{n=0}^{\infty} ar^n = a + ar + ar^2+ ar^3 + \cdots + ar^n + \cdots \]

where \(a\) and \(r\) are both constants, and \(r\) is called the *common ratio*. This series diverges if \(|r| \geq 1\) (assuming \(a \neq 0\)), but if \(|r| < 1\) then it converges, and its sum is

\[\sum_{n=0}^{\infty} ar^n = \frac{a}{1-r} \]
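The closed form is easy to verify numerically. The following sketch (with arbitrarily chosen \(a\) and \(r\)) compares a truncated geometric series to \(a/(1-r)\):

```python
# Partial sums of a geometric series with |r| < 1 approach a / (1 - r).

def geometric_partial_sum(a, r, n_terms):
    """Sum of a * r**n for n = 0 .. n_terms - 1."""
    return sum(a * r ** n for n in range(n_terms))

a, r = 3.0, 0.5
limit = a / (1 - r)  # closed form: 6.0
approx = geometric_partial_sum(a, r, 50)
assert abs(approx - limit) < 1e-12  # truncation error ~ a * r^50
```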

### P-Series

\[\sum_{n=1}^{\infty} \frac{1}{n^p} = 1+\frac{1}{2^p}+\frac{1}{3^p}+\frac{1}{4^p}+\cdots+\frac{1}{n^p}+\cdots\]

where the common exponent \(p\) is a positive real constant. This series diverges if \(p \leq 1\), by comparison with the harmonic series. It converges if \(p > 1\), though the value of the sum is known only in a few cases. It is notable that for \(p = 2\) (the sum of the reciprocals of the squares) the sum is \(\displaystyle\frac{\pi^2}{6}\).
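The \(p = 2\) case (the Basel problem) can be checked numerically; since the tail beyond \(N\) terms is bounded by \(\int_N^\infty x^{-2}\,dx = 1/N\), a large partial sum lands close to \(\pi^2/6\):

```python
# Partial sums of 1/n^2 approach pi^2 / 6 (the Basel problem).
import math

partial = sum(1.0 / n ** 2 for n in range(1, 100001))
# Tail after 100000 terms is bounded by 1/100000, so:
assert abs(partial - math.pi ** 2 / 6) < 1e-4
```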

### Alternating Series

Any series in which the terms alternate in sign, for example,

\[\sum_{n=1}^{\infty} (-1)^{n+1}\frac{1}{n} = 1- \frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\cdots +\frac{(-1)^{n+1}}{n} +\cdots\]

Any such series converges if its terms decrease in absolute value and converge to 0 in the limit; this is the *alternating series test*. Moreover, the error after summing \(n\) terms is at most the absolute value of the next term.
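The example above, the alternating harmonic series, converges (its value is the standard result \(\ln 2\)) even though the plain harmonic series diverges, so its convergence is conditional. A quick sketch using the error bound:

```python
# The alternating harmonic series converges to ln 2; the error after
# n terms is at most the next term, 1/(n+1) (alternating series test).
import math

def alt_harmonic_partial_sum(n):
    """Sum of (-1)^(k+1) / k for k = 1..n."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

assert abs(alt_harmonic_partial_sum(10000) - math.log(2)) < 1 / 10000
```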

### Power Series

\[\sum_{n=0}^{\infty} a_nx^n = a_0 + a_1x+a_2x^2+\cdots +a_nx^n+\cdots\]

where each \(a_i\) is a constant. It can be shown that any such series either converges only at \(x = 0\), or for all real \(x\), or for all \(x\) with \(-R < x < R\) for some positive real number \(R\), called the *radius of convergence* (convergence at the endpoints \(x = \pm R\) must be checked separately). This important series should be thought of as a function of \(x\) on its interval of convergence. Where defined, this function has derivatives of all orders, and may be differentiated and integrated 'term-by-term.' That is,

\[\begin{eqnarray*} \frac{d}{dx} \sum_{n=0}^{\infty} a_nx^n & = & \sum_{n=0}^{\infty} \frac{d}{dx} \left( a_nx^n \right) \\ \int \left( \sum_{n=0}^{\infty} a_nx^n\right) dx & = & \sum_{n=0}^{\infty} \int a_nx^n\,dx \end{eqnarray*}\]
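A concrete instance, using the geometric power series: for \(|x| < 1\), \(\sum x^n = 1/(1-x)\), and differentiating term-by-term gives \(\sum n x^{n-1} = 1/(1-x)^2\). A numerical sketch with truncated sums (the choice \(x = 0.3\) is arbitrary):

```python
# Term-by-term differentiation of the geometric power series:
#   sum x^n      = 1 / (1 - x)      for |x| < 1
#   sum n*x^(n-1) = 1 / (1 - x)^2   (derivative, term by term)
x = 0.3
N = 200  # enough terms that truncation error is negligible at |x| = 0.3

series = sum(x ** n for n in range(N))
derivative_series = sum(n * x ** (n - 1) for n in range(1, N))

assert abs(series - 1 / (1 - x)) < 1e-12
assert abs(derivative_series - 1 / (1 - x) ** 2) < 1e-12
```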

## Convergence Tests

A series *converges* when its sequence of partial sums converges, that is, if the sequence of values given by the first term, then the sum of the first two terms, then the sum of the first three terms, etc., converges as a sequence. A series is said to *converge absolutely* if the series still converges when all of the terms of the series are made non-negative (by taking their absolute value). A series which converges but does not converge absolutely is said to converge *conditionally* (cf. alternating series above). A series which does not converge is said to *diverge*. All of the tests below except for the \(n^{th}\)-term test are for series with non-negative terms only.

### \(n^{th}\)-Term Test

In any series, if the terms of the series do not diminish in absolute value to zero (i.e., if the limit of the sequence of terms is not zero), then the series diverges. This important test must be used with care: it is not a test for convergence, and must not be interpreted to mean that any series whose terms do tend to zero necessarily converges. The harmonic series (above) is a good counterexample.
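A sketch of the test applied to \(\sum \frac{n}{n+1}\), whose terms tend to 1 rather than 0 (the example series is an illustrative choice, not from the original):

```python
# nth-term test: the terms of sum n/(n+1) tend to 1, not 0,
# so the series diverges. (Terms tending to 0 would prove nothing.)
terms = [n / (n + 1) for n in range(1, 1001)]
assert abs(terms[-1] - 1.0) < 0.01  # limit of the terms is 1, not 0
```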

### Comparison Test

If \(\Sigma\, a_n\) and \(\Sigma\, b_n\) are any two series, all of whose terms are non-negative, and if \(a_n \leq b_n\) for all \(n\), then the convergence of \(\Sigma\, b_n\) implies the convergence of \(\Sigma\, a_n\), and the divergence of \(\Sigma\, a_n\) implies the divergence of \(\Sigma\, b_n\).
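A sketch of a direct comparison (the series compared is an illustrative choice): \(1/(2^n + n) \leq 1/2^n\) for all \(n \geq 1\), and the geometric series \(\sum_{n=1}^{\infty} 1/2^n = 1\) converges, so the smaller series converges too.

```python
# Direct comparison: 1/(2^n + n) <= 1/2^n term by term, and the
# geometric series sum 1/2^n (n >= 1) converges to 1.
for n in range(1, 100):
    assert 1 / (2 ** n + n) <= 1 / 2 ** n

partial = sum(1 / (2 ** n + n) for n in range(1, 60))
assert partial < 1.0  # bounded above by the geometric sum
```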

### Limit Comparison Test

If \(\Sigma\, a_n\) and \(\Sigma\, b_n\) are any two series, all of whose terms are non-negative, and if the limit \(\displaystyle\lim_{n\rightarrow\infty} \frac{a_n}{b_n}\) is a finite real number greater than 0, then either both series converge or both series diverge.
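A sketch with an illustrative pair: comparing \(a_n = 1/(n^2+n)\) against \(b_n = 1/n^2\), the ratio \(a_n/b_n = n^2/(n^2+n)\) tends to 1, so both series converge (the p-series with \(p=2\) converges). This particular \(a_n\) also telescopes, so its exact sum is 1:

```python
# Limit comparison of a_n = 1/(n^2 + n) with b_n = 1/n^2:
# the ratio a_n / b_n = n^2 / (n^2 + n) tends to 1, a positive
# finite limit, so both series converge together.
ratio = (1 / (1000 ** 2 + 1000)) / (1 / 1000 ** 2)
assert abs(ratio - 1.0) < 0.01

# 1/(n(n+1)) = 1/n - 1/(n+1) telescopes; the full sum is exactly 1.
partial = sum(1 / (n * (n + 1)) for n in range(1, 100001))
assert abs(partial - 1.0) < 1e-4
```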

### Integral Test

If \(f\) is a positive, continuous, decreasing function on \([1, \infty)\), then the series \(\Sigma\, f(n)\) and the integral \(\int_1^{\infty} f(x)\,dx\) either both converge or both diverge.
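For a decreasing positive \(f\), the comparison behind the test gives explicit bounds: \(\int_1^{\infty} f(x)\,dx \leq \sum_{n=1}^{\infty} f(n) \leq f(1) + \int_1^{\infty} f(x)\,dx\). A sketch for \(f(x) = 1/x^2\), where the integral is exactly 1:

```python
# Integral test bounds for sum 1/n^2: the integral of 1/x^2 from 1
# to infinity is 1, so 1 <= sum <= 1 + 1 = 2.
partial = sum(1 / n ** 2 for n in range(1, 100001))
assert 1.0 <= partial <= 2.0  # consistent with the integral bounds
```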

### Ratio Test

If the terms of a given series are all non-negative, and if the limit \(\displaystyle\lim_{n\rightarrow\infty} \frac{a_{n+1}}{a_n} = L\) exists, then the series converges if \(L < 1\) and diverges if \(L > 1\). If \(L=1\) the test is inconclusive.
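A sketch applying the test to the illustrative series \(\sum n/2^n\): the ratio \(\frac{a_{n+1}}{a_n} = \frac{n+1}{2n}\) tends to \(L = \frac{1}{2} < 1\), so the series converges (its sum is the known closed form 2):

```python
# Ratio test on a_n = n / 2^n: successive ratios tend to 1/2 < 1,
# so the series converges.
def a(n):
    return n / 2 ** n

ratios = [a(n + 1) / a(n) for n in range(1, 60)]
assert abs(ratios[-1] - 0.5) < 0.01

# The sum has the known closed form: sum n/2^n = 2.
assert abs(sum(a(n) for n in range(1, 200)) - 2.0) < 1e-9
```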

### Root Test

If the terms of a given series are all non-negative, and if the limit \(\displaystyle\lim_{n\rightarrow\infty} \sqrt[n]{a_n} = L\) exists, then the series converges if \(L < 1\) and diverges if \(L > 1\). If \(L=1\) the test is inconclusive.
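A sketch for the illustrative series \(\sum n\,(2/3)^n\): here \(\sqrt[n]{a_n} = \frac{2}{3}\,n^{1/n}\), and since \(n^{1/n} \to 1\), the limit is \(L = \frac{2}{3} < 1\), so the series converges:

```python
# Root test on a_n = n * (2/3)^n: the nth root of a_n tends to 2/3 < 1,
# so the series converges.
def term(n):
    return n * (2 / 3) ** n

nth_root = term(200) ** (1 / 200)  # n^(1/n) is already close to 1 here
assert abs(nth_root - 2 / 3) < 0.02
```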