This web page starts a new section of the course: infinite series. An infinite series is an expression such as $\sum _{n=1}^{\infty}{a}_{n}$ and we are going to learn how to understand and work with such series rigorously.

This page starts with the definition of convergence for infinite series and three basic but very important results about them. Examples of convergent series will appear in a later page.

An infinite series is an expression of the form
$\sum _{n=1}^{\infty}{a}_{n}$, where $\left({a}_{n}\right)$
is a sequence. We shall say this series converges to $l\in \mathbb{R}$
if the sequence of partial sums

$${s}_{n}=\sum _{k=1}^{n}{a}_{k},\qquad n\in \mathbb{N},$$

converges to $l$, i.e., ${s}_{n}\to l$ as $n\to \infty $. The term ${s}_{n}$ is called the $n$th partial sum of $\sum _{n=1}^{\infty}{a}_{n}$. We say the series converges if it converges to some $l\in \mathbb{R}$.

Example.

The series $\sum _{n=1}^{\infty}{2}^{-n}$ converges to $1$. To see this, write ${a}_{n}={2}^{-n}$ and observe that ${s}_{n}=\sum _{k=1}^{n}{a}_{k}=1-{2}^{-n}$. Thus ${s}_{n}\to 1-0=1$ as $n\to \infty $ by the continuity of $-$ and the standard null sequence ${2}^{-n}$.
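As a quick numerical sanity check (our illustration, not part of the course text; the helper `partial_sums` is a name of our own choosing), we can compute the partial sums directly and compare them with the closed form $1-{2}^{-n}$:

```python
# Numerical sanity check (illustration only): the partial sums of
# sum_{n>=1} 2^{-n} equal 1 - 2^{-n} and approach 1.

def partial_sums(terms):
    """Yield s_1, s_2, ... for the terms a_1, a_2, ..."""
    s = 0.0
    for a in terms:
        s += a
        yield s

n_max = 20
sums = list(partial_sums(2.0 ** -n for n in range(1, n_max + 1)))

# Each s_n agrees with the closed form 1 - 2^{-n} ...
for n, s_n in enumerate(sums, start=1):
    assert abs(s_n - (1.0 - 2.0 ** -n)) < 1e-12

# ... so the partial sums approach 1.
print(sums[-1])
```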

Example.

The series $\sum _{n=1}^{\infty}(-1{)}^{n}$ does not converge. To see this, write ${a}_{n}=(-1{)}^{n}$ and note that ${s}_{n}=\sum _{k=1}^{n}{a}_{k}$ is $-1$ if $n$ is odd and $0$ if $n$ is even. The sequence ${s}_{n}=\frac{(-1{)}^{n}-1}{2}$ does not converge, so neither does the series $\sum _{n=1}^{\infty}(-1{)}^{n}$.
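The oscillation of the partial sums is easy to see numerically; a minimal sketch (ours, for illustration):

```python
# The partial sums of sum_{n>=1} (-1)^n never settle down:
# they alternate between -1 and 0.

sums, s = [], 0
for n in range(1, 11):
    s += (-1) ** n
    sums.append(s)

print(sums)  # alternates: -1, 0, -1, 0, ...

# Agreement with the closed form s_n = ((-1)^n - 1) / 2.
for n, s_n in enumerate(sums, start=1):
    assert s_n == ((-1) ** n - 1) // 2
```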

We shall prove three very important but straightforward results about
series to get our theory off the ground. The first of these states that,
as with sequences, the convergence of $\sum _{n=1}^{\infty}{a}_{n}$
depends only on the behaviour of ${a}_{n}$ as $n$ goes to infinity, and
not on the first few terms, where "few" means any finite number of
terms, such as 1000000000. In other words, to show that
$\sum _{n=1}^{\infty}{a}_{n}$ converges we can ignore
the finite sum $\sum _{n=1}^{m-1}{a}_{n}$
and need only prove that $\sum _{n=m}^{\infty}{a}_{n}$ converges.

Proposition on eventual convergence, or ignoring finitely many terms.

Suppose $m\in \mathbb{N}$ and the series $\sum _{k=m}^{\infty}{a}_{k}$ converges. Then the series $\sum _{k=1}^{\infty}{a}_{k}$ also converges.

**Proof.**

Let ${s}_{n}=\sum _{k=1}^{n}{a}_{k}$, and let $r=\sum _{k=1}^{m-1}{a}_{k}$. Then ${s}_{n}=r+{t}_{n}$ for $n\geqslant m$, where ${t}_{n}$ is the partial sum ${t}_{n}=\sum _{k=m}^{n}{a}_{k}$. But $\sum _{k=m}^{\infty}{a}_{k}$ converges, so ${t}_{n}\to l$ for some $l\in \mathbb{R}$. So by the convergence of the constant sequence $r$ and the continuity of $+$, ${s}_{n}\to r+l$ and hence $\sum _{k=1}^{\infty}{a}_{k}$ converges.
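The decomposition ${s}_{n}=r+{t}_{n}$ used in the proof can be checked numerically; in this sketch the terms are our own example, a tail of ${2}^{-n}$ preceded by a few erratic values:

```python
# Numerical illustration of the proof: for n >= m the full partial sum
# s_n splits as r + t_n, where r = a_1 + ... + a_{m-1} is a constant
# and t_n = a_m + ... + a_n.

# A series whose first few terms are erratic but whose tail is 2^{-n}.
terms = [100.0, -7.0, 3.5] + [2.0 ** -n for n in range(4, 30)]
m = 4                        # ignore the first m - 1 = 3 terms
r = sum(terms[: m - 1])      # r = a_1 + a_2 + a_3 = 96.5

for n in range(m, len(terms) + 1):
    s_n = sum(terms[:n])             # s_n = a_1 + ... + a_n
    t_n = sum(terms[m - 1 : n])      # t_n = a_m + ... + a_n
    assert abs(s_n - (r + t_n)) < 1e-9

# t_n converges (geometric tail), so s_n converges to r plus its limit.
```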

This result is similar to the very easy one for sequences
that says: if the subsequence $\left({a}_{m+n}\right)$
consisting of *all* terms in $\left({a}_{n}\right)$ after the
$m$th converges then the whole sequence $\left({a}_{n}\right)$
converges. The proposition on eventual convergence is rather useful,
however, as it shows that when investigating the convergence of a
series, we can always ignore the (perhaps erratic) behaviour
of finitely many terms at the beginning.

The next result is commonly used in its contrapositive form
to show that a series *does not* converge. It *cannot* be
used to show that a series does converge.

Null sequence test.

Suppose the series $\sum _{n=1}^{\infty}{a}_{n}$ converges. Then ${a}_{n}\to 0$ as $n\to \infty $.

**Proof.**

Let ${s}_{n}=\sum _{k=1}^{n}{a}_{k}$ be the $n$th partial sum, and suppose ${s}_{n}\to l\in \mathbb{R}$ as $n\to \infty $. Then ${a}_{n+1}={s}_{n+1}-{s}_{n}\to l-l=0$ as $n\to \infty $, by continuity. Therefore $\left({a}_{n+1}\right)$ is a null sequence and hence so is $\left({a}_{n}\right)$.
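A numerical caution (our illustration) on why the test is one-directional: the harmonic series $\sum _{n=1}^{\infty}1/n$ has terms tending to $0$ yet diverges, a standard fact, so ${a}_{n}\to 0$ on its own never establishes convergence.

```python
import math

# The harmonic series sum 1/n has a_n = 1/n -> 0, yet its partial
# sums grow without bound: s_n >= ln(n + 1), by comparing each 1/k
# with the integral of 1/x over [k, k + 1].
s = 0.0
for n in range(1, 100_001):
    s += 1.0 / n
    assert s >= math.log(n + 1)

print(s)  # about 12.09 and still growing: no convergence
```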

Example.

The series $\sum _{n=1}^{\infty}\cos\left(n\right)$ does not converge.

**Proof.**

The sequence $\cos\left(n\right)$ does not converge, by a previous result in the course, so in particular it does not converge to $0$. Hence by the null sequence test $\sum _{n=1}^{\infty}\cos\left(n\right)$ does not converge.

Our final result here is an application of the monotone convergence theorem to series, and will be used implicitly or explicitly a huge amount in the section on series.

Monotonicity Theorem.

Suppose the series $\sum _{n=1}^{\infty}{a}_{n}$ consists of nonnegative terms only. Then the sequence of partial sums, ${s}_{n}=\sum _{k=1}^{n}{a}_{k}$, is monotonic nondecreasing and hence the series $\sum _{n=1}^{\infty}{a}_{n}$ converges if and only if the sequence of partial sums $\left({s}_{n}\right)$ is bounded.

**Proof.**

There's almost nothing to say, as the argument is largely presented in the statement of the theorem. For the monotonicity: since ${a}_{n+1}\geqslant 0$ we have ${s}_{n+1}={s}_{n}+{a}_{n+1}\geqslant {s}_{n}$, so $\left({s}_{n}\right)$ is indeed nondecreasing. The final conclusion then rests on the monotone convergence theorem for one direction and on the theorem on boundedness for the other direction.
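To see the theorem in action, here is a sketch with a series of our own choosing, ${a}_{n}=1/{n}^{2}$: the partial sums are nondecreasing and bounded above by $2$, so by the theorem the series converges.

```python
# Partial sums of sum 1/n^2: nondecreasing (nonnegative terms) and
# bounded, since 1/k^2 <= 1/(k(k-1)) for k >= 2 telescopes to give
# s_n <= 2 - 1/n.  By the monotonicity theorem the series converges.

prev, s = 0.0, 0.0
for n in range(1, 10_001):
    s += 1.0 / (n * n)
    assert s >= prev            # monotone nondecreasing
    assert s <= 2.0 - 1.0 / n   # bounded above by 2
    prev = s

print(s)  # approaches the limit (which happens to be pi^2 / 6)
```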

By the proposition on eventual convergence, the monotonicity theorem also applies to any series with only finitely many negative terms: such a series is convergent if and only if the sequence of partial sums is bounded.