This page continues the work of a previous page on subsequences, and concentrates on how to use subsequences to prove that a sequence does converge.
First, note that the Theorem on Subsequences shows that every subsequence of a convergent sequence converges (to the same limit), and it can be used in this way.
For example, the sequence defined by $b_n = 1/n^2$ is a subsequence of the null sequence defined by $a_n = 1/n$ (take $\sigma(n) = n^2$). Hence $(b_n)$ is also a null sequence by the Theorem on Subsequences.
These arguments are easy to write down and quite convenient, but they are all very easy in the sense that the new sequence could have been proved convergent in exactly the same way as the original sequence was. This web page addresses the use of subsequences to prove that sequences we did not previously know converged do in fact converge. The main result here concerns "covering a sequence by subsequences". If you use this result, please be very careful that you apply it correctly and that you do not confuse it with the Theorem on Subsequences (which is normally used to prove that a sequence does not converge). It is very easy to make mistakes in this area.
The idea of "covering a sequence by subsequences" is to split all the terms of a sequence $(a_n)$ into two subsequences and prove that these two subsequences tend to the same limit. Often a sequence is split into its odd-numbered and its even-numbered terms. But it is essential to consider all the terms of the original sequence and to ensure that both subsequences tend to the same limit. Just knowing that a sequence has a convergent subsequence says nothing about the convergence of the whole sequence $(a_n)$.
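The warning above can be illustrated numerically. The sequence used here, $a_n = (-1)^n$, is a choice made for this sketch and is not an example from the notes: it has a convergent (indeed constant) subsequence, yet the sequence itself diverges.

```python
# Illustration (example sequence assumed for this sketch): a_n = (-1)^n.
def a(n):
    return (-1) ** n

# Split the terms into the even- and odd-numbered subsequences.
even = [a(2 * k) for k in range(1, 11)]      # tau(k) = 2k
odd = [a(2 * k - 1) for k in range(1, 11)]   # sigma(k) = 2k - 1

print(even)  # all 1: this subsequence converges to 1
print(odd)   # all -1: this subsequence converges to -1
# The two subsequences have DIFFERENT limits, so (a_n) cannot converge,
# even though each subsequence on its own converges.
```

So one convergent subsequence proves nothing; the covering result below needs both subsequences to cover every term and to share a single limit.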
Theorem on covering by subsequences.
Suppose a sequence $(a_n)$ is given, and $(a_{\sigma(n)})$ and $(a_{\tau(n)})$ are subsequences, where $\sigma, \tau \colon \mathbb{N} \to \mathbb{N}$ are strictly increasing functions. Suppose also that
$\sigma(\mathbb{N}) \cup \tau(\mathbb{N}) = \mathbb{N}$, and that $(a_{\sigma(n)})$ and $(a_{\tau(n)})$ both converge to the same limit $l$. Then $(a_n)$ converges to $l$.
We have a number of assumptions here, which we shall write down formally in terms of $\sigma$, $\tau$ and $(a_n)$:
$\sigma$ and $\tau$ are strictly increasing functions $\mathbb{N} \to \mathbb{N}$ with $\sigma(\mathbb{N}) \cup \tau(\mathbb{N}) = \mathbb{N}$;
$\forall \varepsilon > 0\ \exists N_1\ \forall k > N_1\ |a_{\sigma(k)} - l| < \varepsilon$; and
$\forall \varepsilon > 0\ \exists N_2\ \forall k > N_2\ |a_{\tau(k)} - l| < \varepsilon$.
We have to prove $\forall \varepsilon > 0\ \exists N\ \forall n > N\ |a_n - l| < \varepsilon$, so we start by letting $\varepsilon > 0$ be arbitrary, use the convergence of $(a_{\sigma(n)})$ and $(a_{\tau(n)})$ to define a suitable $N$, and then let $n > N$ be arbitrary. All this is as usual. The definition of $N$ is the tricky bit.
Let $\varepsilon > 0$ be arbitrary.
Let $N_1$ satisfy $\forall k > N_1\ |a_{\sigma(k)} - l| < \varepsilon$.
Let $N_2$ satisfy $\forall k > N_2\ |a_{\tau(k)} - l| < \varepsilon$.
Set $N = \max(\sigma(N_1), \tau(N_2))$, and let $n > N$ be arbitrary.
Since $\sigma(\mathbb{N}) \cup \tau(\mathbb{N}) = \mathbb{N}$ we have $n = \sigma(k)$ or $n = \tau(k)$ for some $k \in \mathbb{N}$. For the moment, assume the first, that $n = \sigma(k)$.
Since $\sigma(k) = n > N \geq \sigma(N_1)$, and $\sigma$ is increasing, we must have $k > N_1$; this is because $k \leq N_1$ would imply $\sigma(k) \leq \sigma(N_1) \leq N$.
It follows from the choice of $N_1$ that $|a_{\sigma(k)} - l| < \varepsilon$ and hence $|a_n - l| < \varepsilon$.
The case when $n = \tau(k)$ is handled by an identical argument, using the choice of $N_2$ and the fact that $\tau$ is increasing.
Hence $|a_n - l| < \varepsilon$ in both cases; as $\varepsilon > 0$, and then $n > N$, were arbitrary, this proves that $(a_n)$ converges to $l$.
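As a numerical sanity check (not a proof) of the theorem, the sketch below uses a sequence chosen here for illustration, $a_n = (-1)^n/n$, whose odd- and even-numbered subsequences cover every term and both tend to the same limit $0$.

```python
# Illustration (example sequence assumed for this sketch): a_n = (-1)^n / n.
def a(n):
    return (-1) ** n / n

# The two subsequences cover every index n: sigma(k) = 2k - 1, tau(k) = 2k.
odd = [a(2 * k - 1) for k in range(1, 1001)]
even = [a(2 * k) for k in range(1, 1001)]

# Both subsequences tend to the SAME limit l = 0 ...
print(abs(odd[-1]), abs(even[-1]))  # both tiny
# ... so the theorem predicts the whole sequence tends to 0 as well:
print(max(abs(a(n)) for n in range(100, 1000)))  # also small
```

Note that both hypotheses are used: the subsequences cover every index, and their limits agree.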
One application of this result is given in an exercise sheet. We give one further example here.
The sequence $(a_n)$ defined by $a_1 = 1$ and
$a_{n+1} = 1 + \dfrac{1}{1 + a_n}$
is called the continued fraction for $\sqrt{2}$. As the name suggests, this sequence converges to $\sqrt{2}$.
Full details are left as an exercise (or may be given in lectures). Only a sketch is given here.
The idea is to define the subsequences $(a_{2n-1})$ and $(a_{2n})$. It will turn out in a moment that the subsequence $(a_{2n-1})$ is increasing, $(a_{2n})$ is decreasing, and both converge to $\sqrt{2}$.
By some algebra (exercise!) you should be able to show that the terms of $(a_n)$ two steps apart are related by the formula $a_{n+2} = \dfrac{4 + 3a_n}{3 + 2a_n}$.
Now by straightforward algebra (writing down expressions for $a_{n+2} - a_n$ and $a_{n+2}^2 - 2$) we can prove that
$a_{n+2} - a_n = \dfrac{2(2 - a_n^2)}{3 + 2a_n}$
for each $n$. The same algebra also shows that
$a_{n+2}^2 - 2 = \dfrac{a_n^2 - 2}{(3 + 2a_n)^2}.$
By computing $a_1^2 - 2 = -1$ and $a_2^2 - 2 = \tfrac{1}{4}$ and using induction, it follows from this that
$a_{2n-1} < \sqrt{2} < a_{2n}$ for all $n$, so that $(a_{2n-1})$ is increasing and $(a_{2n})$ is decreasing;
hence both subsequences converge to $\sqrt{2}$ since, by the difference of two squares,
$|a_n - \sqrt{2}| = \dfrac{|a_n^2 - 2|}{a_n + \sqrt{2}} \leq |a_n^2 - 2|,$
and $|a_n^2 - 2| \to 0$ along both subsequences because $a_n \geq 1$ gives $(3 + 2a_n)^2 \geq 25$.
The conclusion that $(a_n)$ converges to $\sqrt{2}$ now follows by the theorem above.
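The behaviour of the two subsequences can be checked numerically. The recursion used below, $a_1 = 1$ and $a_{n+1} = 1 + 1/(1 + a_n)$ for the continued fraction of $\sqrt{2}$, is an assumption of this sketch; the notes define the sequence precisely.

```python
import math

# Sketch, assuming the recursion a_1 = 1, a_{n+1} = 1 + 1/(1 + a_n).
def iterates(count):
    a, terms = 1.0, [1.0]
    for _ in range(count - 1):
        a = 1 + 1 / (1 + a)
        terms.append(a)
    return terms

terms = iterates(12)
odd = terms[0::2]    # a_1, a_3, a_5, ...
even = terms[1::2]   # a_2, a_4, a_6, ...

# The odd-numbered terms increase and the even-numbered terms decrease ...
assert all(x < y for x, y in zip(odd, odd[1:]))
assert all(x > y for x, y in zip(even, even[1:]))
# ... trapping sqrt(2) between them, and both close in on it:
print(odd[-1], even[-1], math.sqrt(2))
```

The factor $1/(3 + 2a_n)^2 \leq 1/25$ in the algebra above explains why the convergence seen here is so fast: the error in $a_n^2 - 2$ shrinks by a factor of at least $25$ every two steps.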
A subsequence of a sequence is an infinite selection of terms from the sequence, taken in the same order. You have seen how to notate and handle subsequences here. If the original sequence converges, then all of its subsequences converge to the same limit. This result is principally used to show that a given sequence does not converge.
Occasionally, a sequence can be proved convergent by considering subsequences separately, and a general result to this effect was proved above. However, care must be taken in using this result and in checking all of its conditions. An example was given, and other examples are studied in the exercises.