A previous web page defined two particular sequences converging to a number which, by experiment, can be shown to be approximately 2.718281828. The very reasonable question arises of whether this number really is the Euler number $e$ familiar from exponentials and natural logarithms. One approach is to define for each real number $x$ the value $E\left(x\right)=\lim_{n\to \infty}{\left(1+\frac{x}{n}\right)}^{n}$ and prove that this function has the expected properties of the exponential function $\exp\left(x\right)$, such as

- $E\left(0\right)=1$
- $E(x+y)=E\left(x\right)\cdot E\left(y\right)$
- $\lim_{h\to 0}\frac{E(x+h)-E\left(x\right)}{h}=E\left(x\right)$

for all $x,y\in \mathbb{R}$.
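None of these properties is hard to check numerically. The following sketch (in Python, and no part of the formal development) approximates $E\left(x\right)$ by taking a single large $n$; the tolerance values are illustrative guesses.

```python
import math

# Approximate E(x) = lim (1 + x/n)^n by a single large n.
# Numerical illustration only, not a proof.
def E(x, n=10**6):
    return (1 + x / n) ** n

print(E(0))                     # exactly 1.0 for every n
print(E(1.5), E(1.0) * E(0.5))  # close: E(x+y) ~ E(x)*E(y)
print(E(1.0), math.e)           # close to Euler's number
```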

An alternative approach is to define the limit of the infinite series

$$1+x+\frac{{x}^{2}}{2!}+\frac{{x}^{3}}{3!}+\dots $$

and prove that this limit is precisely $E\left(x\right)$. Both these methods can be made precise, and will (with some effort) work to make rigorous the real exponential function. Much of this work goes beyond the scope of the present sequence of web pages, however. In particular the notion of a limit of the form $\lim_{h\to 0}\dots$ has not been discussed here.
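As a quick numerical illustration of the second approach (again a sketch only, not part of the rigorous development), the partial sums of the series agree with the limit defining $E\left(x\right)$:

```python
import math

# Partial sums of 1 + x + x^2/2! + x^3/3! + ... ; each term is the
# previous one multiplied by x/(k+1), avoiding explicit factorials.
def series(x, terms=30):
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= x / (k + 1)
    return total

def E(x, n=10**6):
    return (1 + x / n) ** n

print(series(1.0), E(1.0), math.exp(1.0))  # all approximately e
```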

For any doubting students of analysis, or indeed just for interested readers, this web page explores these ideas further. Just for fun, I have chosen to take a very different approach from either of the more usual ones just outlined: I will define the natural logarithm function directly, and apply it to the function $E\left(x\right)$ defined above. Because logarithms use addition where exponentials use multiplication, many of the algebraic manipulations turn out to be easier. However, the material in this web page is sketched in less detail than you will normally find on these web pages and is not part of an official sequences and series module, so it is for interested readers only. You may safely skip to the next page if you wish.

We start by defining two functions using a limiting process similar to things you have already seen.

Suppose $x$ is a positive real number. Then we define

$$l\left(x\right)=\lim_{n\to \infty}n\left(1-{x}^{\frac{-1}{n}}\right)$$

and

$$L\left(x\right)=\lim_{n\to \infty}n\left({x}^{\frac{1}{n}}-1\right)$$

We will spend a bit of time proving that both these limits exist and that in fact they are equal.

To this end, we fix some positive $x\in \mathbb{R}$, and let ${a}_{n}=n\left(1-{x}^{\frac{-1}{n}}\right)$ and ${b}_{n}=n\left({x}^{\frac{1}{n}}-1\right)$.
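Before the proofs, a numerical sketch may help: for $x=2$, tabulating ${a}_{n}$ and ${b}_{n}$ shows ${a}_{n}$ creeping up and ${b}_{n}$ creeping down toward a common value (which will turn out to be $\log 2\approx 0.6931$).

```python
import math

# Tabulate a_n = n(1 - x^(-1/n)) and b_n = n(x^(1/n) - 1) for x = 2.
# Numerical illustration only: a_n rises, b_n falls, with a_n <= b_n,
# and both approach the same limit.
x = 2.0
rows = []
for n in [1, 10, 100, 10000]:
    a = n * (1 - x ** (-1 / n))
    b = n * (x ** (1 / n) - 1)
    rows.append((n, a, b))
    print(n, a, b)
print(math.log(x))  # the common limit, for comparison
```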

Lemma.

For all $n\in \mathbb{N}$, we have ${b}_{n}\geqslant {a}_{n}$.

**Proof.**

Indeed, we have ${x}^{\frac{2}{n}}-2{x}^{\frac{1}{n}}+1=({x}^{\frac{1}{n}}-1)^{2}\geqslant 0$, whence, dividing through by ${x}^{\frac{1}{n}}>0$, we get ${x}^{\frac{1}{n}}-1\geqslant 1-{x}^{\frac{-1}{n}}$, which gives ${b}_{n}\geqslant {a}_{n}$.

Lemma.

For all sufficiently large $k\in \mathbb{N}$ and all $n\geqslant k$ we have ${b}_{n}\leqslant {b}_{k}$.

**Proof.**

Let $k$ be sufficiently large so that $\left|{x}^{\frac{1}{k}}-1\right|<1$. Then

$${x}^{\frac{1}{n}}=\left(1+({x}^{\frac{1}{k}}-1)\right)^{\frac{k}{n}}\leqslant 1+\frac{k}{n}({x}^{\frac{1}{k}}-1)$$

by the exponential inequality, hence

$$n({x}^{\frac{1}{n}}-1)\leqslant k({x}^{\frac{1}{k}}-1)$$

as required.
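The "exponential inequality" invoked here was established on an earlier page; I take it to be the Bernoulli-type bound ${(1+t)}^{\alpha }\leqslant 1+\alpha t$ for $0\leqslant \alpha \leqslant 1$ and $t\geqslant -1$ (an assumption on my part about which form is meant). A quick numerical spot-check:

```python
# Spot-check the assumed inequality (1 + t)^a <= 1 + a*t
# for exponents a in [0, 1] and t >= -1, on a small grid.
# The 1e-12 slack absorbs floating-point rounding.
checked = 0
for a in [0.0, 0.25, 0.5, 0.75, 1.0]:
    for t in [-0.9, -0.5, 0.0, 0.5, 1.0, 5.0]:
        assert (1 + t) ** a <= 1 + a * t + 1e-12
        checked += 1
print(checked, "cases checked")
```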

Lemma.

For all sufficiently large $k\in \mathbb{N}$ and all $n\geqslant k$ we have ${a}_{k}\leqslant {a}_{n}$.

**Proof.**

Let $k$ be sufficiently large so that $\left|{x}^{\frac{-1}{k}}-1\right|<1$. Then

$${x}^{\frac{-1}{n}}=\left(1-(1-{x}^{\frac{-1}{k}})\right)^{\frac{k}{n}}\leqslant 1-\frac{k}{n}(1-{x}^{\frac{-1}{k}})$$

by the exponential inequality, hence

$$n({x}^{\frac{-1}{n}}-1)\leqslant -k(1-{x}^{\frac{-1}{k}})$$

and so

$$n(1-{x}^{\frac{-1}{n}})\geqslant k(1-{x}^{\frac{-1}{k}})$$

as required.

Proposition.

For all real $x>0$ the functions $l\left(x\right)$ and $L\left(x\right)$ are defined and equal.

**Proof.**

The monotone convergence theorem together with the three preceding lemmas already shows that the limits $l\left(x\right)$ and $L\left(x\right)$ are defined with $l\left(x\right)\leqslant L\left(x\right)$, since they show that the sequences $\left({a}_{n}\right)$ and $\left({b}_{n}\right)$ are eventually monotonic and bounded. If $x>1$ then every ${a}_{n}>0$ and $\left({a}_{n}\right)$ is eventually nondecreasing, so $l\left(x\right)>0$; if $x<1$ then every ${b}_{n}<0$ and $\left({b}_{n}\right)$ is eventually nonincreasing, so $L\left(x\right)<0$ and hence $l\left(x\right)\leqslant L\left(x\right)<0$. Moreover, it is easy to see that $l\left(1\right)=L\left(1\right)=0$. Now suppose $x\ne 1$, and let $a=\lim {a}_{n}$ and $b=\lim {b}_{n}$. Then

$$\frac{{b}_{n}}{{a}_{n}}=\frac{{x}^{\frac{1}{n}}-1}{1-{x}^{\frac{-1}{n}}}={x}^{\frac{1}{n}}\left(\frac{1-{x}^{\frac{-1}{n}}}{1-{x}^{\frac{-1}{n}}}\right)={x}^{\frac{1}{n}}\to 1$$

Therefore, by continuity of division at $(b,a)$ (legitimate since $a\ne 0$, as shown above) and uniqueness of limits, $\frac{b}{a}=1$ and hence $a=b$.
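The collapse of the ratio to ${x}^{\frac{1}{n}}$ is an exact algebraic identity, which a quick computation confirms (a numerical sketch only):

```python
# Check numerically that b_n / a_n = x^(1/n), so the ratio tends to 1.
x = 3.0
for n in [1, 10, 1000]:
    a = n * (1 - x ** (-1 / n))
    b = n * (x ** (1 / n) - 1)
    assert abs(b / a - x ** (1 / n)) < 1e-9
print("ratio identity confirmed for x =", x)
```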

Definition.

For $x>0$ we define $\log\left(x\right)$, the natural logarithm of $x$, to be the common value of $l\left(x\right)$ and $L\left(x\right)$.

Having defined this function, we want to explore some of its properties. In particular, we would like to know that it behaves as a logarithm function and is the inverse to the $E\left(x\right)$ function. Some properties emerge remarkably quickly. For example, if $x>0$ then $\log\left({x}^{2}\right)=2\log\left(x\right)$. More generally,

Lemma.

For each $x>0$ and $k\in \mathbb{N}$ we have $\log\left({x}^{k}\right)=k\log\left(x\right)$.

**Proof.**

$$\log\left({x}^{k}\right)=\lim_{n\to \infty}n\left({x}^{\frac{k}{n}}-1\right)=\lim_{n\to \infty}k\cdot \frac{n}{k}\left({x}^{\frac{k}{n}}-1\right)=k\lim_{n\to \infty}\frac{n}{k}\left({x}^{\frac{k}{n}}-1\right)=k\log\left(x\right)$$

using the continuity of scalar multiplication.

Lemma.

For each $x>0$ we have $\log\left({x}^{-1}\right)=-\log\left(x\right)$.

**Proof.**

We have

$$L\left({x}^{-1}\right)=\lim_{n\to \infty}n\left({\left({x}^{-1}\right)}^{\frac{1}{n}}-1\right)=\lim_{n\to \infty}n\left({x}^{\frac{-1}{n}}-1\right)=\lim_{n\to \infty}-n\left(1-{x}^{\frac{-1}{n}}\right)=-l\left(x\right)$$

and this together with $l\left(x\right)=L\left(x\right)$ proves the result.
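Both lemmas can be sanity-checked numerically using the finite approximation $n\left({x}^{\frac{1}{n}}-1\right)$ with one large $n$ (a sketch only; the tolerances are illustrative):

```python
import math

# Approximate log via its defining limit, with one large n.
def log_approx(x, n=10**7):
    return n * (x ** (1 / n) - 1)

x = 2.0
print(log_approx(x ** 3), 3 * log_approx(x))  # log(x^3) = 3 log(x)
print(log_approx(1 / x), -log_approx(x))      # log(1/x) = -log(x)
print(log_approx(x), math.log(x))             # agrees with math.log
```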

Proposition.

The function $\log:(0,\infty )\to \mathbb{R}$ is everywhere nondecreasing and unbounded.

**Proof.**

Suppose $x\geqslant y\geqslant 1$. Then $n({x}^{\frac{1}{n}}-1)\geqslant n({y}^{\frac{1}{n}}-1)$, hence taking limits $\log\left(x\right)\geqslant \log\left(y\right)$. A similar argument applies to $x,y\leqslant 1$, and the case $x\geqslant 1\geqslant y$ follows since $\log$ is nonnegative on $[1,\infty )$ and nonpositive on $(0,1]$. To see that $\log$ is unbounded, let $x=2$ and note that $\log\left(x\right)>0$. Then $\log\left({x}^{k}\right)=k\log\left(x\right)$ can be made as large as we like by choosing $k\in \mathbb{N}$ large, and $\log\left({x}^{-k}\right)=-k\log\left(x\right)$ can be made as small as we like.

The argument just given does not show that $\log$ is increasing, as $<$ is not necessarily preserved in limits. However, we will prove in a moment that $\log$ is in fact increasing.

Proposition.

For each $x\in \mathbb{R}$ we have $\log\left(E\left(x\right)\right)=x$.

**Proof.**

By definition $E\left(x\right)=\lim_{n\to \infty}{\left(1+\frac{x}{n}\right)}^{n}=\lim_{n\to \infty}{\left(1+\frac{x}{n}\right)}^{n+1}$, where the first sequence is eventually monotonic nondecreasing and the second eventually monotonic nonincreasing. Furthermore $\log\left(x\right)=\lim_{n\to \infty}n(1-{x}^{\frac{-1}{n}})=\lim_{n\to \infty}n({x}^{\frac{1}{n}}-1)$, with the first sequence eventually monotonic nondecreasing and the second eventually monotonic nonincreasing. Putting these together we have

$$n\left({\left(1+\frac{x}{n}\right)}^{\frac{n+1}{n}}-1\right)\geqslant \log\left(E\left(x\right)\right)\geqslant n\left(1-{\left({\left(1+\frac{x}{n}\right)}^{n}\right)}^{\frac{-1}{n}}\right)=n\left(1-\frac{1}{1+\frac{x}{n}}\right)$$

for all sufficiently large $n\in \mathbb{N}$. Write ${r}_{n}$ for the left hand side and ${s}_{n}$ for the right hand side. The right hand side equals $n(1-\frac{n}{n+x})=\frac{nx}{n+x}$, which converges to $x$ as $n\to \infty $. The left hand side is

$$n\left(\left(1+\frac{x}{n}\right){\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}-1\right)=n\left({\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}-1\right)+x{\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}$$

and $x{\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}\to x$ as $n\to \infty $, since $1+\frac{x}{n}\to 1$ and hence ${\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}\to 1$. Also $n\left({\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}-1\right)\to 0$. This can be proved (for $x\geqslant 0$; the case $x<0$ is similar) by taking an arbitrary $\epsilon >0$ and choosing $N\in \mathbb{N}$ such that $\frac{x}{N}<\epsilon $. Then for all $n\geqslant N$ we have ${\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}\leqslant 1+\frac{1}{n}\cdot \frac{x}{n}$ by the exponential inequality, and so $0\leqslant n\left({\left(1+\frac{x}{n}\right)}^{\frac{1}{n}}-1\right)\leqslant \frac{x}{n}\leqslant \frac{x}{N}<\epsilon $. This shows that

$${r}_{n}\geqslant \log\left(E\left(x\right)\right)\geqslant {s}_{n}$$

where both sequences $\left({r}_{n}\right)$ and $\left({s}_{n}\right)$ converge to $x$. So $\log\left(E\left(x\right)\right)=x$, as required.
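A numerical sketch of the squeeze (taking $x=1.5$ and one moderately large $n$; illustration only):

```python
# With r_n and s_n as in the proof, r_n >= log(E(x)) >= s_n and both
# sides tend to x.  Here x = 1.5 and n = 10**5.
x, n = 1.5, 10**5
r = n * ((1 + x / n) ** ((n + 1) / n) - 1)  # upper bound r_n
s = n * x / (n + x)                         # lower bound s_n, simplified
print(r, s)  # both close to x = 1.5
```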

Proposition.

The function $\log:(0,\infty )\to \mathbb{R}$ is everywhere increasing, bijective, and continuous.

**Proof.**

We already know that $\log$ is nondecreasing. To see that it is increasing, suppose for a contradiction that $1<x<y$ with $\log\left(x\right)=\log\left(y\right)$. From ${x}^{\frac{1}{n}}\to 1$ we know that there is $n\in \mathbb{N}$ such that ${x}^{\frac{1}{n}}<\frac{y}{x}$. In particular, letting $u={x}^{\frac{1}{n}}$, we have $u>1$ and $x={u}^{n}<{u}^{n+1}\leqslant y$. Since $\log$ is nondecreasing, we deduce that $\log\left(x\right)=n\log\left(u\right)\leqslant (n+1)\log\left(u\right)\leqslant \log\left(y\right)$. As by supposition $\log\left(x\right)=\log\left(y\right)$, we deduce that $\log\left(u\right)=0$. But as $u>1$ the sequence $\left({u}^{n}\right)$ is unbounded, so for each real $z>1$ there is $k\in \mathbb{N}$ such that $z\leqslant {u}^{k}$ and hence $0\leqslant \log\left(z\right)\leqslant \log\left({u}^{k}\right)=k\log\left(u\right)=0$. It follows that $\log\left(z\right)=0$ for all $z>1$, contradicting the previous proposition.

The case when $x<y<1$ is similar, and the remaining case $x<1<y$ reduces to these two: if $\log\left(x\right)=\log\left(y\right)$ then, since $\log$ is nondecreasing and $\log\left(1\right)=0$, both values equal $0$, and the pair $x<1$ falls under the second case.

Continuity follows from a general result that says a nondecreasing function $\log:(0,\infty )\to \mathbb{R}$ which takes all values in $\mathbb{R}$ is automatically continuous. Note that $\log$ does take all values, since $\log\left(E\left(x\right)\right)=x$ for every $x\in \mathbb{R}$; this also gives surjectivity, and injectivity is immediate from the fact that $\log$ is increasing, so $\log$ is bijective.

The notion of derivative is outside the scope of these notes. However, you may be familiar enough with it to understand the following argument.

Proposition.

The function $\log:(0,\infty )\to \mathbb{R}$ has derivative $\log'\left(x\right)=\frac{1}{x}$.

**Proof.**

By definition,

$$\log'\left(x\right)=\lim_{h\to 0}\frac{\log\left(x+h\right)-\log\left(x\right)}{h}$$

or, as $\lim_{n\to \infty}{x}^{\frac{1}{n}}=1$, we can use ${x}^{\frac{n+1}{n}}-x$ as a possible value for a positive $h$. (This is justified by the facts that ${x}^{\frac{n+1}{n}}-x\to 0$ and that $\log$ is continuous and increasing. Strictly speaking, since we are only looking at positive $h$ we are computing a one-sided derivative. The derivative on the left can be computed and found equal to the right derivative by a similar argument.) This gives

$$\frac{\log\left({x}^{\frac{n+1}{n}}\right)-\log\left(x\right)}{{x}^{\frac{n+1}{n}}-x}=\frac{\frac{n+1}{n}\log\left(x\right)-\log\left(x\right)}{x\left({x}^{\frac{1}{n}}-1\right)}=\frac{\log\left(x\right)}{x\cdot n\left({x}^{\frac{1}{n}}-1\right)}$$

(using $\log\left({x}^{\frac{n+1}{n}}\right)=\frac{n+1}{n}\log\left(x\right)$, which follows from the lemma $\log\left({x}^{k}\right)=k\log\left(x\right)$ applied to ${x}^{\frac{1}{n}}$), which in the limit as $n\to \infty $ equals

$$\frac{\log\left(x\right)}{x\log\left(x\right)}=\frac{1}{x}$$

as required, at least when $\log\left(x\right)\ne 0$; the point $x=1$ needs a separate direct computation.
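The argument can be illustrated numerically: using the particular steps $h={x}^{\frac{n+1}{n}}-x$ and the finite approximation to $\log$, the difference quotients settle down to $\frac{1}{x}$ (a sketch only, with illustrative tolerances):

```python
# Difference quotients of an approximate log at x = 2, with the step
# h = x^((n+1)/n) - x from the proof; they approach 1/x = 0.5.
def log_approx(t, m=10**7):
    return m * (t ** (1 / m) - 1)

x = 2.0
for n in [10, 100, 10000]:
    h = x ** ((n + 1) / n) - x
    q = (log_approx(x + h) - log_approx(x)) / h
    print(n, q)
print(1 / x)
```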

Proposition.

The function $\log:(0,\infty )\to \mathbb{R}$ satisfies $\log\left(xy\right)=\log\left(x\right)+\log\left(y\right)$ for all $x,y>0$.

**Proof.**

(Sketch.)

Given $x$ and $y$, which for simplicity we assume are greater than $1$, let ${a}_{n}={x}^{\frac{1}{n}}$, so ${a}_{n}^{n}=x$, and let ${k}_{n}$ be the unique integer with ${a}_{n}^{{k}_{n}}\leqslant y<{a}_{n}^{{k}_{n}+1}$. Then $y=\lim_{n\to \infty}{a}_{n}^{{k}_{n}}$ and

$$\log\left(xy\right)=\lim_{n\to \infty}\log\left({a}_{n}^{n+{k}_{n}}\right)=\lim_{n\to \infty}\left(\log\left({a}_{n}^{n}\right)+\log\left({a}_{n}^{{k}_{n}}\right)\right)=\log\left(x\right)+\log\left(y\right)$$

by continuity of various functions including $\log$.
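A numerical sketch of this construction (with $x=2$, $y=3$; the use of `math.log` to locate ${k}_{n}$ is just a computational shortcut, not part of the argument):

```python
import math

# With a_n = x^(1/n), find the unique integer k_n with
# a_n^k_n <= y < a_n^(k_n + 1); then a_n^(n + k_n) approximates x*y.
x, y = 2.0, 3.0
n = 10**6
a = x ** (1 / n)
k = math.floor(math.log(y) / math.log(a))  # shortcut to locate k_n
assert a ** k <= y < a ** (k + 1)
print(a ** (n + k), x * y)  # nearly equal, so log(xy) = log x + log y
```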