# Exponential Function Product Rule

## A first definition of exponential and logarithmic functions

### Motivation from compound interest

Increasing the number of times that the interest is compounded each year ($n$) increases the final amount of money ($A$):

$$A = P \left(1 + \frac{r}{n}\right)^{nt}.$$

Here $P$ is the principal, $t$ is the number of years, and $r$ is the interest rate per annum.

As $n$ becomes large, we can approximate the amount by

$$A = P e^{rt}.$$

The value of $A$ in the above formula is the amount compounded continuously.
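A quick numerical sketch of this convergence (the principal, rate, and horizon below are arbitrary sample values, not from the text): as $n$ grows, discrete compounding approaches the continuous formula.

```python
import math

P, r, t = 1000.0, 0.05, 10  # assumed: principal, 5% annual rate, 10 years

# Discrete compounding: A = P (1 + r/n)^(n t), for increasing n
amounts = {n: P * (1 + r / n) ** (n * t) for n in (1, 12, 365, 100_000)}
for n, A in amounts.items():
    print(f"n = {n:>6}: A = {A:.4f}")

# Continuous compounding: A = P e^(r t)
A_cont = P * math.exp(r * t)
print(f"continuous: A = {A_cont:.4f}")
```

The discrete amounts increase with $n$ and approach (from below) the continuously compounded amount.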

Questions

1. What is (the definition of) the constant $e$?
2. In practice, $r$ is a rational number (i.e. $r \in \Q$). What if $r$ were irrational? Then we would run into irrational indices.

### Goal

To store the technical details somewhere on the web.

1. To give a logical introduction to two elementary functions: the exponential function and the logarithm.
2. To fill the gap between rational and irrational indices.

### Raison d’être

In exam-oriented secondary education, students are required to make calculations using their scientific calculators. It is convenient for school teachers to extrapolate the laws of indices from $\Q$ to their counterparts on $\R$ without justification, and to ask students to write down the calculator’s output (say, for $2^\pi$). Math teachers aren’t necessarily math majors (like me), so some of them might not even know how these two elementary functions are defined and developed. Their job is to pass the classic formulae on to students. As a result, students resort to rote learning or muscle memory. The former is prone to mistakes, especially under exam stress, and this leads to low achievement.

This article aims to fill this gap for interested students. In the first few big proofs, I only use integer exponents, so knowing the basic definition of indices suffices.

### Facts needed

1. Product rule of limits: $\lim (A \cdot B) = \left(\lim A \right) \left(\lim B \right)$, provided both limits on the RHS exist.

2. Binomial Theorem:

$$(a+b)^n = \sum_{k = 0}^n \binom{n}{k} a^{n-k} b^{k}.$$
3. Geometric sum formula (finite version):

$$\sum_{i = 0}^n ar^i = \frac{a(1-r^{n+1})}{1-r}, \; r \ne 1.$$
4. Limit exists implies boundedness: if $\lim_{n\to+\infty} a_n$ exists, then there exists a constant $M>0$ such that for all $n$, $\lvert a_n \rvert \le M$.

5. Monotone Convergence Theorem (every bounded monotone sequence of real numbers converges) and monotonicity of limits:

$$\forall n: a_n \le b_n \implies \lim_n a_n \le \lim_n b_n,$$

provided both limits exist.
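Facts 2 and 3 can be spot-checked numerically before we rely on them (a sketch with arbitrarily chosen values, using the standard-library `math.comb`):

```python
import math

a, b, n = 1.5, -0.7, 8

# Fact 2: binomial theorem, (a+b)^n = sum_k C(n,k) a^(n-k) b^k
lhs = (a + b) ** n
rhs = sum(math.comb(n, k) * a ** (n - k) * b ** k for k in range(n + 1))
assert abs(lhs - rhs) < 1e-9

# Fact 3: finite geometric sum, r != 1
a0, r = 2.0, 0.5
assert abs(sum(a0 * r ** i for i in range(n + 1))
           - a0 * (1 - r ** (n + 1)) / (1 - r)) < 1e-12

print("Facts 2 and 3 check out")
```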

### Recall the definition of exponential function

For any real number $s$, the limit $\lim_{n\to+\infty} (1+s/n)^n$ exists, and it’s denoted by $e^s$, or $\exp(s)$.
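Before proving this, here is a numerical sanity check that the defining sequence does settle down (comparing against the library’s `math.exp` only as an independent reference; $s = 1.3$ is an arbitrary choice):

```python
import math

s = 1.3  # any real number
for n in (10, 1000, 100_000):
    print(n, (1 + s / n) ** n)   # the defining sequence
print("math.exp:", math.exp(s))  # reference value
```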

Proof:

We’ll show that the sequence with general term $(1 + s/n)^n$ is bounded for every $s$, and strictly increasing for $s > 0$, so that we can apply the Monotone Convergence Theorem to establish the existence of the desired limit for $s > 0$. After that, we’ll use the quotient rule for limits to handle negative $s$.

#### Boundedness

The main idea is that the factorial function $k \mapsto k!$ grows much faster than any exponential function (with a fixed base $m$) $k \mapsto m^k$.

• Our definition of the exponential function is a limit of a binomial,
• which can be expanded into a sum involving binomial coefficients,
• each term of which is a fraction with $k!$ in the denominator.
• By bounding $1/k!$ above using $k! \ge 2^{k-1}$, each summand is dominated by a term of a geometric sequence with common ratio strictly smaller than $1$ (when $\lvert s \rvert < 2$), so we can apply the finite geometric sum formula (then discard the negative term in the numerator) to set up an upper bound independent of $n$.
\begin{align} &\;\left(1+\frac{s}{n}\right)^n \\ =& \sum_{k=0}^n \binom{n}{k} \left(\frac{s}{n}\right)^k \\ \le& \sum_{k=0}^n \binom{n}{k} \left(\frac{\lvert s \rvert}{n}\right)^k \\ =& 1 + \lvert s \rvert + \sum_{k=2}^n \frac{\lvert s \rvert^k}{k!} \, \frac{n (n-1) \cdots (n+1-k)}{n^k} \\ <& 1+\lvert s\rvert+\sum_{k=2}^n 2\left(\frac{\lvert s\rvert}{2}\right)^k (1) \\ =& 1 + \lvert s \rvert + 2\left(\frac{\lvert s \rvert}{2}\right)^2 \frac{1-(\lvert s \rvert/2)^{n-1}}{1-\lvert s \rvert/2} \end{align}

The above argument only works for $\lvert s \rvert < 2$. For larger $\lvert s \rvert$, we need to change ‘$2$’ to a larger integer $M$, so that the geometric sum formula (for finitely many terms) can be applied. We choose this $M \in \N$ so that $2 \le \lvert s \rvert < M$. It’s clear that $M \ge 3$.

For the case $\lvert s \rvert \ge 2$, in the sum (4), we discard the fraction on the right (by bounding it above by $1$), and we divide the sum into two parts

$$1 + \lvert s \rvert + \sum_{k=2}^{M-1} \frac{\lvert s \rvert^k}{k!} + \sum_{k=M}^n \frac{\lvert s \rvert^k}{k!}.$$

We’ll only focus on the last term (the “tail”).

It took me a while to work out the tedious details.

\begin{align} k! =& k (k - 1) \cdots M (M - 1) \cdots 2 \cdot 1 \\ \ge& M^{k - M + 1} \cdot 2^{M - 2} = \frac12 \left(\frac2M\right)^{M - 1} M^k \\ \frac{\lvert s \rvert^k}{k!} \le& 2 \left(\frac{M}{2}\right)^{M - 1} \left(\frac{\lvert s \rvert}{M}\right)^k \\ \sum_{k=M}^n \frac{\lvert s \rvert^k}{k!} \le& 2 \left(\frac{M}{2}\right)^{M - 1} \left(\frac{\lvert s \rvert}{M}\right)^M \frac{1-(\lvert s \rvert/M)^{n - M + 1}}{1-(\lvert s \rvert/M)} \\ <& \frac{4}{M - \lvert s \rvert} \left(\frac{\lvert s \rvert}{2}\right)^M \end{align}

#### Strictly increasing

Suppose that $s > 0$.

\begin{align} & \; \left(1 + \frac{s}{n + 1}\right)^{n + 1} \\ =& 1 + s + \sum_{k=2}^{n+1} \frac{s^k}{k!} \frac{n+1}{n+1} \, \frac{n}{n+1} \cdots \frac{n+2-k}{n+1} \\ >& 1 + s + \sum_{k=2}^n \frac{s^k}{k!} \frac{n}{n} \, \frac{n-1}{n} \cdots \frac{n+1-k}{n} \\ =& \left(1 + \frac{s}{n} \right)^{n} \end{align}

#### Negative exponents

For any $s > 0$, we’re going to show that

$$\lim_{n\to+\infty} \left(1+\frac{-s}{n}\right)^n \left(1+\frac{s}{n}\right)^n=1.$$
\begin{align} &\; \left(1+\frac{-s}{n}\right)^n \left(1+\frac{s}{n}\right)^n \\ =& \left(1-\frac{s^2}{n^2}\right)^n \\ =& 1 + \sum_{k=1}^n \binom{n}{k} (-1)^k \left(\frac{s}{n}\right)^{2k} \end{align}

We claim that the last term tends to zero as $n \to +\infty$.

\begin{align} &\; \left\lvert \sum_{k=1}^n \binom{n}{k} (-1)^k \left(\frac{s}{n}\right)^{2k} \right\rvert \\ \le& \sum_{k=1}^n \binom{n}{k} \left(\frac{s}{n}\right)^{2k} \\ =& \sum_{k=1}^n \frac{1}{k!} \, \frac{n \cdots (n+1-k)}{n^k} \left(\frac{s^2}{n}\right)^k \\ <& \frac{s^2}{n} \, \frac{1-(s^2/n)^n}{1-s^2/n} \\ <& \frac{s^2}{n - s^2} \xrightarrow{n \to +\infty} 0 \end{align}

This establishes the limit stated at the beginning of this subsection. Combine it with the case $s>0$ (which we’ve just proved) and the quotient rule for limits to see that

$$\lim_{n\to+\infty} \left(1-\frac{s}{n}\right)^n = \frac{1}{e^s}.$$

Exercise: try to find a gap in the above argument.

From the above proof, we have the identity $e^s \cdot e^{-s} = 1$.

Remark: Here the condition that $s > 0$ is needed. You may experiment with large negative numbers for $s$ (say $s = -100$) and a short array of large consecutive values of $n$ (say $n \in \lbrace 61, 62, \dots, 80 \rbrace$) to see the oscillation of the sequence $\left((1+s/n)^n\right)_n$.
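The experiment suggested in the remark is easy to run, using exactly the values mentioned there:

```python
s = -100.0
# While 1 + s/n < 0 (here: while n < 100), the sign of (1 + s/n)^n
# alternates with the parity of n, so the sequence oscillates.
for n in range(61, 81):
    print(n, (1 + s / n) ** n)
```

The printed values alternate in sign: negative for odd $n$, positive for even $n$.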

### Main result

The well-known basic identity $e^{s+t} = e^s \cdot e^t$.

### Derivation

\begin{align} & \; e^s \cdot e^t \\ =& \lim_{n\to+\infty} \left(1+\frac{s}{n}\right)^n \cdot \lim_{n\to+\infty} \left(1+\frac{t}{n}\right)^n \\ =& \lim_{n\to+\infty} \left(1+\frac{s+t}{n}+\frac{st}{n^2}\right)^n \\ =& \lim_{n\to+\infty} \sum_{k=0}^n \binom{n}{k} \left(1+\frac{s+t}{n}\right)^{n-k} \left(\frac{st}{n^2}\right)^k \end{align}

We’re going to see that only the first term in the above sum is relevant.

Claim:

$$\lim_{n\to+\infty} \sum_{k=1}^n \binom{n}{k} \left(1+\frac{s+t}{n}\right)^{n-k} \left(\frac{st}{n^2}\right)^k=0$$

Proof:

The trick is to find an upper bound which is a term of a geometric sequence.

\begin{align} &\;\binom{n}{k}\left(1+\frac{s+t}{n}\right)^{n-k}\left(\frac{st}{n^2}\right)^k\\ =& \frac{1}{k!} \left(\frac{n}{n}\,\frac{n-1}{n}\cdots\frac{n+1-k}{n}\right) \left(1+\frac{s+t}{n}\right)^n \left(\frac{st}{n+s+t}\right)^k \\ <& (1)(1 \cdot 1 \cdots 1)(M) \left(\frac{st}{n+s+t}\right)^k \end{align}

Here $M > 0$ is an upper bound for the sequence $\left(\left(1+(s+t)/n\right)^n\right)_n$; such a bound exists by Fact 4, because the limit $\lim_{n\to+\infty} (1+(s+t)/n)^n$ exists (the very fact that allows the definition of the exponential function).

From the above inequality, we have

\begin{align} &\;\sum_{k=1}^n \binom{n}{k} \left(1+\frac{s+t}{n}\right)^{n-k} \left(\frac{st}{n^2}\right)^k \\ <& \sum_{k=1}^n M \left(\frac{st}{n+s+t}\right)^k \\ =& M \left(\frac{st}{n+s+t}\right) \frac{1 - \left(\frac{st}{n+s+t}\right)^n}{1 - \frac{st}{n+s+t}} \\ <& M \left(\frac{st}{n+s+t}\right) \frac{1}{1 - \frac{st}{n+s+t}} \\ =& \frac{Mst}{n+s+t-st} \xrightarrow{n\to+\infty} 0 \end{align}

Apply this limit to the equation before the claim to see that only the $k = 0$ term of the sum survives.

\begin{align} e^s \cdot e^t =& \lim_{n\to+\infty} \left(1+\frac{s}{n}\right)^n \lim_{n\to+\infty} \left(1+\frac{t}{n}\right)^n \\ =& \lim_{n\to+\infty} \left(1+\frac{s+t}{n}\right)^n = e^{s+t} \end{align}
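A numerical illustration of the identity just derived, evaluating the defining sequence at one large fixed $n$ ($s$, $t$, and $n$ below are arbitrary choices):

```python
import math

s, t, n = 0.4, -1.1, 10_000_000

def approx(x):
    """The defining sequence (1 + x/n)^n, evaluated at a large fixed n."""
    return (1 + x / n) ** n

lhs = approx(s) * approx(t)  # approximates e^s * e^t
rhs = approx(s + t)          # approximates e^(s+t)
print(lhs, rhs, math.exp(s + t))  # all three should be close
```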

### Definition of log function

To see that the exponential function $\exp:\R\to\R_{+}$ is strictly increasing, we apply the above multiplicative formula. Suppose that $s < t$; we want to compare $e^s$ with $e^t$. We have $e^t = e^s \cdot e^{t-s}$, so once we show that $e^{t-s}>1$, we’re done. Note that the exponent $t-s>0$, so from the first four lines in the subsection Boundedness (keeping only the first two terms in the fourth line), we see that $e^{t-s} \ge 1+(t-s)>1$.

Thus, the exponential function is strictly increasing, hence injective. Granting that it is also surjective onto $\R_{+}$ (this requires the continuity of $\exp$ together with the Intermediate Value Theorem, which we omit here), it is bijective, so it has an inverse function, which is called the natural log and is denoted by $\ln$. It’s clear that $e^s = a \iff s = \ln(a)$. Two useful forms of this “$\iff$” statement:

1. $\ln(e^s) (= \ln(a)) = s$ (i.e. $e^s \overset{\ln}{\mapsto} s$)
2. $e^{\ln(a)} (= e^s) = a$ (i.e. $\ln(a) \overset{\exp}{\mapsto} a$).

### Corollary

$$\ln(ab) = \ln(a) + \ln(b)$$

Proof: write $a = e^s$ and $b = e^t$. Then $ab = e^{s + t}$. Take log on both sides to see that $\ln(ab) = s + t = \ln(a) + \ln(b)$.

### General definition of exponents

Let $a > 0$ and $s \in \R$. We define $a^s := e^{s\ln(a)}$. It’s read as “$a$ to the power $s$”.
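This definition is exactly what lets a calculator evaluate the motivating example $2^\pi$ from the introduction; a minimal sketch:

```python
import math

# 2^pi via the definition a^s := e^(s ln a)
val = math.exp(math.pi * math.log(2))
print(val)            # approximately 8.82
print(2 ** math.pi)   # the built-in power agrees
```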

### Rediscovery of law of indices

Let $a > 0$, and $s,t \in \R$.

#### Zero power: $a^0 = 1$

$$a^0 = e^{0 \ln (a)} = e^0 = 1$$

The last equality is from our definition of the exponential function with $s=0$, so that the expression on the right of the $\lim$ symbol becomes the constant $1$.

Exercise: What is $a^1$?

#### Negative power: $a^{-s} = 1 / a^s$

Recall: at the end of the subsection Negative Exponents, we have the identity $e^t \cdot e^{-t} = 1$, so $e^{-t} = \left(e^t\right)^{-1} = 1 / e^t$ for all $t \in \R$. Put $t = s\ln(a)$, which is the exponent on the RHS of $a^s = e^{s\ln(a)}$ to see that $a^{-s} = 1 / e^{s\ln(a)} = 1 / a^s$.

#### Multiplication: $a^s \cdot a^t = a^{s + t}$.

$$a^s \cdot a^t = e^{s\ln(a)} \cdot e^{t\ln(a)} = e^{s \ln (a) + t \ln(a)} = e^{(s+t) \ln (a)}$$

#### Division: $a^s / a^t = a^{s - t}$

$$a^s / a^t = a^{s} \cdot a^{-t} = a^{s + (-t)} = a^{s - t}$$

#### Power of a power: $(a^s)^t = a^{st}$

$$(a^s)^t = (e^{s\ln(a)})^t = e^{t\ln\left(e^{s\ln(a)}\right)} = e^{t(s\ln(a))} = e^{st\ln(a)} = a^{st}$$

#### Multiplication with same index: $(ab)^s = a^s \cdot b^s$

\begin{align} (ab)^s =& e^{s\ln(ab)} \\ =& e^{s[\ln(a) + \ln(b)]} \\ =& e^{s\ln(a)} \cdot e^{s\ln(b)} \\ =& a^s \cdot b^s \end{align}

### Rediscovery of law of logarithms

Let $a,b > 0$ with $a \ne 1$, and $s \in \R$ such that $a^s = b$. A textbook defines $\log_a b$ to be $s$. Using our definition of exponents, we can easily retrieve the change of base formula

\begin{align} e^{s \ln(a)} = a^s &= b \\ s \ln(a) &= \ln(b) \\ s &= \frac{\ln(b)}{\ln(a)}. \end{align}
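The change-of-base formula in action (the base and argument are arbitrary sample values; `math.log(b, a)` is the library’s two-argument logarithm, shown only as a cross-check):

```python
import math

a, b = 2.0, 10.0
s = math.log(b) / math.log(a)  # log_2(10) via the change-of-base formula
print(s)
print(a ** s)                  # recovers b, i.e. a^s = b
print(math.log(b, a))          # library's log base a agrees
```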

Here are two conventions.

| base $a$ | $\log_a$ written as | name |
| --- | --- | --- |
| $e$ | $\ln$ | natural log |
| $10$ | $\log$ | common log |

I’ll leave the derivation of the remaining laws as an exercise, so that you can practice your understanding of the above definitions and formulae.