Muhammad Haris Rao
The logarithm function will be henceforth defined as \begin{align*} \log : \mathbb{R}_{> 0} \longrightarrow \mathbb{R}, x \longmapsto \int_1^x \frac{1}{t} \, dt \end{align*} This is a well-defined function by existence of Riemann integrals for functions which are continuous over the region of integration. We have by the fundamental theorem of calculus that the logarithm is not only continuous, but continuously differentiable with derivative \begin{align*} \frac{d}{dx} \log{x} &= \frac{d}{dx} \int_1^x \frac{1}{t} \, dt = \frac{1}{x} \end{align*} We first make the following claim:
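As a quick numerical sanity check of this definition (a Python sketch, not part of the formal development; the helper name `log_by_integral` is our own), a midpoint Riemann sum for $\int_1^x \frac{1}{t} \, dt$ agrees with the standard library logarithm:

```python
# Approximate log(x) = integral of 1/t from 1 to x with a midpoint
# Riemann sum, and compare against math.log.
import math

def log_by_integral(x, n=100000):
    """Midpoint-rule approximation of the integral of 1/t over [1, x]."""
    h = (x - 1) / n
    return sum(h / (1 + (k + 0.5) * h) for k in range(n))

print(log_by_integral(2.0))   # close to math.log(2) ~ 0.6931
```

The midpoint rule converges quadratically in the step size, so even this crude discretization matches `math.log` to many digits.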
Proposition: The logarithm is a bijection between $\mathbb{R}_{> 0}$ and $\mathbb{R}$.
Proof. We first show surjectivity. It suffices to show that $\log$ is unbounded above and below, as then the intermediate value theorem shows that it surjects onto $\mathbb{R}$. For $N \in \mathbb{Z}_{> 0}$, we have by considering Riemann sums \begin{align*} \log{(N + 1)} &= \int_1^{N + 1} \frac{1}{t} \, dt \ge \sum_{k = 1}^N \frac{1}{k + 1} = -1 + \sum_{k = 1}^{N + 1} \frac{1}{k} \end{align*} Since the harmonic series diverges, the function is unbounded above. Differentiating using the chain rule yields \begin{align*} \frac{d}{dx} \log{\frac{1}{x}} &= - \frac{1}{x^2} \frac{1}{1/x} = - \frac{1}{x} \end{align*} Hence, we have \begin{align*} \frac{d}{dx} \left( \log{x} + \log{\frac{1}{x}} \right) &= \frac{1}{x} - \frac{1}{x} = 0 \end{align*} This can only mean $\log{x} + \log{1/x}$ is constant for all $x \in \mathbb{R}_{> 0}$. Taking $x = 1$ shows that this constant is 0, and so $\log{1/x} = - \log{x}$. Hence, for $N$ as before \begin{align*} \log{\frac{1}{N}} = - \log{N} \end{align*} Because $\log{N}$ can be made arbitrarily large, $\log{1/N}$ can be made arbitrarily negative. Hence, the logarithm is also unbounded below. By continuity, it surjects onto $\mathbb{R}$. To prove that it is injective, suppose $\log{x} = \log{y}$ for some $0 < x \le y$. If $x \ne y$, then by Rolle's theorem there is $z \in (x, y)$ such that \begin{align*} 0 = \left[ \frac{d}{dt} \log{t} \right]_{t = z} = \frac{1}{z} \end{align*} which is absurd. So in fact $x = y$, and the logarithm is injective.
The bijectivity of the logarithm allows us to define the exponential \begin{align*} \exp : \mathbb{R} \longrightarrow \mathbb{R}_{> 0} \end{align*} as the inverse function. This is continuous:
Proposition: The exponential function is continuous at $0$.
Proof. For this, let $\varepsilon \in (0, 1)$, and let $x \in \left( \log{(1 - \varepsilon)}, \log{(1 + \varepsilon)} \right)$. Then we have from the fact that the exponential is strictly increasing that \begin{align*} 1 - \varepsilon = \exp{\left( \log{(1 - \varepsilon)} \right)} < \exp{(x)} < \exp{\left( \log{(1 + \varepsilon)} \right)} = 1 + \varepsilon \end{align*} Thus, $x \in \left( \log{(1 - \varepsilon)}, \log{(1 + \varepsilon)} \right)$ implies that $|\exp(x) - 1| < \varepsilon$. Since $ \left( \log{(1 - \varepsilon)}, \log{(1 + \varepsilon)} \right)$ is open and contains 0, we can find $\delta > 0$ such that the $\delta$-neighborhood around 0 is contained in $\left( \log{(1 - \varepsilon)}, \log{(1 + \varepsilon)} \right)$. Then $|x| < \delta$ implies $|\exp{(x)} - 1| < \varepsilon$, which proves continuity at $x = 0$.
Proposition: The derivative of the exponential at the origin is 1.
Proof. See that if we take $h = \log{(1 + h^\prime)}$, then \begin{align*} \frac{\exp(h) - 1}{h} &= \frac{\exp{\left( \log{(1 + h^\prime)} \right)} - 1}{\log{(1 + h^\prime)}} = \frac{( 1 + h^\prime) - 1}{\log{(1 + h^\prime)} - \log{1}} = \left( \frac{\log{( 1 + h^\prime)} - \log{1}}{h^\prime} \right)^{-1} \end{align*} Recall that we have $h^\prime = \exp{(h)} - 1$. By continuity at zero, we have that $h^\prime \to 0$ as $h \to 0$. Thus, \begin{align*} \lim_{h \to 0} \frac{\exp(h) - 1}{h} &= \lim_{h \to 0} \left( \frac{\log{( 1 + h^\prime)} - \log{1}}{h^\prime} \right)^{-1} = \left( \lim_{h^\prime \to 0} \frac{\log{( 1 + h^\prime)} - \log{1}}{h^\prime} \right)^{-1} = \left( \left[ \frac{d}{dx} \log{x} \right]_{x = 1} \right)^{-1} = 1 \end{align*}
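The claim can be checked numerically (a Python sketch, not part of the proof): the difference quotient $(\exp(h) - 1)/h$ approaches 1 as $h \to 0$.

```python
# The difference quotient of exp at 0 tends to the derivative there, which is 1.
import math

for h in (1e-1, 1e-3, 1e-5):
    q = (math.exp(h) - 1) / h
    print(h, q)   # q approaches 1 as h shrinks
```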
Proposition: The logarithm is a group isomorphism from the multiplicative group $\mathbb{R}_{> 0}$ to the additive group $\mathbb{R}$. That is, for all $x, y \in \mathbb{R}_{> 0}$, $\log{xy} = \log{x} + \log{y}$.
Proof. Fix $y \in \mathbb{R}_{> 0}$ constant. Then we have \begin{align*} \frac{d}{dx} \log{xy} = \frac{y}{xy} = \frac{1}{x} = \frac{d}{dx} \log{x} \end{align*} Thus, there is $c \in \mathbb{R}$ such that $\log{xy} - \log{x} = c$ for all $x \in \mathbb{R}_{> 0}$. Taking $x = 1$ gives $c = \log{y} - \log{1} = \log{y}$, so $\log{xy} - \log{x} = \log{y}$ which proves the logarithm is a group homomorphism. It is an isomorphism because it is a bijection.
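The homomorphism property is easy to observe numerically (a Python sketch with arbitrarily chosen test values, not part of the proof):

```python
# log turns products into sums: log(xy) = log(x) + log(y).
import math

x, y = 3.7, 12.25   # arbitrary positive test values
print(math.log(x * y), math.log(x) + math.log(y))   # the two agree
```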
Corollary: The exponential map is a group isomorphism from the additive group $\mathbb{R}$ to the mulitplicative group $\mathbb{R}_{> 0}$.
Proof. This follows from the fact that the inverse of a group isomorphism is again a group isomorphism.
We can now prove the following fundamental property of the exponential function:
Theorem: The exponential function is its own derivative. That is, \begin{align*} \frac{d}{dx} \exp{(x)} = \exp{(x)} \end{align*} Hence, it is a smooth function. Moreover, it admits a Taylor series expansion so that it is analytic on its entire domain as well.
Proof. We use the fact that the derivative of the exponential at 0 is 1, and the previous algebraic property: \begin{align*} \frac{d}{dx} \exp{(x)} = \lim_{h \to 0} \frac{\exp{(x + h)} - \exp{(x)}}{h} = \lim_{h \to 0} \frac{\exp{(x)} \left( \exp{(h)} - 1 \right)}{h} = \exp{(x)} \lim_{h \to 0} \frac{\exp{(h)} - 1}{h} = \exp{(x)} \end{align*}
From the above we see that all the derivatives of the exponential at $x = 0$ are 1. By Taylor's theorem with the Lagrange form of the remainder, for any given $x \in \mathbb{R}$ there exists $\xi_n \in \mathbb{R}$ between $0$ and $x$ such that \begin{align*} \exp{(x)} &= \sum_{k = 0}^n \frac{x^k}{k!} + \frac{\exp{(\xi_n)} x^{n + 1}}{(n + 1)!} \end{align*} We want to show that this remainder vanishes as $n \to \infty$. Indeed, because $\xi_n \le |\xi_n| \le |x|$, it follows that \begin{align*} \left| \frac{\exp{(\xi_n)} x^{n + 1}}{(n + 1)!} \right| \le \frac{\exp{(|x|)} |x|^{n + 1}}{(n + 1)!} \longrightarrow 0 &\text{ as $n \to \infty$} \end{align*} where the last limit is a standard computation. Since the remainder vanishes in the limit, the exponential admits the Taylor expansion \begin{align*} \exp{(x)} = \sum_{n = 0}^{\infty} \frac{x^n}{n!} \end{align*} valid for all $x \in \mathbb{R}$.
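The convergence of this series can be illustrated numerically (a Python sketch; the helper name `exp_taylor` is our own):

```python
# Partial sums of sum_k x^k / k! converge rapidly to exp(x).
import math

def exp_taylor(x, n_terms=30):
    """Partial sum of the first n_terms terms of the exponential series."""
    term, total = 1.0, 0.0
    for k in range(n_terms):
        total += term          # add x^k / k!
        term *= x / (k + 1)    # next term x^(k+1) / (k+1)!
    return total

print(exp_taylor(1.0))   # close to e ~ 2.71828
```

Note that the factorial in the denominator eventually dominates any power of $x$, which is exactly why the remainder estimate above tends to 0.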
We now also define, for $s \in \mathbb{R}$ and $x \in \mathbb{R}_{> 0}$, the power $x^s = \exp{\left( s \log{x} \right)}$. It differentiates to \begin{align*} \frac{d}{dx} x^s &= \frac{d}{dx} \exp{\left( s \log{x} \right)} = \frac{s}{x} \exp{\left( s \log{x} \right)} = s \exp{ \left( - \log{x} \right)}\exp{\left( s \log{x} \right)} = s \exp{ \left( (s - 1) \log{x} \right)} = s x^{s - 1} \end{align*} We define Euler's number as the real number $e = \exp{(1)}$. Then the exponential can be written as \begin{align*} \exp{(x)} &= \exp{\left( x \log{ \left( \exp{(1)} \right)} \right)} = \exp{\left( x \log{ e } \right)} = e^x \end{align*} Furthermore, Euler's number has the series expansion \begin{align*} e &= \exp{(1)} = \sum_{n = 0}^{\infty} \frac{1}{n!} \end{align*} There is one more important representation of $e$:
Theorem: We have the following limit \begin{align*} e^x &= \lim_{n \to \infty} \left( 1 + \frac{x}{n} \right)^n \end{align*}
Proof. The terms of the sequence are strictly positive for all sufficiently large $n$, so it is valid to take logarithms. For $x \ne 0$ (the case $x = 0$ is trivial), this yields \begin{align*} \lim_{n \to \infty} \left( 1 + \frac{x}{n} \right)^n = \lim_{n \to \infty} \exp{ \left( \log{\left( 1 + \frac{x}{n} \right)^n} \right) } = \lim_{n \to \infty} \exp{ \left( n \log{ \left( 1 + \frac{x}{n} \right)} \right) } = \exp{ \left( x \lim_{n \to \infty} \frac{\log{\left( 1 + \frac{x}{n} \right)} - \log{1}}{x/n} \right) } \end{align*} where moving the limit inside is justified by continuity of the exponential. The expression inside the limit is a difference quotient for the logarithm at 1, and so tends to the derivative there, which is 1. Hence \begin{align*} \lim_{n \to \infty} \left( 1 + \frac{x}{n} \right)^n = \exp{\left( x \right)} \end{align*} which is what we claimed.
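The slow but steady convergence of this limit is visible numerically (a Python sketch, not part of the proof):

```python
# (1 + x/n)^n approaches exp(x) as n grows; the error decays like 1/n.
import math

x = 2.0
for n in (10, 1000, 100000):
    print(n, (1 + x / n) ** n)
print(math.exp(x))   # the target value
```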
We have the following fact about Euler's number as well:
Theorem: $e$ is irrational.
Proof. Suppose for contradiction that $e \in \mathbb{Q}$, so that $e = p/q$ for some $p, q \in \mathbb{Z}$. Since clearly $e > 0$, we can assume $p, q \ge 1$. It is clear from the Taylor series of the exponential function that $e > 2$. But see also that \begin{align*} e - 2 &= \sum_{n = 2}^\infty \frac{1}{n!} < \sum_{n = 2}^\infty \frac{1}{2^{n - 1}} = \sum_{n = 1}^\infty \frac{1}{2^n} = \frac{1/2}{1 - 1/2} = 1 \end{align*} so that $e < 3$. Hence, $e \notin \mathbb{Z}$, which implies $q \ge 2$. Multiplying the series for $e$ by $q!$, we proceed as \begin{align*} \frac{p}{q} &= \sum_{n = 0}^\infty \frac{1}{n!} \\ p \left( q - 1 \right)! &= q! \sum_{n = 0}^q \frac{1}{n!} + q! \sum_{n = q + 1}^\infty \frac{1}{n!} \\ p \left(q - 1 \right)! - q! \sum_{n = 0}^q \frac{1}{n!} &= \frac{1}{(q + 1)} + \frac{1}{(q + 1)(q + 2)} + \frac{1}{(q + 1)(q + 2)(q + 3)} + \cdots \end{align*} The left hand side is an integer, since $q!/n!$ is an integer for every $n \le q$. However, the right hand side is positive but strictly less than 1: \begin{align*} \frac{1}{(q + 1)} + \frac{1}{(q + 1)(q + 2)} + \frac{1}{(q + 1)(q + 2)(q + 3)} + \cdots < \frac{1}{q} + \frac{1}{q^2} + \frac{1}{q^3} + \cdots = \frac{1/q}{1 - 1/q} = \frac{1}{q - 1} \le \frac{1}{2 - 1} = 1 \end{align*} so the right hand side cannot be an integer. This is a contradiction, so no integers $p, q$ exist with $e = p/q$. That is, $e$ is irrational.
Since the Taylor series for the real exponential converges on all of $\mathbb{R}$, it has an infinite radius of convergence as a power series on the complex plane. Thus, we define the complex exponential, using the same notation, by the infinite series \begin{align*} \exp : \mathbb{C} \longrightarrow \mathbb{C} , z \longmapsto \sum_{n = 0}^\infty \frac{z^n}{n!} \end{align*} By standard properties of power series on the complex plane, the complex exponential is an entire function on $\mathbb{C}$. Further, if we fix $w \in \mathbb{R}$ then we have $\exp{(w + z)} = \exp{(w)} \exp{(z)}$ for all $z \in \mathbb{R}$. Since both sides of the equality are entire functions of $z$, the equality must hold for all $z \in \mathbb{C}$ by the identity theorem. By the same reasoning, we can allow $w \in \mathbb{C}$ as well. So the complex exponential is also a group homomorphism from the additive group $\mathbb{C}$ to the multiplicative group $\mathbb{C}^\times$. In particular, the exponential function never vanishes.
The sine and cosine functions $\cos, \sin : \mathbb{R} \longrightarrow \mathbb{R}$ are defined to be \begin{align*} \cos{x} &= \mathfrak{Re} \left( e^{ix} \right) \\ \sin{x} &= \mathfrak{Im} \left( e^{ix} \right) \end{align*} They have the following expressions as well:
Proposition: The sine and cosine functions can be expressed as \begin{align*} \cos{x} &= \frac{e^{ix} + e^{-ix}}{2} \\ \sin{x} &= \frac{e^{ix} - e^{-ix}}{2i} \end{align*}
Proof. See that we have \begin{align*} \left( \frac{e^{ix} + e^{-ix}}{2} \right) + i \left( \frac{e^{ix} - e^{-ix}}{2i} \right) = e^{ix} \end{align*} Since the real and imaginary parts of a complex number are uniquely determined, the result follows if we can show that the two terms above are real valued. We have \begin{align*} \frac{e^{ix} + e^{-ix}}{2} &= \frac{1}{2} \sum_{n = 0}^\infty \frac{(ix)^n}{n!} + \frac{1}{2} \sum_{n = 0}^\infty \frac{(-ix)^n}{n!} \\ &= \frac{1}{2} \sum_{n = 0}^\infty \left( (-1)^n \frac{x^{2n}}{(2n)!} + (-1)^n \frac{i x^{2n + 1}}{(2n + 1)!} \right) + \frac{1}{2} \sum_{n = 0}^\infty \left( (-1)^n \frac{x^{2n}}{(2n)!} + (-1)^{n + 1} \frac{i x^{2n + 1}}{(2n + 1)!} \right) \\ &= \sum_{n = 0}^\infty (-1)^n \frac{x^{2n}}{(2n)!} \\ \frac{e^{ix} - e^{-ix}}{2i} &= \frac{1}{2i} \sum_{n = 0}^\infty \frac{(ix)^n}{n!} - \frac{1}{2i} \sum_{n = 0}^\infty \frac{(- ix)^n}{n!} \\ &= \frac{1}{2i} \sum_{n = 0}^\infty \left( (-1)^n \frac{x^{2n}}{(2n)!} + (-1)^n \frac{i x^{2n + 1}}{(2n + 1)!} \right) - \frac{1}{2i} \sum_{n = 0}^\infty \left( (-1)^n \frac{x^{2n}}{(2n)!} + (-1)^{n + 1} \frac{i x^{2n + 1}}{(2n + 1)!} \right) \\ &= \sum_{n = 0}^\infty (-1)^n \frac{x^{2n + 1}}{(2n + 1)!} \end{align*} which are both real valued when $x \in \mathbb{R}$.
From above we also obtain the Taylor series expansions of sine and cosine. We have overall \begin{align*} \cos{x} &= \mathfrak{Re} \left( e^{ix} \right) = \frac{e^{ix} + e^{-ix}}{2} = \sum_{n = 0}^\infty (-1)^n \frac{x^{2n}}{(2n)!} \\ \sin{x} &= \mathfrak{Im} \left( e^{ix} \right) = \frac{e^{ix} - e^{-ix}}{2i} = \sum_{n = 0}^\infty (-1)^n \frac{x^{2n + 1}}{(2n + 1)!} \end{align*} There is also the following so-called Pythagorean identity:
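These identities can be confirmed numerically (a Python sketch; the helper names `cos_taylor` and `sin_taylor` are our own): the real and imaginary parts of $e^{ix}$ match partial sums of the two series.

```python
# Re(e^{ix}) and Im(e^{ix}) agree with the Taylor series of cos and sin.
import cmath
import math

def cos_taylor(x, n_terms=20):
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
               for n in range(n_terms))

def sin_taylor(x, n_terms=20):
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(n_terms))

z = cmath.exp(1j * 1.3)            # e^{ix} at an arbitrary x = 1.3
print(z.real, cos_taylor(1.3))     # these agree
print(z.imag, sin_taylor(1.3))     # and so do these
```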
Proposition: For all $x \in \mathbb{R}$, $\cos^2{x} + \sin^2{x} = 1$. Consequently, $|e^{ix}| = 1$.
Proof. A simple computation yields \begin{align*} \cos^2{x} + \sin^2{x} &= \left( \frac{e^{ix} + e^{-ix}}{2} \right)^2 + \left( \frac{e^{ix} - e^{-ix}}{2i} \right)^2 = \frac{e^{2ix} + 2 + e^{-2ix}}{4} - \frac{e^{2ix} - 2 + e^{-2ix}}{4} = 1 \end{align*}
Proposition: There exists $x > 0$ such that $\cos{x} = 0$.
Proof. By Taylor's theorem with the Lagrange form of the remainder, applied to the Taylor expansion of cosine up to the eighth order term, \begin{align*} \cos{2} &= \frac{1}{0!} - \frac{2^2}{2!} + \frac{2^4}{4!} - \frac{2^6}{6!} + \frac{\left( \cos{\xi} \right)2^8}{8!} \le 1 - 2 + \frac{16}{24} - \frac{64}{720} + \frac{256}{40320} = - \frac{1}{3} - \frac{64}{720} + \frac{256}{40320} < 0 \end{align*} for some $\xi \in (0, 2)$. Above we used the fact that $|\cos{x}| \le 1$ for all $x$, which follows from the Pythagorean identity. Since $\cos{0} = 1$ and $\cos{2} < 0$, it follows from the intermediate value theorem that $\cos{y} = 0$ for some $y \in (0, 2)$.
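Numerically (a Python sketch using the standard library, not part of the proof), bisection on $(0, 2)$ locates this root, which turns out to be $\approx 1.5707963$:

```python
# cos changes sign on (0, 2); bisection pins down the root.
import math

lo, hi = 0.0, 2.0
assert math.cos(lo) > 0 and math.cos(hi) < 0   # sign change, as proved above
for _ in range(60):
    mid = (lo + hi) / 2
    if math.cos(mid) > 0:
        lo = mid
    else:
        hi = mid
print(lo)   # ~ 1.5707963...
```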
The number $\pi \in \mathbb{R}$ will be defined as \begin{align*} \pi = 2 \inf \left\{ x \in \mathbb{R}_{> 0} \mid \cos{x} = 0 \right\} \end{align*} The set is nonempty by the previous proposition, and by continuity of the cosine function the infimum is itself a root, so $\cos{(\pi/2)} = 0$. Since $\cos{0} = 1$ and the cosine function has no root in $(0, \pi/2)$, the intermediate value theorem gives $\cos{x} > 0$ for all $x \in (0, \pi/2)$. Because the cosine function is even, we have $\cos{(x)} > 0$ for all $x \in (-\pi/2, \pi/2)$. We can also evaluate $\sin{x}$ at $x = \pi/2$. From the Pythagorean relation, we have \begin{align*} \sin^2{\left( \frac{\pi}{2} \right)} = 1 - \cos^2{\left( \frac{\pi}{2} \right)} = 1 \end{align*} so $\sin{(\pi/2)} = \pm 1$. If we had $\sin{(\pi/2)} = -1$, then because also $\sin{0} = 0$, the mean value theorem would give a point $\xi \in (0, \pi/2)$ where \begin{align*} \cos{\xi} &= \left[ \frac{d}{dx} \sin{x} \right]_{x = \xi} = \frac{\sin{(\pi/2)} - \sin{0}}{\pi/2 - 0} = - \frac{2}{\pi} < 0 \end{align*} (that the derivative of sine is cosine follows from differentiating its Taylor series term by term). But the cosine function has already been argued to be positive on this interval, so this is impossible. Hence, $\sin{(\pi/2)} = 1$. This also proves \begin{align*} e^{i \pi/2} &= \cos{ \left( \frac{\pi}{2} \right)} + i \sin{ \left( \frac{\pi}{2} \right)} = i \end{align*} Now to evaluate $\sin{\pi}$. This is \begin{align*} \sin{\pi} &= \frac{e^{i\pi} - e^{-i\pi}}{2i} = 2 \frac{\left( e^{i\pi/2} - e^{-i\pi/2} \right)}{2i} \frac{\left( e^{i \pi/2} + e^{-i \pi/2} \right)}{2} = 2 \cos{ \left( \frac{\pi}{2} \right)} \sin{ \left( \frac{\pi}{2} \right)} = 2 \cdot 0 \cdot 1 = 0 \end{align*} See also that \begin{align*} \sin{ \left( \frac{\pi}{2} - x \right)} &= \frac{e^{i \pi/2 - ix} - e^{-i \pi / 2 + ix}}{2i} = e^{-i \pi/2} \frac{e^{i \pi/2 - ix} - e^{-i \pi / 2 + ix}}{2} = \frac{e^{- ix} - e^{-i \pi + ix}}{2} = \frac{(- e^{-i\pi}) e^{ix} + e^{-ix}}{2} \end{align*} Taking $x = 0$, we have \begin{align*} 1 &= \sin{ \left( \frac{\pi}{2} \right)} = \frac{(- e^{-i\pi}) e^{0} + e^{0}}{2} = \frac{1 - e^{-i \pi}}{2} \end{align*} This proves that $e^{-i\pi} = -1$.
Consequently, $e^{i \pi} = -1$ as well, and $\cos{\pi} = \mathfrak{Re}(e^{i \pi}) = -1$. Then we have also \begin{align*} \sin{ \left( \frac{\pi}{2} - x \right)} = \frac{(- e^{-i\pi}) e^{ix} + e^{-ix}}{2} = \frac{e^{ix} + e^{-ix}}{2} = \cos{x} \end{align*} Replacing $x$ with $\pi/2 - x$ yields $\cos{(\pi/2 - x)} = \sin{x}$. Since $\cos{x}$ is positive on $(-\pi/2, \pi/2)$, it follows from this that $\sin{x}$ is positive on $(0, \pi)$. Now see that \begin{align*} \sin{(x + \pi)} = \frac{e^{i \pi + ix} - e^{-i \pi - ix}}{2i} = \frac{- e^{ix} + e^{-ix}}{2i} = - \sin{x} \end{align*} This means that $\sin{x} \ne 0$ for all $x \in (\pi, 2 \pi)$, for otherwise the sine function would vanish somewhere in $(0, \pi)$, where it is strictly positive. Applying the identity $\sin{(x + \pi)} = - \sin{x}$ repeatedly, and noting that the sine function is odd (as is clear from its Taylor series), we find that the only zeros of the sine function are \begin{align*} \left\{ x \in \mathbb{R} \mid \sin{x} = 0 \right\} = \left\{ n \pi \mid n \in \mathbb{Z} \right\} \end{align*} Any period of the sine function must carry zeros to zeros, and so must be an integer multiple of $\pi$. The period is not $\pi$ because $\sin{(x + \pi)} = - \sin{(x)}$. But see that \begin{align*} \sin{ \left( x + 2 \pi \right) } = \sin{ \left( (x + \pi) + \pi \right)} = - \sin{(x + \pi)} = \sin{x} \end{align*} So $\sin{x}$ is $2 \pi$-periodic, and this is the minimal period. Likewise, we have from the relation $\cos{x} = \sin{(\pi/2 - x)}$ that $\cos{x}$ is also $2 \pi$-periodic.
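As a final numerical check (Python, standard library only, not part of the formal development), Euler's identity $e^{i\pi} = -1$ and the $2\pi$-periodicity of sine hold to floating-point accuracy:

```python
# e^{i*pi} = -1, and sin is 2*pi-periodic.
import cmath
import math

print(cmath.exp(1j * math.pi))                   # ~ -1 (tiny imaginary error)
x = 0.7                                          # arbitrary test point
print(math.sin(x + 2 * math.pi), math.sin(x))    # equal up to rounding
```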