The Gamma Function

Muhammad Haris Rao


For $n \in \mathbb{Z}_{> 0}$, define the functions $I_n, J_n : \{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \} \longrightarrow \mathbb{C}$ by \begin{align*} I_n(s) &= \int_{1/n}^1 t^{s - 1} e^{-t} \, dt \\ J_n(s) &= \int_1^n t^{s - 1} e^{-t} \, dt \end{align*}

Theorem: The sequences $\{ I_n \}_{n = 1}^\infty, \{ J_n \}_{n = 1}^\infty$ of functions converge uniformly on every compact subset of $\{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$.

Proof. Let $K \subseteq \{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$ be compact. Since $\mathfrak{Re}$ is continuous and $K$ is compact, we may set $$ \sigma_0 = \min_{s \in K} \mathfrak{Re}(s) \qquad \sigma_0^* = \max_{s \in K} \mathfrak{Re}(s) $$ and both $\sigma_0, \sigma_0^* > 0$.

First, we show that $\{ I_n \}_{n = 1}^\infty$ converges uniformly on $K$. For this, we show that $\{ I_n \}_{n = 1}^\infty$ is a uniformly Cauchy sequence of functions on $K$.

So let $\varepsilon > 0$. Set $N = \lceil (2 / (\sigma_0 \varepsilon))^{1/\sigma_0} \rceil$, so that $\sigma_0 N^{\sigma_0} \ge 2 / \varepsilon$. Then if $n, m \in \mathbb{Z}_{\ge N}$ with $n \ge m$ and $s \in K$, \begin{align*} \left| I_n(s) - I_m(s) \right| &= \left| \int_{1/n}^1 t^{s - 1} e^{-t} \, dt - \int_{1/m}^1 t^{s - 1} e^{-t} \, dt \right| \\ &= \left| \int_{1/n}^{1/m} t^{s - 1} e^{-t} \, dt \right| \\ &\le \int_{1/n}^{1/m} t^{\mathfrak{Re}(s) - 1} e^{-t} \, dt \\ &\le \int_{1/n}^{1/m} t^{\sigma_0 - 1} \, dt \\ &= \frac{1}{\sigma_0} \left( \frac{1}{m^{\sigma_0}} - \frac{1}{n^{\sigma_0}} \right) \\ &\le \frac{1}{\sigma_0 N^{\sigma_0}} \\ &\le \frac{\varepsilon}{2} \end{align*} Thus, we can conclude that if $n, m \in \mathbb{Z}_{\ge N}$ then $$ \sup_{s \in K} \left| I_n(s) - I_m(s) \right| < \varepsilon $$ so that $\{ I_n \}_{n = 1}^\infty$ is uniformly Cauchy as a sequence of functions on $K$, and hence converges uniformly to some function on $K$. Thus, the sequence $\{ I_n \}_{n = 1}^\infty$ converges uniformly on every compact subset of $\{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$.

Now to show $\{ J_n \}_{n = 1}^\infty$ converges uniformly to a limit as a sequence of functions on $K$. Again, we will do this by showing it is uniformly Cauchy.

Let $\varepsilon > 0$. Set $N = \lceil 2M!/\varepsilon \rceil$ where $M = \lceil \sigma_0^* + 1 \rceil$. Then if $n, m \in \mathbb{Z}_{\ge N}$ with $n \ge m$ and $s \in K$, \begin{align*} \left| J_n (s) - J_m(s) \right| &= \left| \int_1^n t^{s - 1} e^{-t} \, dt - \int_1^m t^{s - 1} e^{-t} \, dt \right| \\ &= \left| \int_m^n t^{s - 1} e^{-t} \, dt \right| \\ &\le \int_m^n t^{\mathfrak{Re}(s) - 1} e^{-t} \, dt \\ &\le \int_m^n t^{\sigma_0^* - 1} e^{-t} \, dt \end{align*} Recall that $M = \lceil \sigma_0^* + 1 \rceil$. When $t \ge 0$ we have the bound $$ e^t = \sum_{k = 0}^\infty \frac{t^k}{k!} \ge \frac{t^M}{M!} $$ So it follows that $e^{-t} \le M! t^{-M} \le M! t^{-\sigma_0^* - 1}$ when $t \ge 1$. Hence, \begin{align*} \left| J_n (s) - J_m(s) \right| &\le \int_m^n t^{\sigma_0^* - 1} e^{-t} \, dt \\ &\le M! \int_m^n t^{\sigma_0^* - 1} t^{-\sigma_0^* - 1} \, dt \\ &\le M! \int_N^\infty \frac{1}{t^2} \, dt \\ &= \frac{M!}{N} \\ &\le \frac{\varepsilon}{2} \end{align*} Hence, we have that if $n, m \in \mathbb{Z}_{\ge N}$ then $$ \sup_{s \in K} \left| J_n(s) - J_m(s) \right| < \varepsilon $$ which means that $\{ J_n \}_{n = 1}^\infty$ is a uniformly Cauchy sequence of functions on $K$, so then converges uniformly to a limit. Thus, $\{ J_n \}_{n = 1}^\infty$ converges uniformly on every compact subset of $\{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$.

This concludes the proof.$\blacksquare$
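The splitting into $I_n$ and $J_n$ can be sanity-checked numerically. The sketch below uses only the Python standard library; the `simpson` helper and the choices $s = 2.5$, $n = 200$ are illustrative assumptions, not part of the text. It compares $I_n(s) + J_n(s)$ for a moderate $n$ against Python's built-in `math.gamma`.

```python
import math

def simpson(f, a, b, m=4000):
    # Composite Simpson rule with m (even) subintervals.
    h = (b - a) / m
    total = f(a) + f(b)
    for i in range(1, m):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

def I_n(s, n):
    # I_n(s) = integral of t^(s-1) e^(-t) over [1/n, 1]
    return simpson(lambda t: t ** (s - 1) * math.exp(-t), 1 / n, 1)

def J_n(s, n):
    # J_n(s) = integral of t^(s-1) e^(-t) over [1, n]
    return simpson(lambda t: t ** (s - 1) * math.exp(-t), 1, n)

s, n = 2.5, 200
approx = I_n(s, n) + J_n(s, n)
print(approx, math.gamma(s))
```

For this $s$ the two printed values agree to several decimal places, consistent with the truncated pieces $\int_0^{1/n}$ and $\int_n^\infty$ being tiny.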


Recall the following theorem about integrals of holomorphic functions:

Theorem: Let $\Omega \subseteq \mathbb{C}$ be open, $I \subseteq \mathbb{R}$ a compact interval, and $f : I \times \Omega \longrightarrow \mathbb{C}$ continuous. If $g_t : \Omega \longrightarrow \mathbb{C}$ defined by $g_t(s) = f(t, s)$ is holomorphic on $\Omega$ for every $t \in I$, then \begin{align*} F : \Omega &\longrightarrow \mathbb{C} \\ s &\longmapsto \int_I f(t, s) \, dt \end{align*} is holomorphic on $\Omega$. Furthermore, if $\frac{\partial}{\partial s} f : I \times \Omega \longrightarrow \mathbb{C}$ is continuous, then $$ \frac{d}{ds} F(s) = \int_I \frac{\partial}{\partial s} f(t, s) \, dt $$

We have the following consequence:

Corollary: If $n \in \mathbb{Z}_{> 0}$, the functions $I_n, J_n$ are holomorphic on $\{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$ with derivatives given by \begin{align*} \frac{d^k}{ds^k} I_n (s) &= \int_{1/n}^1 t^{s - 1} e^{-t} \log^k{t} \, dt \\ \frac{d^k}{ds^k} J_n (s) &= \int_1^n t^{s - 1} e^{-t} \log^k{t} \, dt \end{align*} for all $k \in \mathbb{Z}_{> 0}$.


We would now like to take the limit of the sequences $\{ I_n \}_{n = 1}^\infty$, $\{ J_n \}_{n = 1}^\infty$ and see what happens. For this, we apply the following theorem:

Theorem: Let $\Omega \subseteq \mathbb{C}$ be open, and $\{ f_n \}_{n = 1}^\infty$ a sequence of holomorphic functions on $\Omega$. If the sequence $\{ f_n \}_{n = 1}^\infty$ converges uniformly on every compact subset of $\Omega$ to some $f : \Omega \longrightarrow \mathbb{C}$, then $f$ is holomorphic on $\Omega$, and for all $k \in \mathbb{Z}_{> 0}$, the sequence $\{ f_n^{(k)} \}_{n = 1}^\infty$ converges uniformly to $f^{(k)}$ on all compact subsets of $\Omega$.

We have already shown that the sequences of functions $\{ I_n \}_{n = 1}^\infty$, $\{ J_n \}_{n = 1}^\infty$ converge uniformly on all compact subsets to a limit, so we can apply the above to obtain

Corollary: The functions $I, J : \{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \} \longrightarrow \mathbb{C}$ defined by \begin{align*} I(s) &= \lim_{n \to \infty} I_n(s) = \int_0^1 t^{s - 1} e^{-t} \, dt \\ J(s) &= \lim_{n \to \infty} J_n(s) = \int_1^\infty t^{s - 1} e^{-t} \, dt \end{align*} are holomorphic on $\{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \}$ with derivatives given by \begin{align*} I^{(k)}(s) &= \lim_{n \to \infty} I_n^{(k)} (s) = \int_0^1 t^{s - 1} e^{-t} \log^k{t} \, dt \\ J^{(k)}(s) &= \lim_{n \to \infty} J_n^{(k)} (s) = \int_1^\infty t^{s - 1} e^{-t} \log^k{t} \, dt \end{align*} for all $k \in \mathbb{Z}_{> 0}$.


Finally, we have done all the preparation to define the Gamma function. It is $\Gamma = I + J$ with $I, J$ as above. That is, $\Gamma : \{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > 0 \} \longrightarrow \mathbb{C}$ defined by $$ \Gamma(s) = I(s) + J(s) = \int_0^1 t^{s - 1} e^{-t} \, dt + \int_1^\infty t^{s - 1} e^{-t} \, dt = \int_0^\infty t^{s - 1} e^{-t} \, dt $$ As the sum of two holomorphic functions, $\Gamma$ is also holomorphic with its derivatives given by $$ \Gamma^{(n)}(s) = I^{(n)}(s) + J^{(n)} (s) = \int_0^1 t^{s - 1} e^{-t} \log^n{t} \, dt + \int_1^\infty t^{s - 1} e^{-t} \log^n{t} \, dt = \int_0^\infty t^{s - 1} e^{-t} \log^n{t} \, dt $$ for all $n \in \mathbb{Z}_{> 0}$.
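The derivative formula can likewise be probed numerically. The following minimal sketch (stdlib only; the truncation of the integral to $[10^{-8}, 60]$, the Simpson helper, and the point $s = 3$ are illustrative assumptions) compares $\int_0^\infty t^{s-1} e^{-t} \log{t} \, dt$ with a central finite-difference estimate of $\Gamma'(s)$.

```python
import math

def simpson(f, a, b, m=6000):
    # Composite Simpson rule with m (even) subintervals.
    h = (b - a) / m
    total = f(a) + f(b)
    for i in range(1, m):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

def gamma_deriv(s):
    # Gamma'(s) = integral of t^(s-1) e^(-t) log(t) dt, truncated to [1e-8, 60]
    return simpson(lambda t: t ** (s - 1) * math.exp(-t) * math.log(t), 1e-8, 60)

s, h = 3.0, 1e-5
finite_diff = (math.gamma(s + h) - math.gamma(s - h)) / (2 * h)
print(gamma_deriv(s), finite_diff)
```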

Analytic Continuation


Now we will analytically continue $\Gamma$ further to the left. We require the following fact:

Proposition: It holds that $\Gamma(s + 1) = s \Gamma(s)$ for all $s \in \mathbb{C}$ with $\mathfrak{Re}(s) > 0$.

Proof. Integration by parts yields \begin{align*} \Gamma\left( s \right) &= \int_0^\infty t^{s - 1} e^{-t} \, dt = \left[ \frac{t^s e^{-t}}{s} \right]_0^\infty + \frac{1}{s} \int_0^\infty t^s e^{-t} \, dt = \frac{1}{s} \int_0^\infty t^s e^{-t} \, dt = \frac{\Gamma(s + 1)}{s} \end{align*} where the boundary term vanishes because $\mathfrak{Re}(s) > 0$. Rearranging gives $\Gamma(s + 1) = s \Gamma(s)$. $\blacksquare$
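The functional equation is easy to spot-check numerically; the sketch below (the sample points are arbitrary choices) uses Python's built-in `math.gamma`.

```python
import math

# Spot-check the functional equation Gamma(s + 1) = s * Gamma(s)
# at a few positive real points.
checks = {s: (math.gamma(s + 1), s * math.gamma(s)) for s in [0.3, 1.0, 2.5, 7.2]}
for s, (lhs, rhs) in checks.items():
    print(f"s = {s}: Gamma(s+1) = {lhs:.6f}, s * Gamma(s) = {rhs:.6f}")

# Iterating the relation gives Gamma(m + 1) = m! for positive integers m.
print(math.gamma(6))  # 120.0 = 5!
```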

Now set $\Gamma_0 = \Gamma$, and write $D_n = \{ s \in \mathbb{C} \mid \mathfrak{Re}(s) > -n, s \notin \mathbb{Z}_{\le 0} \}$. Given $\Gamma_n : D_n \longrightarrow \mathbb{C}$, define $\Gamma_{n + 1} : D_{n + 1} \longrightarrow \mathbb{C}$ by \begin{align*} \Gamma_{n + 1} \left( s \right) &= \frac{ \Gamma_n \left( s + 1 \right) }{s} \end{align*} It can be shown that all the $\Gamma_n$ agree on their common domains:

Proposition: The functional equation $\Gamma_n(s + 1) = s \Gamma_n(s)$ holds for all $n \ge 0$, $s \in D_{n + 1}$. Furthermore, $\Gamma_{n + 1}(s) = \Gamma_n(s)$ for all $s \in D_n$.

Proof. We proceed by induction. The base case $\Gamma_0(s + 1) = s\Gamma_0(s)$ is already given. Supposing that the result holds for some $n \ge 0$, we have for the $n + 1$ case \begin{align*} \Gamma_{n + 1} \left( s + 1 \right) &= \frac{ \Gamma_n \left( s + 2 \right) }{s + 1} = \frac{ (s + 1) \Gamma_n \left( s + 1 \right) }{s + 1} = \Gamma_{n} (s + 1) = s \Gamma_{n + 1}(s) \end{align*} where the first and last equalities are from the definition of $\Gamma_{n + 1}$ in terms of $\Gamma_n$. The application of the induction step is valid because $s \in D_{n + 1}$ implies that $s + 1 \in D_n$. The above chain of equalities also contains $\Gamma_{n + 1} \left( s + 1 \right) = \Gamma_{n} (s + 1)$ for all $s \in D_n$, which gives the second part of the desired result. $\blacksquare$

The definition $\Gamma_{n + 1} \left( s \right) = \Gamma_n \left( s + 1 \right) / s$ also proves that $\Gamma_{n + 1}$ is holomorphic on $D_{n + 1}$ provided that $\Gamma_n$ is on $D_n$. A simple induction proves then that all the $\Gamma_n$ are holomorphic on their respective domains. Hence, each $\Gamma_{n + 1}$ is an analytic continuation of the last, and we have a whole sequence of analytic continuations of $\Gamma$ further and further to the left, avoiding only the points $\mathbb{Z}_{\le 0}$. Thus, we have a unique extension \begin{align*} \Gamma : \mathbb{C} - \{ 0, -1, -2, \cdots \} \to \mathbb{C} \end{align*} which is holomorphic on its domain. We will now show that $\Gamma$ cannot be analytically continued any further.

Theorem: $\Gamma$ has simple poles at each $\{ 0, -1, -2, \cdots \}$ with residue given by \begin{align*} \underset{s = -n}{\text{Residue}} \left\{ \Gamma \left( s \right) \right\} = \frac{(-1)^n}{n!} \end{align*}

Proof. We have for all $n \ge 0$ \begin{align*} \Gamma(s) &= \frac{\Gamma(s + 1)}{s} = \frac{\Gamma(s + 2)}{s(s + 1)} = \cdots = \frac{\Gamma(s + (n + 1))}{s(s + 1)(s + 2) \cdots (s + n)} \end{align*} Evidently, the numerator of the rightmost expression is holomorphic near $-n$, and the denominator has a simple zero there. Thus, $\Gamma$ has a simple pole at $s = -n$ with residue \begin{align*} \lim_{s \to -n} \left(s + n \right) \Gamma\left( s \right) = \lim_{s \to -n} \frac{\Gamma(s + (n + 1))}{s(s + 1)(s + 2) \cdots (s + (n - 1))} = \frac{\Gamma(1)}{(-1)(-2)(-3)\cdots(-(n - 1)) (-n)} = \frac{(-1)^n}{n!} \end{align*} The fact that $\Gamma(1) = 1$ is easily calculated from the initial integral definition.$\blacksquare$
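The residues can be probed numerically by evaluating $(s + n)\Gamma(s)$ slightly to the right of each pole; the offset `eps` below is an arbitrary illustrative choice, and `math.gamma` accepts negative non-integer arguments.

```python
import math

# Near the pole at s = -n, (s + n) * Gamma(s) should approach the
# residue (-1)^n / n!.  Probe at s = -n + eps for a small eps.
eps = 1e-7
probes = [(eps * math.gamma(-n + eps), (-1) ** n / math.factorial(n)) for n in range(5)]
for n, (probe, expected) in enumerate(probes):
    print(n, probe, expected)
```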

A Characterisation


Recall that a function $f : \mathbb{R}_{> 0} \to \mathbb{R}$ is convex if \begin{align*} f\left( (1 - t) x + t y \right) \le (1 - t) f(x) + t f(y) \end{align*} for any $0 < x < y$ and $t \in [0, 1]$. It is a standard fact that convexity follows if $f$ is twice continuously differentiable with non-negative second derivative. Using this fact, we show that the logarithm of the restriction $\Gamma : \mathbb{R}_{> 0} \to \mathbb{R}$ is convex.

Theorem: $\log \, \Gamma(s)$ is convex on $\mathbb{R}_{> 0}$.

Proof. For real $s > 0$ the integrand in the definition of $\Gamma(s)$ is strictly positive, so $\Gamma(s) > 0$ and $\log{\Gamma} : \mathbb{R}_{> 0} \to \mathbb{R}$ is well-defined. Since $\Gamma$ is holomorphic, $\log{\Gamma}$ is twice differentiable, and \begin{align*} \frac{d^2}{ds^2} \log{\Gamma(s)} &= \frac{d}{ds} \frac{\Gamma^\prime(s)}{\Gamma(s)} = \frac{\Gamma^{\prime\prime}(s) \Gamma(s) - \left( \Gamma^\prime(s) \right)^2}{\Gamma^2(s)} \end{align*} The denominator is clearly positive, so we only need to prove that $\Gamma^{\prime\prime}(s) \Gamma(s) > \left( \Gamma^\prime(s) \right)^2$, which is \begin{align*} \int_0^\infty \left( \log^2{t} \right) t^{s - 1} e^{-t} \, dt \, \int_0^\infty t^{s - 1} e^{-t} \, dt > \left( \int_0^\infty \left( \log{t} \right) t^{s - 1} e^{-t} \, dt \right)^2 \end{align*} This will follow from the Cauchy-Schwarz inequality applied to the inner product \begin{align*} \langle f, g \rangle &= \int_0^\infty f (t) g (t) t^{s - 1} e^{-t} \, dt \end{align*} defined on a suitable class of functions. We prove the result here for completeness by simply adapting a proof of the general Cauchy-Schwarz inequality. Let $\lambda \in \mathbb{R}$. Then, \begin{align*} 0 < \int_0^\infty \left( 1 - \lambda \log{t} \right)^2 t^{s - 1} e^{-t} \, dt = \int_0^\infty t^{s - 1} e^{-t} \, dt - 2 \lambda \int_0^\infty \left( \log{t} \right) t^{s - 1} e^{-t} \, dt + \lambda^2 \int_0^\infty \left( \log^2{t} \right) t^{s - 1} e^{-t} \, dt \end{align*} The right hand side is a quadratic in $\lambda$. Since its value is strictly positive for every real $\lambda$, the quadratic has no real roots and hence has strictly negative discriminant.
Thus, \begin{align*} \left( - 2 \int_0^\infty \left( \log{t} \right) t^{s - 1} e^{-t} \, dt \right)^2 - 4 \left( \int_0^\infty t^{s - 1} e^{-t} \, dt \right) \left( \int_0^\infty \left( \log^2{t} \right) t^{s - 1} e^{-t} \, dt \right) < 0 \end{align*} Rearranging this gives the desired inequality \begin{align*} \left( \int_0^\infty \left( \log{t} \right) t^{s - 1} e^{-t} \, dt \right)^2 < \left( \int_0^\infty t^{s - 1} e^{-t} \, dt \right) \left( \int_0^\infty \left( \log^2{t} \right) t^{s - 1} e^{-t} \, dt \right) \end{align*} $\blacksquare$

We will use the following two facts about convex functions:

Lemma: Let $f : I \to \mathbb{R}$ be convex for some open interval $I \subseteq \mathbb{R}$. Set \begin{align*} g : \{ (x, y) \in I \times I \mid x \ne y \} &\longrightarrow \mathbb{R} \\ (x, y) &\longmapsto \frac{f(y) - f(x)}{y - x} \end{align*} Then $g$ is increasing in both arguments.

Lemma: Let $f : I \to \mathbb{R}$ be a function on some open interval $I \subseteq \mathbb{R}$. If $f$ is twice continuously differentiable with $f^{\prime\prime}(x) \ge 0$ for all $x \in I$, then $f$ is convex.

So far, we have the following three facts about $\Gamma : \mathbb{R}_{> 0 } \to \mathbb{R}$: (1) $\Gamma(1) = 1$; (2) $\Gamma(s + 1) = s \Gamma(s)$ for all $s > 0$; (3) $\log{\Gamma}$ is convex.

It turns out that these three properties completely characterise $\Gamma$. That is,

Theorem: There is a unique function $f : \mathbb{R}_{> 0} \to \mathbb{R}$ which satisfies (1) $f(1) = 1$; (2) $f(s + 1) = s f(s)$ for all $s > 0$; (3) $\log{f}$ is convex.

Proof. Suppose that $f : \mathbb{R}_{> 0} \to \mathbb{R}$ satisfies the above three conditions, and fix $x \in (0, 1]$ and $n \in \mathbb{Z}_{> 0}$. By logarithmic convexity and the fact that the difference quotient of a convex function is increasing in both arguments, we have \begin{align*} \log{f(n + 1)} - \log{f(n)} \le \frac{\log{f(n + 1 + x)} - \log{f(n + 1)}}{x} \le \log{f(n + 2)} - \log{f(n + 1)} \end{align*} But see that \begin{align*} \log{f(n + 1)} - \log{f(n)} = \log{ (n f(n))} - \log{f(n)} = \log{n} \end{align*} and likewise $\log{f(n + 2)} - \log{f(n + 1)} = \log{(n + 1)}$. So really \begin{align*} \log{n} \le \frac{\log{f(n + 1 + x)} - \log{f(n + 1)}}{x} \le \log{(n + 1)} \\ \end{align*} Subtracting $\log{n}$ from each expression and multiplying through by $x$ yields \begin{align*} 0 \le \log{f(n + 1 + x)} - \log{f(n + 1)} - \log{n^x} \le x\log{\left(1 + \frac{1}{n} \right)} \end{align*} Repeated applications of the relation $f(s + 1) = s f(s)$ tells us that \begin{align*} \log{f(n + 1 + x)} = \log{ \left( x(x + 1)(x + 2) \cdots (x + n) f(x) \right) } = \log{f(x)} + \log{ \left( x(x + 1)(x + 2) \cdots (x + n) \right)} \end{align*} Also, $f(n + 1) = n! \, f(1) = n!$, so $\log{f(n + 1)} = \log{n!}$. Hence, \begin{align*} \log{f(n + 1 + x)} - \log{f(n + 1)}- \log{n^x} &= \log{f(x)} + \log{ \left( x(x + 1)(x + 2) \cdots (x + n) \right)} - \log{n!} - \log{n^x} \\ &= \log{f(x)} - \log{ \left( \frac{n^x n!}{x (x + 1) (x + 2) \cdots (x + n)} \right) } \end{align*} Hence, \begin{align*} 0 \le \log{f(x)} - \log{ \left( \frac{n^x n!}{x (x + 1) (x + 2) \cdots (x + n)} \right) } \le x \log{ \left( 1 + \frac{1}{n} \right) } \end{align*} Taking $n \to \infty$ and applying the sandwich theorem proves that \begin{align*} f(x) &= \lim_{n \to \infty} \frac{n^x n!}{x (x + 1) (x + 2) \cdots (x + n)} \end{align*} Thus, if such an $f$ exists, it is completely determined on $(0, 1]$ by the above equation, and then on all of $\mathbb{R}_{> 0}$ by the functional equation $f(x + 1) = x f(x)$. Since the $\Gamma$ function satisfies the desired properties, this completes the proof. $\blacksquare$

As a corollary we also have the alternate expression: \begin{align*} \Gamma(s) &= \lim_{n \to \infty} \frac{n^s n!}{s (s + 1) (s + 2) \cdots (s + n)} \end{align*} valid for at least $s > 0$.
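This limit converges rather slowly (the error decays roughly like $1/n$), which the following sketch makes visible; the log-space accumulation and the choices $x = 1.5$, $n = 200000$ are illustrative assumptions.

```python
import math

def gauss_limit(x, n=200000):
    # n^x * n! / (x (x+1) ... (x+n)), computed in log space to avoid
    # overflow; math.lgamma(n + 1) = log(n!).
    log_val = x * math.log(n) + math.lgamma(n + 1)
    for k in range(n + 1):
        log_val -= math.log(x + k)
    return math.exp(log_val)

x = 1.5
print(gauss_limit(x), math.gamma(x))
```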

Weierstrass Product for Gamma


Theorem: If $s \in \mathbb{C} - \mathbb{Z}_{\le 0}$ and $\Gamma(s) \ne 0$, then \begin{align*} \frac{1}{\Gamma(s)} &= s e^{\gamma s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right) e^{-s/n} \end{align*} and the expression on the right defines an entire function. Consequently, $1/\Gamma$ extends to an entire function, and in fact $\Gamma(s) \ne 0$ for all $s \in \mathbb{C} - \mathbb{Z}_{\le 0}$.

Proof. First we prove the equality for $s > 0$. See that \begin{align*} \frac{n!}{s (s + 1) (s + 2) \cdots (s + n)} = \frac{1}{s} \prod_{k = 1}^n \frac{k}{s + k} = \frac{1}{s} \prod_{k = 1}^n \left( 1 + \frac{s}{k} \right)^{-1} \end{align*} Moreover, \begin{align*} n^s = \exp{ \left( s \log{n} \right)} = \exp{ \left( s \left( \log{n} - \sum_{k = 1}^n \frac{1}{k} \right) \right)} \prod_{k = 1}^n e^{s/k} \end{align*} Hence, \begin{align*} \frac{n^s n!}{s (s + 1) (s + 2) \cdots (s + n)} &= \frac{\exp{ \left( s \left( \log{n} - \sum_{k = 1}^n \frac{1}{k} \right) \right)}}{s} \left( \prod_{k = 1}^n \left( 1 + \frac{s}{k} \right)^{-1} \right) \left( \prod_{k = 1}^n e^{s/k} \right) \\ &= \frac{\exp{ \left( s \left( \log{n} - \sum_{k = 1}^n \frac{1}{k} \right) \right)}}{s} \prod_{k = 1}^n \left( 1 + \frac{s}{k} \right)^{-1} e^{s/k} \end{align*} Taking $n \to \infty$, \begin{align*} \lim_{n \to \infty} \frac{n^s n!}{s (s + 1) (s + 2) \cdots (s + n)} &= \frac{\exp{ \left( s \lim_{n \to \infty} \left( \log{n} - \sum_{k = 1}^n \frac{1}{k} \right) \right)}}{s} \prod_{k = 1}^\infty \left( 1 + \frac{s}{k} \right)^{-1} e^{s/k} \\ &= \frac{e^{-\gamma s}}{s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} \end{align*} where $\gamma = \lim_{n \to \infty} \left( \sum_{k = 1}^n \frac{1}{k} - \log{n} \right)$ is the Euler-Mascheroni constant. Taking reciprocals, we have now \begin{align*} \frac{1}{\Gamma(s)} &= \left( \lim_{n \to \infty} \frac{n^s n!}{s (s + 1) (s + 2) \cdots (s + n)} \right)^{-1} = s e^{\gamma s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right) e^{-s/n} \end{align*} valid for $s > 0$. Now to extend this equality further. For this, we will prove that the infinite product on the right converges uniformly on all compact subsets of $\mathbb{C}$, and hence defines an entire function. Then, we will be able to finish by applying the principle of analytic continuation.

For $R \in \mathbb{Z}_{> 0}$, let $D_R \subseteq \mathbb{C}$ be the open disk of radius $R$ around 0. Recall that the complex logarithm can be defined as \begin{align*} \log{(1 + s)} &= \sum_{n = 1}^\infty \frac{(-1)^{n + 1} s^n}{n} \end{align*} valid for $|s| < 1$. If $s \in D_R$ and $n \in \mathbb{Z}_{> 2R}$ then $|s/n| < 1/2$ so we are able to take logarithms and obtain the bound \begin{align*} \left| \log{ \left( 1 + \frac{s}{n}\right) } - \frac{s}{n} \right| &= \left| \sum_{k = 1}^\infty \frac{(-1)^{k + 1} (s/n)^k}{k} - \frac{s}{n} \right| = \left| \sum_{k = 2}^\infty \frac{(-1)^{k + 1} (s/n)^k}{k} \right| \le \frac{1}{2} \sum_{k = 2}^\infty \left| \frac{s}{n} \right|^k = \frac{1}{2} \frac{|s/n|^2}{1 - |s/n|} \le \frac{R^2}{n^2} \end{align*} Hence, we have uniform convergence because \begin{align*} \lim_{N \to \infty} \sup_{s \in D_R} \left| \sum_{n = 2R + 1}^\infty \left( \log{ \left( 1 + \frac{s}{n}\right) } - \frac{s}{n} \right) - \sum_{n = 2R + 1}^N \left( \log{ \left( 1 + \frac{s}{n}\right) } - \frac{s}{n} \right)\right| &\le \lim_{N \to \infty} \sup_{s \in D_R} \sum_{n = N + 1}^\infty \left| \log{ \left( 1 + \frac{s}{n}\right) } - \frac{s}{n} \right| \\ &\le \lim_{N \to \infty} \sup_{s \in D_R}\sum_{n = N + 1}^\infty \frac{R^2}{n^2} \\ &\le \lim_{N \to \infty} \sum_{n = N + 1}^\infty \frac{R^2}{n^2} \\ &= 0 \end{align*} This proves that the infinite sum starting from $n = 2R + 1$ defines a holomorphic function on $D_R$. Since compositions and products of holomorphic functions are holomorphic, \begin{align*} s e^{\gamma s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right) e^{-s/n} &= \left( s e^{\gamma s} \prod_{n = 1}^{2R} \left( 1 + \frac{s}{n} \right) e^{-s/n} \right) \exp{ \left( \sum_{n = 2R + 1}^\infty \left( \log{ \left( 1 + \frac{s}{n}\right) } - \frac{s}{n} \right) \right) } \end{align*} is holomorphic on $D_R$. Since $R \in \mathbb{Z}_{> 0}$ was arbitrary, this is an entire function.
The equality with $1/\Gamma(s)$ has already been shown for $s > 0$, and now this equality extends by analytic continuation to all $s \in \mathbb{C} - \mathbb{Z}_{\le 0}$ such that $\Gamma(s) \ne 0$, that is, the points where $1/\Gamma$ is well-defined and holomorphic. But since we have identified $1/\Gamma$ with a function which we just showed is entire, it follows that $1/\Gamma$ extends to an entire function. This can only mean that $\Gamma$ vanishes nowhere in its domain. $\blacksquare$
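A truncation of the Weierstrass product can be compared against `1 / math.gamma` directly. The sketch below accumulates the product in log space for a real $s > 0$ (matching the first step of the proof); the hard-coded value of $\gamma$, the point $s = 0.7$, and the truncation length are illustrative assumptions, and the dropped tail is of size roughly $s^2 / (2 \cdot \texttt{terms})$.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def recip_gamma(s, terms=200000):
    # Truncated Weierstrass product  s e^{gamma s} prod (1 + s/n) e^{-s/n},
    # accumulated in log space (valid here for s > 0).
    log_val = math.log(s) + EULER_GAMMA * s
    for n in range(1, terms + 1):
        log_val += math.log1p(s / n) - s / n
    return math.exp(log_val)

s = 0.7
print(recip_gamma(s), 1 / math.gamma(s))
```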

The Reflection Formula


Here, we will need the identity \begin{align*} \sin{z} &= z \prod_{n = 1}^\infty \left(1 - \frac{z^2}{n^2 \pi^2} \right) \end{align*} valid for all $z \in \mathbb{C}$. Using this and the Weierstrass product formula for $\Gamma$ we have the reflection formula:

Theorem: For $s \notin \mathbb{Z}$, \begin{align*} \Gamma(s) \Gamma(1 - s) = \frac{\pi}{\sin{\pi s}} \end{align*}

Proof. If $s \notin \mathbb{Z}$, then \begin{align*} \frac{1}{\Gamma(s) \Gamma(-s)} &= \left( s e^{\gamma s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right) e^{-s/n} \right) \left( -s e^{- \gamma s} \prod_{n = 1}^\infty \left( 1 - \frac{s}{n} \right) e^{s/n} \right) = -s^2 \prod_{n = 1}^\infty \left( 1 - \frac{s^2}{n^2} \right) \end{align*} Since $\Gamma(1 - s) = -s \Gamma(-s)$, it follows that \begin{align*} \frac{1}{\Gamma(s) \Gamma(1 - s)} &= s \prod_{n = 1}^\infty \left( 1 - \frac{s^2}{n^2} \right) = \frac{\sin{\pi s}}{\pi} \end{align*} $\blacksquare$
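The reflection formula is easy to spot-check with `math.gamma`; the sample points below are arbitrary non-integer choices, including a negative one.

```python
import math

# Gamma(s) Gamma(1 - s) = pi / sin(pi s) at a few non-integer points
# (math.gamma accepts negative non-integer arguments).
pairs = {s: (math.gamma(s) * math.gamma(1 - s), math.pi / math.sin(math.pi * s))
         for s in [0.25, 1.5, -0.3, 3.7]}
for s, (lhs, rhs) in pairs.items():
    print(s, lhs, rhs)
```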

From the reflection formula, one obtains by taking $s = 1/2$ \begin{align*} \Gamma(1/2)^2 &= \frac{\pi}{\sin{\pi/2}} = \pi \end{align*} which means $\Gamma(1/2) = \sqrt{\pi}$. Recalling the original integral definition of $\Gamma$, this means \begin{align*} \Gamma(1/2) = \int_0^\infty \frac{e^{-t}}{\sqrt{t}} \, dt = \sqrt{\pi} \end{align*} Using this, we can compute the Gaussian integral. By using a change of variables from $t$ to $t^2$, \begin{align*} \int_0^\infty \frac{e^{-t}}{\sqrt{t}} \, dt &= \int_0^\infty \frac{e^{-t^2}}{\sqrt{t^2}} (2t) \, dt = 2 \int_0^\infty e^{-t^2} \, dt = \int_{-\infty}^\infty e^{-t^2} \, dt \end{align*} Hence, the Gaussian integral is \begin{align*} \int_{-\infty}^\infty e^{-t^2} \, dt = \sqrt{\pi} \end{align*}

Logarithms and Polygamma Functions


Proposition: For $|s| < 1$, \begin{align*} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} &= \exp{ \left( \sum_{n = 1}^\infty \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right) } \end{align*} where the logarithm is defined by the usual Taylor series around 1. The sum defines a holomorphic function on the open unit disk.

Proof. We will prove that the sum converges uniformly on the open unit disk, denoted $D$. Using the bound, valid for all $s \in D$ and $n \in \mathbb{Z}_{\ge 2}$, \begin{align*} \left| -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right| &= \left| \sum_{k = 2}^\infty \frac{(-1)^{k + 1} (s/n)^k}{k} \right| \le \sum_{k = 2}^\infty \frac{|s/n|^k}{k} \le \sum_{k = 2}^\infty \frac{1}{n^k k} \le \frac{1}{2} \sum_{k = 2}^\infty \frac{1}{n^k} = \frac{1}{2} \frac{1/n^2}{1 - 1/n} \le \frac{1}{n^2} \end{align*} it follows that \begin{align*} \lim_{N \to \infty} \sup_{s \in D} \left| \sum_{n = 1}^\infty \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) - \sum_{n = 1}^N \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right| &\le \lim_{N \to \infty} \sup_{s \in D} \sum_{n = N + 1}^\infty \left| -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right| \\ &\le \lim_{N \to \infty} \sup_{s \in D} \sum_{n = N + 1}^\infty \frac{1}{n^2} \\ &= \lim_{N \to \infty} \sum_{n = N + 1}^\infty \frac{1}{n^2} \\ &= 0 \end{align*} This proves that the infinite sum converges uniformly on the unit disk $D$. Since the logarithms in the sum define holomorphic functions on the open unit disk, this uniform convergence implies that the infinite sum itself is also holomorphic on $D$. Then, we have \begin{align*} \exp{ \left( \lim_{N \to \infty} \sum_{n = 1}^N \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right) } &= \lim_{N \to \infty} \exp{ \left( \sum_{n = 1}^N \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right) } \\ &= \lim_{N \to \infty} \prod_{n = 1}^N \exp{ \left( - \log{ \left( 1 + \frac{s}{n} \right)} \right) } e^{s/n} \\ &= \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} \end{align*} which is the desired equality. $\blacksquare$

The expression of the infinite product as an exponential of some other function also proves that the product is non-zero. We can now compute the logarithmic derivative of the Gamma function.

Theorem: If $s \in \mathbb{C} - \mathbb{Z}_{\le 0}$, \begin{align*} \frac{\Gamma^\prime (s)}{\Gamma(s)} &= - \frac{1}{s} - \gamma + \sum_{n = 1}^\infty \frac{s}{n(s + n)} \end{align*} with the sum converging uniformly on every compact subset of $\mathbb{C} - \mathbb{Z}_{\le 0}$.

Proof. On the open unit disk $D$, we have \begin{align*} \frac{d}{ds} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} &= \frac{d}{ds} \left( \exp{ \left( \sum_{n = 1}^\infty \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right) } \right) \\ &= \exp{ \left( \sum_{n = 1}^\infty \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \right) } \frac{d}{ds} \sum_{n = 1}^\infty \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \\ &= \left( \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} \right) \sum_{n = 1}^\infty \frac{d}{ds} \left( -\log{\left( 1 + \frac{s}{n} \right)} + \frac{s}{n} \right) \\ \frac{\frac{d}{ds} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n}}{\prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n}} &= \sum_{n = 1}^\infty \left( - \frac{1/n}{1 + s/n} + \frac{1}{n} \right) \\ &= \sum_{n = 1}^\infty \frac{s}{n(s + n)} \end{align*} where the interchanging of summation and derivative operator is justified by the uniform convergence on $D$. The desired logarithmic derivative is then \begin{align*} \frac{\Gamma^\prime(s)}{\Gamma(s)} &= \frac{\frac{d}{ds} \left( \frac{e^{-\gamma s}}{s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n} \right)}{\frac{e^{-\gamma s}}{s} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n}} \\ &= \frac{\frac{d}{ds} \frac{1}{s}}{1 / s} + \frac{\frac{d}{ds} e^{-\gamma s}}{e^{-\gamma s}} + \frac{\frac{d}{ds} \prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n}}{\prod_{n = 1}^\infty \left( 1 + \frac{s}{n} \right)^{-1} e^{s/n}} \\ &= - \frac{1}{s} - \gamma + \sum_{n = 1}^\infty \frac{s}{n(s + n)} \end{align*} valid for $0 < |s| < 1$. To extend to the domain $\mathbb{C} - \mathbb{Z}_{\le 0}$, we will have to prove that the sum converges uniformly on every compact subset. For this, let $K \subseteq \mathbb{C} - \mathbb{Z}_{\le 0}$ be compact.
Let $R \in \mathbb{Z}_{> 0}$ be such that $K \subseteq \overline{D_R}$ where $D_R$ is the open disk of radius $R$ around 0. Then when $n \in \mathbb{Z}_{> R}$ and $s \in \overline{D_R}$, we have $$ \left| \frac{s}{n(s + n)} \right| \le \frac{R}{n|s - (-n)|} \le \frac{R}{n \left| |s| - |-n| \right|} = \frac{R}{n(n - |s|)} \le \frac{R}{n(n - R)} \le \frac{R}{(n - R)^2} $$ So then, \begin{align*} \lim_{N \to \infty} \sup_{s \in K} \left| \sum_{n = 1}^\infty \frac{s}{n(s + n)} - \sum_{n = 1}^N \frac{s}{n(s + n)} \right| &\le \lim_{N \to \infty} \sup_{s \in \overline{D_R}} \sum_{n = N + 1}^\infty \left| \frac{s}{n(s + n)} \right| \\ &= \lim_{N \to \infty} \sum_{n = N + 1}^\infty \frac{R}{(n - R)^2} \\ &= \lim_{N \to \infty} \sum_{n = N - R + 1}^\infty \frac{R}{n^2} \\ &= 0 \end{align*} This proves that the infinite sum does converge uniformly on $K$ as was to be shown. Consequently, we have uniform convergence on every compact $K \subseteq \mathbb{C} - \mathbb{Z}_{\le 0}$ so the sum defines a holomorphic function on this larger domain. Hence, the equality $$ \frac{\Gamma^\prime(s)}{\Gamma(s)} = - \frac{1}{s} - \gamma + \sum_{n = 1}^\infty \frac{s}{n(s + n)} $$ is now valid for all $s \in \mathbb{C} - \mathbb{Z}_{\le 0}$ by analytic continuation.$\blacksquare$


Define the digamma function $\psi : \mathbb{C} - \mathbb{Z}_{\le 0} \longrightarrow \mathbb{C}$ as the logarithmic derivative of the Gamma function \begin{align*} \psi(s) &= \frac{\Gamma^\prime(s)}{\Gamma(s)} = - \frac{1}{s} - \gamma + \sum_{n = 1}^\infty \frac{s}{n(s + n)} \end{align*}

As a simple corollary, we can compute the derivative of the Gamma function at 1. We have \begin{align*} \Gamma^\prime(1) &= \psi(1) = - 1 - \gamma + \sum_{n = 1}^\infty \frac{1}{n(n + 1)} = - 1 - \gamma + \sum_{n = 1}^\infty\left( \frac{1}{n} - \frac{1}{n + 1} \right) = -1 - \gamma + 1 = -\gamma \end{align*} Recalling the integral expressions for the derivatives of the Gamma function, the following identity follows as well: \begin{align*} \int_0^\infty e^{-t} \log{t} \, dt &= - \gamma \end{align*} For $m \in \mathbb{Z}_{\ge 0}$, the polygamma function $\psi_m : \mathbb{C} - \mathbb{Z}_{\le 0} \longrightarrow \mathbb{C}$ is defined as the $m$th derivative of the digamma function. That is, \begin{align*} \psi_m (s) = \frac{d^m}{ds^m} \psi(s) \end{align*} Recall the Hurwitz Zeta function defined for $s \in \mathbb{C}$ with $\mathfrak{Re}(s) > 1$ and $a \in \mathbb{C} - \mathbb{Z}_{\le 0}$ by $$ \zeta(s, a) = \sum_{n = 0}^\infty \frac{1}{(n + a)^s} $$ In particular, the usual Riemann Zeta function is $\zeta(s) = \zeta(s, 1)$.
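The series for $\psi$ and the value $\Gamma'(1) = -\gamma$ can be checked by direct summation. In the sketch below, the hard-coded $\gamma$ and the truncation length are illustrative assumptions; the dropped tail of the series is of size roughly $s / \texttt{terms}$.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def digamma(s, terms=10**6):
    # Truncation of psi(s) = -1/s - gamma + sum_{n>=1} s / (n (s + n)).
    total = -1.0 / s - EULER_GAMMA
    for n in range(1, terms + 1):
        total += s / (n * (s + n))
    return total

psi1 = digamma(1.0)
print(psi1, -EULER_GAMMA)  # Gamma'(1) = psi(1) = -gamma
```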

Theorem: If $s \in \mathbb{C}$ with $\mathfrak{Re}(s) > 0$ and $m \in \mathbb{Z}_{> 0}$, then $$ \psi_m(s) = (-1)^{m + 1} m! \zeta(m + 1, s) $$ Consequently, the digamma function admits the following Taylor expansion in an open disk of radius 1 around 1 $$ \psi(s) = - \gamma + \sum_{n = 1}^\infty (-1)^{n + 1} \zeta(n + 1) (s - 1)^n $$

Proof. Let $m \in \mathbb{Z}_{> 0}$. Since the sum in the series representation for the digamma function converges uniformly on every compact subset of the domain, we are justified in interchanging it with the differential operator to obtain \begin{align*} \psi_m(s) &= - \frac{d^m}{ds^m} \frac{1}{s} + \sum_{n = 1}^\infty \frac{d^m}{ds^m} \frac{s}{n(s + n)} \\ &= \frac{(-1)^{m + 1} m!}{s^{m + 1}} - \sum_{n = 1}^\infty \frac{d^m}{ds^m} \left( \frac{1}{s + n} - \frac{1}{n} \right) \\ &= \frac{(-1)^{m + 1} m!}{s^{m + 1}} - (-1)^m \sum_{n = 1}^\infty \frac{m!}{(s + n)^{m + 1}} \\ &= (-1)^{m + 1} m! \sum_{n = 0}^\infty \frac{1}{(s + n)^{m + 1}} \\ &= (-1)^{m + 1} m! \zeta(m + 1, s) \end{align*} In particular, $\psi_m(1) = (-1)^{m + 1} m! \zeta(m + 1)$. This implies the following remarkable Taylor series expansion for the digamma function $$ \frac{\Gamma^\prime(s)}{\Gamma(s)} = -\gamma + \sum_{n = 1}^\infty (-1)^{n + 1} \zeta(n + 1) (s - 1)^n $$ valid when $|s - 1| < 1$ by the usual results about Taylor series of holomorphic functions. $\blacksquare$

Taking $s = 1/2$ gives $$ \frac{\Gamma^\prime(1/2)}{\Gamma(1/2)} = - \gamma - \sum_{n = 1}^\infty \frac{\zeta(n + 1)}{2^n} $$ The exact value of $\psi(1/2)$ can also be computed as follows: \begin{align*} \psi(1/2) &= - \frac{1}{1/2} - \gamma + \sum_{n = 1}^\infty \frac{1/2}{n(1/2 + n)} \\ &= -2 - \gamma + 2 \sum_{n = 1}^\infty \frac{1}{2n(2n + 1)} \\ &= -2 - \gamma + 2 \lim_{N \to \infty} \sum_{n = 1}^N \left( \frac{1}{2n} - \frac{1}{2n + 1} \right) \\ &= -2 - \gamma + 2 \lim_{N \to \infty} \left( \frac{1}{2} \sum_{n = 1}^N \frac{1}{n} - \sum_{n = 1}^N \frac{1}{2n + 1} \right) \\ &= -2 - \gamma + 2 \lim_{N \to \infty} \left( \frac{1}{2} \sum_{n = 1}^N \frac{1}{n} - \sum_{n = 1}^{2N + 1} \frac{1}{n} + 1 + \frac{1}{2} \sum_{n = 1}^N \frac{1}{n} \right) \\ &= -2 - \gamma + 2 \lim_{N \to \infty} \left( \left( \sum_{n = 1}^N \frac{1}{n} - \log{N} \right) - \left( \sum_{n = 1}^{2N} \frac{1}{n} - \log{2N} \right) - \frac{1}{2N + 1} + 1 - \log{2} \right) \\ &= -2 - \gamma + 2 \left( \gamma - \gamma - 0 + 1 - \log{2} \right) \\ &= - \gamma - \log{4} \end{align*} So now equating with the Taylor expansion yields $$ - \gamma - \log{4} = - \gamma - \sum_{n = 1}^\infty \frac{\zeta(n + 1)}{2^n} $$ which implies the identity $$ \log{2} = \sum_{n = 2}^\infty \frac{\zeta(n)}{2^n} = \frac{\zeta(2)}{4} + \frac{\zeta(3)}{8} + \frac{\zeta(4)}{16} + \frac{\zeta(5)}{32} + \cdots $$
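The closing identity can be verified by brute force; the helper `zeta` below, its truncation length, and the cutoff of the outer series at $n = 49$ are illustrative assumptions (the neglected outer tail is of size about $2^{-49}$).

```python
import math

def zeta(n, terms=20000):
    # Partial sum of the zeta series for integer n >= 2, plus an
    # integral-test estimate 1/((n-1) * terms^(n-1)) for the dropped tail.
    partial = sum(1.0 / float(k) ** n for k in range(1, terms + 1))
    return partial + 1.0 / ((n - 1) * float(terms) ** (n - 1))

# log 2 = zeta(2)/4 + zeta(3)/8 + zeta(4)/16 + ...
total = sum(zeta(n) / 2 ** n for n in range(2, 50))
print(total, math.log(2))
```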

Gauss Multiplication Formula


Theorem (Gauss Multiplication Formula): If $n \in \mathbb{Z}_{> 0}$ then \begin{align*} \prod_{k = 0}^{n - 1} \Gamma \left( s + \frac{k}{n} \right) &= (2 \pi)^{(n - 1)/2} n^{1/2 - ns} \Gamma(ns) \end{align*}

Proof. We assume that $s > 0$ for now. Then if $k \in \{ 0, 1, 2, \cdots, n - 1 \}$, \begin{align*} \Gamma \left( s + \frac{k}{n} \right) &= \left( s + \frac{k}{n} - 1\right) \Gamma \left( s + \frac{k}{n} - 1\right) \\ &= \left( s + \frac{k}{n} - 1\right) \lim_{N \to \infty} \frac{N! N^{s + \frac{k}{n} - 1}}{\left( s + \frac{k}{n} - 1\right) \left( \left( s + \frac{k}{n} - 1 \right) + 1 \right) \left( \left( s + \frac{k}{n} - 1 \right) + 2 \right) \cdots \left( \left( s + \frac{k}{n} - 1 \right) + N \right)} \\ &= \lim_{N \to \infty} \frac{N! N^{s + \frac{k}{n} - 1}}{\left( s + \frac{k}{n} \right) \left( s + \frac{k}{n} + 1 \right) \left( s + \frac{k}{n} + 2 \right) \cdots \left( s + \frac{k}{n} + N - 1 \right)} \\ &= \lim_{N \to \infty} \frac{n^N N! N^{s + \frac{k}{n} - 1}}{\left( ns + k \right) \left( ns + k + n \right) \left( ns + k + 2n \right) \cdots \left( ns + k + (N - 1)n \right)} \end{align*} By Stirling's approximation for $N!$, we have \begin{align*} n^N N! N^{s + \frac{k}{n} - 1} \sim N^{s + \frac{k}{n} - 1} \sqrt{2 \pi N} \left( \frac{N n}{e} \right)^N \end{align*} So now, \begin{align*} \Gamma \left( s + \frac{k}{n} \right) &= \lim_{N \to \infty} \frac{N^{s + \frac{k}{n} - 1} \sqrt{2 \pi N} \left( \frac{N n}{e} \right)^N}{\left( ns + k \right) \left( ns + k + n \right) \left( ns + k + 2n \right) \cdots \left( ns + k + (N - 1)n \right)} \end{align*} Taking the product from $k = 0$ to $k = n - 1$ gives \begin{align*} \prod_{k = 0}^{n - 1} \Gamma \left( s + \frac{k}{n} \right) &= \lim_{N \to \infty} \frac{N^{ns} N^{(n - 1)/2} N^{-n} (2 \pi N)^{n/2} \left( \frac{Nn}{e} \right)^{nN}}{ns(ns + 1)(ns + 2)(ns + 3) \cdots (ns + nN - 1)} \\ &= \lim_{N \to \infty} \frac{ N^{ns} N^{-1/2} (2 \pi)^{n/2} \left( \frac{Nn}{e} \right)^{nN} }{ns(ns + 1)(ns + 2)(ns + 3) \cdots (ns + nN - 1)} \\ &= \lim_{M \to \infty} \frac{ \left( \frac{M}{n} \right)^{ns} \left( \frac{M}{n} \right)^{-1/2} (2 \pi)^{n/2} \left( \frac{M}{e} \right)^{M} }{ns(ns + 1)(ns + 2)(ns + 3) \cdots (ns + M - 1)} \\ &= (2 \pi)^{(n - 1)/2} n^{1/2 - ns} \lim_{M \to \infty} \frac{M^{ns - 1} \sqrt{2 \pi M} \left( \frac{M}{e} \right)^M}{ns(ns + 1)(ns + 2)(ns + 3) \cdots (ns + M - 1)} \\ &= (2 \pi)^{(n - 1)/2} n^{1/2 - ns} (ns - 1) \lim_{M \to \infty} \frac{M^{ns - 1} M! }{(ns - 1)((ns - 1) + 1)((ns - 1) + 2)((ns - 1) + 3) \cdots ((ns - 1) + M )} \\ &= (2 \pi)^{(n - 1)/2} n^{1/2 - ns} (ns - 1) \Gamma(ns - 1) \\ &= (2 \pi)^{(n - 1)/2} n^{1/2 - ns} \Gamma ( ns ) \end{align*} where $M = nN \to \infty$ through multiples of $n$, and Stirling's approximation was used again to replace $\sqrt{2 \pi M} \left( M/e \right)^M$ by $M!$. This proves the identity for real $s$ with $ns > 1$ (so that the limit formula applies to $ns - 1$); since both sides are holomorphic in $s$ wherever the Gamma factors are defined, the identity extends to the rest of the domain by analytic continuation. $\blacksquare$
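The multiplication formula can be spot-checked with `math.gamma`; the values of $n$ and the point $s = 1.7$ below are arbitrary illustrative choices.

```python
import math

# Check the multiplication formula numerically:
# prod_{k=0}^{n-1} Gamma(s + k/n)  vs  (2 pi)^((n-1)/2) n^(1/2 - ns) Gamma(ns)
s = 1.7
results = {}
for n in (2, 3, 4):
    lhs = 1.0
    for k in range(n):
        lhs *= math.gamma(s + k / n)
    rhs = (2 * math.pi) ** ((n - 1) / 2) * n ** (0.5 - n * s) * math.gamma(n * s)
    results[n] = (lhs, rhs)
    print(n, lhs, rhs)
```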

As a simple corollary we have Legendre's duplication formula:

Corollary (Legendre Duplication Formula): Taking $n = 2$ in the multiplication formula gives \begin{align*} \Gamma\left( s \right) \Gamma \left( s + 1/2 \right) &= \sqrt{2 \pi} \, 2^{1/2 - 2s} \Gamma \left( 2s \right) \end{align*}