Note for Gamma Distribution
Motivation for Gamma Function
We all know how to compute the factorial of an integer. But what is the factorial of 1/2?
In other words, how do we interpolate the factorial function?
The gamma function can be seen as a solution to the following interpolation problem:
“Find a smooth curve that connects the points (x, y) given by y = (x − 1)! at the positive integer values for x.”
For more details, see the Wikipedia page.
Definition of Gamma Function
Definition 1 (Gamma Function) \[\begin{align} \Gamma(z) = \int_{0}^{\infty}x^{z-1}e^{-x}dx, \ \ z \in \mathbb{R}^+ \end{align}\]
For the Gamma function, it is enough to know the following properties for now.
Lemma 1 \[\begin{align} & \Gamma(z+1) = z\Gamma(z), \ \ z \in \mathbb{R}^+ \\ & \Gamma(n) = (n-1)!, \ \ n = 1,2,3,... \end{align}\]
The first identity follows from integration by parts; the second then follows by induction, since \(\Gamma(1) = 1\).
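As a quick sanity check (a sketch using Python's standard-library `math.gamma`, with arbitrarily chosen test points), we can verify both identities in Lemma 1 numerically:

```python
import math

# Check Gamma(z+1) = z * Gamma(z) at a few arbitrary non-integer points.
recurrence_ok = all(
    math.isclose(math.gamma(z + 1), z * math.gamma(z))
    for z in (0.5, 1.7, 3.2)
)

# Check Gamma(n) = (n-1)! for the first few positive integers.
factorial_ok = all(
    math.gamma(n) == math.factorial(n - 1)
    for n in range(1, 8)
)

print(recurrence_ok, factorial_ok)
```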
Now, what is \(\Gamma(\frac{1}{2})\)?
\[ \Gamma(\frac{1}{2}) = \int_{0}^{\infty}x^{-1/2}e^{-x}dx = ? \]
Recall that \(\int_{0}^{\infty} e^{-u^2}du = \frac{1}{2}\sqrt{\pi}\). Substituting \(x = u^2\) (so \(dx = 2u\,du\)) gives \(\Gamma(\frac{1}{2}) = 2\int_{0}^{\infty} e^{-u^2}du = \sqrt{\pi}\).
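We can confirm this numerically (a sketch with the standard library only; the truncation point 40 and the midpoint rule are arbitrary choices):

```python
import math

# Gamma(1/2) from the library vs. the closed form sqrt(pi).
half_gamma = math.gamma(0.5)
sqrt_pi = math.sqrt(math.pi)

# Crude midpoint-rule approximation of the defining integral
# int_0^inf x^{-1/2} e^{-x} dx, truncated at x = 40.
n, upper = 200_000, 40.0
h = upper / n
integral = sum(
    ((i + 0.5) * h) ** -0.5 * math.exp(-(i + 0.5) * h)
    for i in range(n)
) * h

print(half_gamma, sqrt_pi, integral)  # all close to 1.7724...
```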
Gamma Distribution
From the Gamma function, it is natural to obtain the Gamma pdf: just normalize!
Clearly, \[ 1 = \int_0^\infty \frac{x^{r-1}e^{-x}}{\Gamma(r)}dx = \int_0^\infty f_X(x)dx, \ \ \ X \sim Gamma(r, 1) \]
What is the pdf of the general \(Gamma(r, \lambda)\)? Let
\[ Y = \frac{X}{\lambda}, \ Y \sim Gamma(r, \lambda) \] Then, by the change-of-variables formula with \(x = \lambda y\), \[ f_Y(y) = f_X(x)\frac{dx}{dy} = \lambda f_X(\lambda y) \]
We get
\[ f(y; r, \lambda ) = \frac{\lambda ^{r}y^{r-1}e^{-\lambda y}}{\Gamma(r)} \]
Here, \(r\) is called the shape parameter and \(\lambda\) is called the rate parameter.
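To check that this is really a density, here is a minimal sketch (the function name `gamma_pdf` and the shape/rate pair \(r = 3\), \(\lambda = 2\) are my own choices) that integrates the pdf numerically:

```python
import math

def gamma_pdf(x, r, lam):
    """Gamma density with shape r and rate lam."""
    return lam ** r * x ** (r - 1) * math.exp(-lam * x) / math.gamma(r)

# Midpoint-rule integral of the pdf over (0, 30]; should be close to 1.
r, lam = 3.0, 2.0
n, upper = 100_000, 30.0
h = upper / n
total = sum(gamma_pdf((i + 0.5) * h, r, lam) for i in range(n)) * h

print(total)
```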
How to remember the Gamma pdf?
Here is my trick: the exponential density, times the power raised to (shape − 1), divided by the normalizer.
Exponential density (very familiar): \(\lambda e^{-\lambda x}\)
Power raised to (shape − 1): \((\lambda x)^{r-1}\)
Normalizing constant (using the shape): \(\Gamma(r)\)
\[ \begin{align} f(x; r, \lambda ) &= \frac{\text{exp density} \cdot \text{power}^{\text{shape}-1} }{\text{normalizer}} \\ & = \frac{\lambda e^{-\lambda x}(\lambda x)^{r-1}}{\Gamma(r)} \\ & = \frac{\lambda ^{r}x^{r-1}e^{-\lambda x}}{\Gamma(r)} \end{align} \]
Another way to remember is this:
Exponential key part: \[ e^{-\lambda x} \]
Add Power part: \[ \lambda^{\square} x^{\square} e^{- \lambda x} \]
- multiply the exponential by a power part
- the rate is raised to the shape
- the variable is raised to shape − 1
\[ \lambda^{r} x^{r-1} e^{- \lambda x} \]
Add Normalizing part: \[ \frac{\lambda^{r} x^{r-1} e^{- \lambda x}}{\Gamma(r)} \]
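The mnemonic can be checked in code: the sketch below (function names are mine) builds the pdf from the three pieces and compares it with the closed form at a few arbitrary points.

```python
import math

def gamma_pdf_mnemonic(x, r, lam):
    """Gamma pdf assembled from the mnemonic pieces."""
    exp_density = lam * math.exp(-lam * x)  # lambda * e^{-lambda x}
    power = (lam * x) ** (r - 1)            # (lambda x)^{r-1}
    normalizer = math.gamma(r)              # Gamma(r)
    return exp_density * power / normalizer

def gamma_pdf_closed(x, r, lam):
    """Closed-form Gamma pdf."""
    return lam ** r * x ** (r - 1) * math.exp(-lam * x) / math.gamma(r)

same = all(
    math.isclose(gamma_pdf_mnemonic(x, 2.5, 1.3), gamma_pdf_closed(x, 2.5, 1.3))
    for x in (0.1, 1.0, 4.0)
)
print(same)
```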
Gamma & Exponential Connection
Let’s recall the Poisson process, \[ N_t = \text{number of arrivals up to time } t \sim Pois(\lambda t) \] The numbers of arrivals in disjoint intervals are independent.
Let \(T_1\) be the time of the 1st arrival, \[ P(T_1 > t) = P(N_t = 0) = e^{-\lambda t} \ \implies T_1 \sim Exp(\lambda) \] Now, let \(T_n\) be the time of the n-th arrival; that is, \(T_n = \sum_{i=1}^n X_i\), where the inter-arrival times \(X_i \overset{\text{iid}}{\sim}Exp(\lambda)\)
What is the pdf of \(T_n\)? The answer is Gamma!
Proposition 1 The sum of \(n\) iid \(Exp(\lambda)\) random variables is \(Gamma(n, \lambda)\).
Proof:
Since the MGF of \(X \sim Exp(\lambda)\) is \(M_X(t) = \frac{\lambda}{\lambda-t}\) for \(t < \lambda\),
\[ M_{\sum_{i=1}^n X_i}(t) = (M_X(t))^n = (\frac{\lambda}{\lambda-t})^n \]
It is enough to show the MGF of Gamma equals the above value.
Let \(Y \sim Gamma(n, \lambda)\),
\[ \begin{align} M_Y(t) = E(e^{tY}) &= \int_0^\infty e^{ty} \frac{1}{\Gamma(n)} \lambda^{n}e^{-\lambda y}y^{n-1} dy \\ &= \frac{\lambda^n}{\Gamma(n)} \int_0^\infty e^{-(\lambda - t)y}y^{n-1}dy, \text{ let } u = (\lambda - t)y \text{ with } t < \lambda \\ &= \frac{\lambda^n}{\Gamma(n)} \left(\frac{1}{\lambda-t}\right)^n \int_0^\infty e^{-u}u^{n-1}du \\ &= \frac{\lambda^n}{\Gamma(n)} \left(\frac{1}{\lambda-t}\right)^n\Gamma(n) \\ &= \left(\frac{\lambda}{\lambda-t}\right)^n \end{align} \]
Proved!
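The proposition can also be checked by simulation. This is a minimal Monte Carlo sketch (the choices \(n = 5\), \(\lambda = 2\), the seed, and the trial count are arbitrary): sum \(n\) iid exponentials many times and compare the sample moments with the Gamma mean \(n/\lambda\) and variance \(n/\lambda^2\).

```python
import random

# Sum of n iid Exp(lam) samples should behave like Gamma(n, lam).
random.seed(0)
n, lam, trials = 5, 2.0, 100_000
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

sample_mean = sum(samples) / trials
sample_var = sum((s - sample_mean) ** 2 for s in samples) / trials

# Gamma(n, lam) has mean n/lam and variance n/lam^2.
print(sample_mean, n / lam)      # both should be close to 2.5
print(sample_var, n / lam ** 2)  # both should be close to 1.25
```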
Remark. The Exponential is the continuous analog of the Geometric. Similarly, the Gamma is the continuous analog of the Negative Binomial.