
The Gamma Distribution

If a random variable $X$ has the density

$$f(x) = \frac{x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)\,\beta^{\alpha}}, \qquad x > 0$$

for some constants $\alpha, \beta > 0$, then $X$ is said to have a gamma distribution.

Details

The function $\Gamma$ is chosen precisely so that $f$ integrates to one, i.e.

$$\Gamma(\alpha) = \int_0^\infty t^{\alpha-1} e^{-t}\,dt$$

It is not too hard to see that $\Gamma(n) = (n-1)!$ if $n \in \mathbb{N}$. Also, $\Gamma(\alpha+1) = \alpha\Gamma(\alpha)$ for all $\alpha > 0$.
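These identities are easy to check numerically. The sketch below is only an illustration (not part of the derivation) and uses Python's standard-library `math.gamma`:

```python
import math

# Gamma(n) = (n - 1)! for positive integers n
for n in range(1, 8):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

# Gamma(alpha + 1) = alpha * Gamma(alpha) for a few alpha > 0
for alpha in (0.5, 1.3, 2.7, 5.0):
    assert math.isclose(math.gamma(alpha + 1), alpha * math.gamma(alpha))

print("gamma function identities hold")
```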

The Mean, Variance and MGF of the Gamma Distribution

Suppose $X \sim G(\alpha, \beta)$, i.e. $X$ has density

$$f(x) = \frac{x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)\,\beta^{\alpha}}, \qquad x > 0$$

Then,

$$E[X] = \alpha\beta$$

$$M(t) = (1-\beta t)^{-\alpha}$$

$$\mathrm{Var}[X] = \alpha\beta^2$$

Details

The expected value of $X$ can be computed as follows:

$$\begin{aligned} E[X] &= \int_{-\infty}^{\infty} x f(x)\,dx \\ &= \int_{0}^{\infty} x\, \frac{x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)\beta^{\alpha}}\,dx \\ &= \frac{\Gamma(\alpha+1)\beta^{\alpha+1}}{\Gamma(\alpha)\beta^{\alpha}} \int_{0}^{\infty} \frac{x^{(\alpha+1)-1} e^{-x/\beta}}{\Gamma(\alpha+1)\beta^{\alpha+1}}\,dx \\ &= \frac{\alpha\Gamma(\alpha)\beta^{\alpha+1}}{\Gamma(\alpha)\beta^{\alpha}} \end{aligned}$$

where the last integral equals one because the integrand is the $G(\alpha+1, \beta)$ density, so $E[X] = \alpha\beta$.

Next, the moment-generating function is given by

$$\begin{aligned} E[e^{tX}] &= \int_{0}^{\infty} e^{tx}\, \frac{x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)\beta^{\alpha}}\,dx \\ &= \frac{1}{\Gamma(\alpha)\beta^{\alpha}} \int_{0}^{\infty} x^{\alpha-1} e^{tx - x/\beta}\,dx \\ &= \frac{\Gamma(\alpha)\phi^{\alpha}}{\Gamma(\alpha)\beta^{\alpha}} \int_{0}^{\infty} \frac{x^{\alpha-1} e^{-x/\phi}}{\Gamma(\alpha)\phi^{\alpha}}\,dx \end{aligned}$$

if we choose $\phi$ so that $-\frac{x}{\phi} = tx - \frac{x}{\beta}$, i.e. $-\frac{1}{\phi} = t - \frac{1}{\beta}$, i.e. $\phi = -\frac{1}{t - 1/\beta} = \frac{\beta}{1-\beta t}$ (valid for $t < 1/\beta$, so that $\phi > 0$). The last integrand is the $G(\alpha, \phi)$ density and integrates to one, so we have

$$M(t) = \left(\frac{\phi}{\beta}\right)^{\alpha} = \left(\frac{\beta/(1-\beta t)}{\beta}\right)^{\alpha} = \frac{1}{(1-\beta t)^{\alpha}}$$

or $M(t) = (1-\beta t)^{-\alpha}$. It follows that

$$M'(t) = (-\alpha)(1-\beta t)^{-\alpha-1}(-\beta) = \alpha\beta(1-\beta t)^{-\alpha-1}$$

so $M'(0) = \alpha\beta$. Further,

$$\begin{aligned} M''(t) &= \alpha\beta(-\alpha-1)(1-\beta t)^{-\alpha-2}(-\beta) \\ &= \alpha\beta^2(\alpha+1)(1-\beta t)^{-\alpha-2} \end{aligned}$$

so

$$E[X^2] = M''(0) = \alpha\beta^2(\alpha+1) = \alpha^2\beta^2 + \alpha\beta^2$$

Hence,

$$\begin{aligned} \mathrm{Var}[X] &= E[X^2] - E[X]^2 \\ &= \alpha^2\beta^2 + \alpha\beta^2 - (\alpha\beta)^2 \\ &= \alpha\beta^2 \end{aligned}$$
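These formulas can be sanity-checked by simulation. The sketch below is a minimal illustration using Python's `random.gammavariate`, which uses the same (shape $\alpha$, scale $\beta$) parametrization as the density above:

```python
import math
import random

random.seed(1)
alpha, beta, t = 2.0, 3.0, 0.1        # any t < 1/beta works for the m.g.f.
n = 200_000

# random.gammavariate(alpha, beta) matches the parametrization used here
xs = [random.gammavariate(alpha, beta) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
mgf = sum(math.exp(t * x) for x in xs) / n

print(mean, alpha * beta)                 # sample mean vs alpha*beta = 6
print(var, alpha * beta ** 2)             # sample variance vs alpha*beta^2 = 18
print(mgf, (1 - beta * t) ** (-alpha))    # empirical m.g.f. vs (1 - beta*t)^(-alpha)
```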

Special Cases of the Gamma Distribution: The Exponential and Chi-squared Distributions

Consider the gamma density

$$f(x) = \frac{x^{\alpha-1} e^{-x/\beta}}{\Gamma(\alpha)\,\beta^{\alpha}}, \qquad x > 0$$

for parameters $\alpha, \beta > 0$.

If $\alpha = 1$, then

$$f(x) = \frac{1}{\beta}\, e^{-x/\beta}, \qquad x > 0$$

and this is the density of the exponential distribution.
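This reduction can be verified numerically as well; the sketch below (an illustration only, with arbitrary test points) evaluates the gamma density at $\alpha = 1$ and compares it to $\frac{1}{\beta}e^{-x/\beta}$:

```python
import math

def gamma_pdf(x, alpha, beta):
    # The gamma density f(x) as defined above
    return x ** (alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta ** alpha)

beta = 2.5
for x in (0.1, 1.0, 3.0, 7.5):
    # with alpha = 1 the density reduces to (1/beta) * exp(-x/beta)
    assert math.isclose(gamma_pdf(x, 1.0, beta), math.exp(-x / beta) / beta)
print("alpha = 1 reduces to the exponential density")
```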

Consider next the case $\alpha = \frac{v}{2}$ and $\beta = 2$, where $v$ is an integer, so the density becomes

$$f(x) = \frac{x^{\frac{v}{2}-1} e^{-x/2}}{\Gamma\!\left(\frac{v}{2}\right) 2^{\frac{v}{2}}}, \qquad x > 0$$

This is the density of a chi-squared random variable with $v$ degrees of freedom.

Details

Consider $\alpha = \frac{v}{2}$ and $\beta = 2$, where $v$ is an integer, so the density becomes

$$f(x) = \frac{x^{\frac{v}{2}-1} e^{-x/2}}{\Gamma\!\left(\frac{v}{2}\right) 2^{\frac{v}{2}}}, \qquad x > 0$$

This is the density of a chi-squared random variable with $v$ degrees of freedom.

This is easy to see for $v = 1$ by starting with $Z \sim N(0,1)$ and defining $W = Z^2$, so that the c.d.f. is:

$$\begin{aligned} H(w) &= P[W \leq w] = P[Z^2 \leq w] \\ &= P[-\sqrt{w} \leq Z \leq \sqrt{w}] \\ &= 1 - P[|Z| > \sqrt{w}] \\ &= 1 - 2P[Z < -\sqrt{w}] \\ &= 1 - 2\int_{-\infty}^{-\sqrt{w}} \frac{e^{-t^2/2}}{\sqrt{2\pi}}\,dt = 1 - 2\Phi(-\sqrt{w}) \end{aligned}$$

where $\Phi$ denotes the standard normal c.d.f.

The p.d.f. of $W$ is therefore

$$h(w) = H'(w) = -2\,\Phi'(-\sqrt{w}) \cdot \left(-\frac{1}{2}\, w^{\frac{1}{2}-1}\right) = \Phi'(-\sqrt{w})\, w^{\frac{1}{2}-1}$$

but

$$\Phi(x) = \int_{-\infty}^{x} \frac{e^{-t^2/2}}{\sqrt{2\pi}}\,dt \qquad\Rightarrow\qquad \Phi'(x) = \frac{d}{dx}\int_{-\infty}^{x} \frac{e^{-t^2/2}}{\sqrt{2\pi}}\,dt = \frac{e^{-x^2/2}}{\sqrt{2\pi}}$$

So

$$h(w) = \frac{e^{-w/2}}{\sqrt{2\pi}} \cdot w^{\frac{1}{2}-1} = \frac{w^{\frac{1}{2}-1} e^{-w/2}}{\sqrt{2\pi}}, \qquad w > 0$$

We see that we must have $h = f$ with $v = 1$. Comparing the normalizing constants, we have also shown $\Gamma\!\left(\frac{1}{2}\right) 2^{\frac{1}{2}} = \sqrt{2\pi}$, i.e. $\Gamma\!\left(\frac{1}{2}\right) = \sqrt{\pi}$.

Hence we have shown that the $\chi^2$ distribution on 1 degree of freedom is $G\!\left(\alpha = \frac{v}{2}, \beta = 2\right)$ with $v = 1$.
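Both conclusions can be checked numerically; the sketch below (an illustration only, using nothing beyond the standard library) verifies $\Gamma(\frac{1}{2}) = \sqrt{\pi}$ and compares the moments of $W = Z^2$ with those of $G(\frac{1}{2}, 2)$:

```python
import math
import random

# Comparing normalizing constants gave Gamma(1/2) = sqrt(pi)
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))

# W = Z^2 with Z ~ N(0,1) should match G(alpha=1/2, beta=2):
# E[W] = alpha*beta = 1 and Var[W] = alpha*beta^2 = 2
random.seed(2)
n = 200_000
ws = [random.gauss(0, 1) ** 2 for _ in range(n)]
mean = sum(ws) / n
var = sum((w - mean) ** 2 for w in ws) / n
print(mean, var)   # should be close to the theoretical values 1 and 2
```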

The Sum of Gamma Variables

In the general case, if $X_1, \ldots, X_n \sim G(\alpha, \beta)$ are independent and identically distributed, then $X_1 + X_2 + \cdots + X_n \sim G(n\alpha, \beta)$. In particular, if $X_1, X_2, \ldots, X_v \sim \chi^2_1$ are independent and identically distributed, then $\sum_{i=1}^v X_i \sim \chi^2_v$.

Details

If $X$ and $Y$ are independent and identically distributed $G(\alpha, \beta)$, then

$$M_X(t) = M_Y(t) = \frac{1}{(1-\beta t)^{\alpha}}$$

and

$$M_{X+Y}(t) = M_X(t)\, M_Y(t) = \frac{1}{(1-\beta t)^{2\alpha}}$$

Since this is the m.g.f. of $G(2\alpha, \beta)$, and the m.g.f. determines the distribution,

$$X + Y \sim G(2\alpha, \beta)$$

In the general case, if $X_1, \ldots, X_n \sim G(\alpha, \beta)$ are independent and identically distributed, then $X_1 + X_2 + \cdots + X_n \sim G(n\alpha, \beta)$. In particular, if $X_1, X_2, \ldots, X_v \sim \chi^2_1$ are independent and identically distributed, then $\displaystyle\sum_{i=1}^v X_i \sim \chi^2_v$.
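The sum property can also be checked by simulation. The sketch below (illustrative only, with arbitrary parameter choices) sums $k$ i.i.d. $G(\alpha, \beta)$ variables and compares the sample moments to those of $G(k\alpha, \beta)$:

```python
import random

random.seed(3)
alpha, beta, k, n = 1.5, 2.0, 4, 100_000   # sum k i.i.d. G(alpha, beta) variables

sums = [sum(random.gammavariate(alpha, beta) for _ in range(k)) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n

# G(k*alpha, beta) has mean k*alpha*beta = 12 and variance k*alpha*beta^2 = 24
print(mean, var)
```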