Independence, Expectations and the Moment Generating Function

Independent Random Variables

Recall that two events, $A$ and $B$, are independent if

$$P[A \cap B] = P[A]\, P[B]$$

Since the conditional probability of $A$ given $B$ is defined by

$$P[A|B] = \frac{P[A \cap B]}{P[B]}$$

we see that $A$ and $B$ are independent if and only if

$$P[A|B] = P[A] \quad (\text{when } P[B] > 0)$$

Similarly, two continuous random variables, $X$ and $Y$, are independent if

$$P[X \in A, Y \in B] = P[X \in A]\, P[Y \in B]$$

Details

Similarly, two continuous random variables, $X$ and $Y$, are independent if

$$P[X \in A, Y \in B] = P[X \in A]\, P[Y \in B]$$

Now suppose $X$ has p.d.f. $f_X$ and $Y$ has p.d.f. $f_Y$. Then,

$$P[X \in A] = \int_A f_X(x)\, dx,$$

$$P[Y \in B] = \int_B f_Y(y)\, dy$$

So $X$ and $Y$ are independent if:

$$\begin{aligned} P[X \in A, Y \in B] &= \int_A f_X(x)\, dx \int_B f_Y(y)\, dy \\ &= \int_A f_X(x) \left( \int_B f_Y(y)\, dy \right) dx \\ &= \int_A \int_B f_X(x) f_Y(y)\, dy\, dx \end{aligned}$$

But if $f$ is the joint density of $X$ and $Y$, then we know that

$$P[X \in A, Y \in B] = \int_A \int_B f(x,y)\, dy\, dx$$

Hence $X$ and $Y$ are independent if and only if we can write the joint density in the form

$$f(x,y) = f_X(x)\, f_Y(y)$$
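
As a quick sanity check of this product rule, one can estimate both sides by simulation. The sketch below assumes $X$ and $Y$ are independent standard normal variables and uses the arbitrary intervals $A = [0, 1]$ and $B = [-1, 0.5]$; any independent pair and any intervals would do.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)              # X ~ N(0, 1)
y = rng.standard_normal(n)              # Y ~ N(0, 1), drawn independently of X

in_A = (x >= 0.0) & (x <= 1.0)          # event {X in A}, A = [0, 1]
in_B = (y >= -1.0) & (y <= 0.5)         # event {Y in B}, B = [-1, 0.5]

p_joint = np.mean(in_A & in_B)          # estimate of P[X in A, Y in B]
p_prod = np.mean(in_A) * np.mean(in_B)  # estimate of P[X in A] * P[Y in B]

print(p_joint, p_prod)                  # the two estimates agree up to Monte Carlo error
```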

Independence and Expected Values

If $X$ and $Y$ are independent random variables then $E[XY] = E[X]\,E[Y]$.

Further, if $X$ and $Y$ are independent random variables, then $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$ for any functions $g$ and $h$ for which these expectations exist.

Details

If $X$ and $Y$ are random variables with joint probability density function $f(x,y)$, then for any $h : \mathbb{R}^2 \to \mathbb{R}$ we have

$$E[h(X,Y)] = \int \int h(x,y)\, f(x,y)\, dx\, dy$$

for those $h$ such that the integral on the right exists.
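
For a concrete feel for this formula, here is a small numerical sketch that evaluates the double integral with scipy.integrate.dblquad. The density and the function $h$ are arbitrary illustrative choices: $X, Y$ independent $U(0,2)$, so $f(x,y) = \frac{1}{4}$ on $[0,2]^2$, and $h(x,y) = x^2 + y^2$, for which $E[h(X,Y)] = E[X^2] + E[Y^2] = \frac{8}{3}$.

```python
from scipy.integrate import dblquad

# Illustrative choices: X, Y independent U(0, 2), so f(x, y) = 1/4 on [0, 2]^2,
# and h(x, y) = x^2 + y^2 with E[h(X, Y)] = 4/3 + 4/3 = 8/3.
def f(x, y):
    return 0.25

def h(x, y):
    return x**2 + y**2

# dblquad integrates func(y, x): x is the outer variable, y the inner one.
value, err = dblquad(lambda y, x: h(x, y) * f(x, y),
                     0.0, 2.0,                      # x from 0 to 2
                     lambda x: 0.0, lambda x: 2.0)  # y from 0 to 2

print(value)  # approximately 2.6667 = 8/3
```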

Suppose $X$ and $Y$ are independent continuous random variables. Then

$$f(x,y) = f_X(x)\, f_Y(y)$$

Thus,

$$\begin{aligned} E[XY] &= \int \int xy\, f(x,y)\, dx\, dy \\ &= \int \int xy\, f_X(x) f_Y(y)\, dx\, dy \\ &= \int x f_X(x)\, dx \int y f_Y(y)\, dy \\ &= E[X]\, E[Y] \end{aligned}$$

Note

Note that if $X$ and $Y$ are independent then $E[h(X)\, g(Y)] = E[h(X)]\, E[g(Y)]$ whenever the functions $h$ and $g$ have expected values.

Examples

Example

Suppose $X, Y \sim U(0,2)$ are independent and identically distributed. Then

$$f_X(x) = \begin{cases} \frac{1}{2} & \text{if } 0 \leq x \leq 2 \\ 0 & \text{otherwise} \end{cases}$$

and similarly for fYf_Y.

Next, note that,

$$f(x,y) = f_X(x)\, f_Y(y) = \begin{cases} \frac{1}{4} & \text{if } 0 \leq x, y \leq 2 \\ 0 & \text{otherwise} \end{cases}$$

Also note that $f(x,y) \geq 0$ for all $(x,y) \in \mathbb{R}^2$ and

$$\int \int f(x,y)\, dx\, dy = \int_0^2 \int_0^2 \frac{1}{4}\, dx\, dy = \frac{1}{4} \cdot 4 = 1$$

It follows that

$$\begin{aligned} E[XY] &= \int \int f(x,y)\, xy\, dx\, dy \\ &= \int_0^2 \int_0^2 \frac{1}{4} xy\, dx\, dy \\ &= \frac{1}{4} \int_0^2 y \left( \int_0^2 x\, dx \right) dy \\ &= \frac{1}{4} \int_0^2 y \left. \frac{x^2}{2} \right|_0^2 dy \\ &= \frac{1}{4} \int_0^2 y \left( \frac{2^2}{2} - \frac{0^2}{2} \right) dy \\ &= \frac{1}{4} \int_0^2 2y\, dy \\ &= \frac{1}{2} \int_0^2 y\, dy \\ &= \frac{1}{2} \left. \frac{y^2}{2} \right|_0^2 \\ &= \frac{1}{2} \left( \frac{2^2}{2} - \frac{0^2}{2} \right) \\ &= \frac{1}{2} \cdot 2 \\ &= 1 \end{aligned}$$

On the other hand,

$$E[X] = E[Y] = \int_0^2 x \cdot \frac{1}{2}\, dx = 1$$

so

$$E[XY] = E[X]\, E[Y]$$
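
The same conclusion can be checked by simulation; the sketch below simply draws a large independent sample of $X$ and $Y$ from $U(0,2)$ and compares the sample versions of $E[XY]$ and $E[X]\,E[Y]$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0.0, 2.0, size=n)   # X ~ U(0, 2)
y = rng.uniform(0.0, 2.0, size=n)   # Y ~ U(0, 2), drawn independently of X

print(np.mean(x * y))               # sample E[XY], close to 1
print(np.mean(x) * np.mean(y))      # sample E[X] * E[Y], also close to 1
```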

Independence and the Covariance

If $X$ and $Y$ are independent then $\text{Cov}(X,Y) = 0$.

In fact, if $X$ and $Y$ are independent then $\text{Cov}(h(X), g(Y)) = 0$ for any functions $g$ and $h$ for which the expected values exist.
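
A quick simulation sketch of this stronger statement: with $X$ and $Y$ independent, the sample covariance of $h(X)$ and $g(Y)$ should be close to zero. The distributions and the functions $h(x) = x^2$, $g(y) = e^y$ below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.standard_normal(n)          # X ~ N(0, 1)
y = rng.uniform(0.0, 2.0, size=n)   # Y ~ U(0, 2), independent of X

hx = x**2                           # h(X)
gy = np.exp(y)                      # g(Y)

# Sample version of Cov(h(X), g(Y)) = E[h(X)g(Y)] - E[h(X)]E[g(Y)]
cov = np.mean(hx * gy) - np.mean(hx) * np.mean(gy)
print(cov)                          # close to 0
```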

The Moment Generating Function

If $X$ is a random variable, we define its moment generating function as $M(t) := E[e^{tX}]$, for those values of $t$ at which this expected value exists.

Examples

Example

If $X \sim \text{Bin}(n,p)$ then $M(t) = \displaystyle\sum_{x=0}^{n} e^{tx} p(x) = \displaystyle\sum_{x=0}^{n} e^{tx} \binom{n}{x} p^{x} (1-p)^{n-x} = \left(1 - p + p e^{t}\right)^{n}$, where the last equality follows from the binomial theorem.
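
The closed form is easy to verify numerically; the sketch below compares the sum over the p.m.f. with $(1 - p + pe^t)^n$ for arbitrary illustrative values of $n$, $p$ and $t$, using scipy.stats.binom for the p.m.f.

```python
import numpy as np
from scipy.stats import binom

n, p, t = 10, 0.3, 0.5              # illustrative values
x = np.arange(n + 1)                # support of Bin(n, p)

mgf_sum = np.sum(np.exp(t * x) * binom.pmf(x, n, p))   # definition of M(t)
mgf_closed = (1 - p + p * np.exp(t))**n                # closed form

print(mgf_sum, mgf_closed)          # the two values agree
```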

Moments and the Moment Generating Function

If $M_X(t)$ is the moment generating function (m.g.f.) of $X$, then $M_X^{(n)}(0) = E[X^n]$.

Details

Observe that $M(t) = E[e^{tX}] = E\left[1 + tX + \displaystyle\frac{(tX)^2}{2!} + \displaystyle\frac{(tX)^3}{3!} + \dots\right]$ since $e^a = 1 + a + \displaystyle\frac{a^2}{2!} + \displaystyle\frac{a^3}{3!} + \dots$. If the random variable $e^{|tX|}$ has a finite expected value then we can switch the sum and the expected value to obtain:

$$M(t) = E\left[\sum_{n=0}^{\infty} \frac{(tX)^n}{n!}\right] = \sum_{n=0}^{\infty} \frac{E[(tX)^n]}{n!} = \sum_{n=0}^{\infty} t^n \frac{E[X^n]}{n!}$$

This implies that the $n^{\text{th}}$ derivative of $M(t)$, evaluated at $t = 0$, is exactly $E[X^n]$.
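
As an illustration, the moments of an exponential distribution can be recovered by differentiating its m.g.f. symbolically. The sketch below assumes $X \sim \text{Exp}(\lambda)$ with rate $\lambda$, whose m.g.f. is $M(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$, so the first two moments should come out as $\frac{1}{\lambda}$ and $\frac{2}{\lambda^2}$.

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                  # m.g.f. of X ~ Exp(lam), valid for t < lam

EX = sp.diff(M, t, 1).subs(t, 0)     # M'(0)  -> 1/lam, the first moment
EX2 = sp.diff(M, t, 2).subs(t, 0)    # M''(0) -> 2/lam**2, the second moment

print(sp.simplify(EX), sp.simplify(EX2))
```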

The Moment Generating Function of a Sum of Random Variables

$M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$ if $X$ and $Y$ are independent.

Details

Let $X$ and $Y$ be independent random variables. Then

$$M_{X+Y}(t) = E\left[e^{Xt + Yt}\right] = E\left[e^{Xt} e^{Yt}\right] = E\left[e^{Xt}\right] E\left[e^{Yt}\right] = M_X(t)\, M_Y(t)$$

where the step $E[e^{Xt} e^{Yt}] = E[e^{Xt}]\, E[e^{Yt}]$ uses the independence of $X$ and $Y$.
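
This identity can also be checked empirically: for a large independent sample, the empirical m.g.f. of $X+Y$ should be close to the product of the empirical m.g.f.s of $X$ and $Y$. The sketch below uses $X, Y \sim U(0,1)$ and $t = 1$ as arbitrary illustrative choices, where both sides should be near $(e-1)^2 \approx 2.95$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
t = 1.0
x = rng.uniform(0.0, 1.0, size=n)   # X ~ U(0, 1)
y = rng.uniform(0.0, 1.0, size=n)   # Y ~ U(0, 1), independent of X

lhs = np.mean(np.exp(t * (x + y)))                      # empirical M_{X+Y}(t)
rhs = np.mean(np.exp(t * x)) * np.mean(np.exp(t * y))   # empirical M_X(t) * M_Y(t)

print(lhs, rhs)   # both close to (e - 1)^2 ≈ 2.95
```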

Uniqueness of the Moment Generating Function

Moment generating functions (m.g.f.) uniquely determine the probability distribution of a random variable. Thus, if two random variables have the same moment generating function, then they must also have the same distribution.