Example: For a cylinder selected at random from a production line, let X be the cylinder’s height and Y the cylinder’s radius. Suppose X and Y have the joint PDF below. Are X and Y independent?

\[ f(x, y) = \begin{cases}{3\over 8}{x\over y^2}, & 1 \le x \le 3,~{1\over 2}\le y \le {3\over 4}\\ 0, & \text{elsewhere}\end{cases} \]

Note: Continuous random variables X and Y are independent if and only if we can write the joint PDF as

\[ f(x, y) = g(x)\,h(y)~~~\text{for some functions } g \text{ and } h. \]

In addition, the support must be a rectangle: the set of possible x values cannot depend on y, and vice versa.

Example:

\[ f(x,y)={3\over 8}{x\over y^2} \]

The support \(1 \le x \le 3,~{1\over 2}\le y \le {3\over 4}\) is a rectangle, and f factors as \(g(x)h(y)\) with \(g(x)=x\) and \(h(y)={1\over y^2}\) (the constant \(c={3\over 8}\) can be absorbed into either factor), so X and Y are independent.
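A quick symbolic check of this factorization (a sketch, assuming SymPy is available; the variable names are mine):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.Rational(3, 8) * x / y**2                      # joint PDF on 1<=x<=3, 1/2<=y<=3/4

# The joint PDF integrates to 1 over its rectangular support
print(sp.integrate(f, (x, 1, 3), (y, sp.Rational(1, 2), sp.Rational(3, 4))))   # 1

# Marginals, obtained by integrating out the other variable
f_X = sp.integrate(f, (y, sp.Rational(1, 2), sp.Rational(3, 4)))               # x/4
f_Y = sp.integrate(f, (x, 1, 3))                                               # 3/(2*y**2)

# The joint PDF equals the product of the marginals, so X and Y are independent
print(sp.simplify(f - f_X * f_Y))                                              # 0
```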

Theorem: If X and Y are jointly discrete, then X and Y are independent if and only if

  1. \(p_{Y |X=x}(y) = p_Y (y)\); the conditional PMF of Y does not depend on the value X = x

  2. \(p_{X|Y =y}(x) = p_X(x)\); the conditional PMF of X does not depend on the value Y = y

Note: The preceding theorem holds for jointly continuous random variables with PDFs replacing PMFs.
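For a concrete illustration (a small made-up table, not from the notes), here is a numerical check that for an independent joint PMF every conditional PMF equals the corresponding marginal:

```python
import numpy as np

# Hypothetical marginals; the joint PMF of independent X and Y is their outer product
p_X = np.array([0.3, 0.7])              # X takes two values
p_Y = np.array([0.2, 0.5, 0.3])         # Y takes three values
p = np.outer(p_X, p_Y)                  # joint PMF, rows indexed by x, columns by y

for i in range(len(p_X)):
    cond = p[i, :] / p[i, :].sum()      # p_{Y|X=x}(y) = p(x, y) / p_X(x)
    print(np.allclose(cond, p_Y))       # True for every x: conditional equals marginal
```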

Theorem: Let X and Y be independent. Then

  1. \(E(Y | X = x) = E(Y )\) does not depend on the value of x.

  2. \(g(X)\) and \(h(Y )\) are independent.

  3. \(E[g(X)h(Y )] = E[g(X)]\,E[h(Y )]\). In particular, \(E(XY)=E(X)E(Y)\), since \(\int\!\!\int xy\,f(x,y)\,dx\,dy=\int x f_X(x)\,dx\int y f_Y(y)\,dy\) when \(f(x,y)=f_X(x)f_Y(y)\).
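A Monte Carlo sanity check of property 3 (a sketch with arbitrary choices of distributions and of g and h, none of which come from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.exponential(scale=2.0, size=n)      # X and Y sampled independently
Y = rng.uniform(0.0, 1.0, size=n)

gX = np.sqrt(X)                             # arbitrary g(X)
hY = Y**3                                   # arbitrary h(Y)

# E[g(X)h(Y)] and E[g(X)]E[h(Y)] agree up to Monte Carlo error
print(np.mean(gX * hY), np.mean(gX) * np.mean(hY))
```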

Example: Find the expected volume of a randomly selected cylinder from the production line.

\(\text{volume}=\pi r^2h=\pi Y^2X,\)

where X denotes the height of the cylinder and Y denotes the radius. Integrating the joint PDF over the other variable gives the marginals \(f_X(x)={x\over 4}\) and \(f_Y(y)={3\over 2y^2}\), which are used below.

\[ \newcommand{\indep}{\perp \!\!\! \perp} X\indep Y\\ E(X)=\int^3_1 x\cdot{x\over 4}\,dx={26\over 12}={13\over 6}\\ E(Y^2)=\int^{3\over 4}_{1\over 2}y^2\cdot{3\over 2y^2}\,dy={3\over 8}\\ E(\pi Y^2X)=\pi\,E(Y^2)\,E(X)=\pi\cdot{3\over 8}\cdot{13\over 6}={13\over 16}\pi \]
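The answer can be checked by simulation (a sketch; X and Y are sampled from their marginals by inverse-CDF transforms derived here from \(f_X\) and \(f_Y\), not given in the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
u, v = rng.uniform(size=n), rng.uniform(size=n)

# Inverse CDFs: F_X(x) = (x^2 - 1)/8 on [1, 3], F_Y(y) = 3 - 3/(2y) on [1/2, 3/4]
X = np.sqrt(1 + 8 * u)
Y = 3 / (2 * (3 - v))

volume = np.pi * Y**2 * X                  # volume of each simulated cylinder
print(volume.mean(), 13 * np.pi / 16)      # both approximately 2.553
```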

Note: Independence extends to 3 or more random variables.

  1. Jointly discrete random variables \(X_1, \dots , X_n\) are independent if and only if

\[ p(x_1, \dots , x_n) = p_{X_1}(x_1)\cdots p_{X_n}(x_n) \]

  2. Jointly continuous random variables \(X_1, \dots , X_n\) are independent if and only if

\[ f(x_1, \dots , x_n) = f_{X_1}(x_1)\cdots f_{X_n}(x_n) \]

If \(X_1, \dots , X_n\) are independent and have the same distribution, we say they are independent and identically distributed, or iid.
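As a small illustration of the discrete case (made-up Bernoulli marginals, not from the notes), the joint PMF of independent variables is just the product of the marginal PMFs:

```python
from itertools import product

# Hypothetical marginal PMFs for three independent Bernoulli random variables
marginals = [{0: 0.8, 1: 0.2}, {0: 0.5, 1: 0.5}, {0: 0.3, 1: 0.7}]

# Build the joint PMF as a product of marginals and confirm it sums to 1
joint = {xs: marginals[0][xs[0]] * marginals[1][xs[1]] * marginals[2][xs[2]]
         for xs in product([0, 1], repeat=3)}
print(sum(joint.values()))     # 1.0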

Expected Value of Functions of Random Variables

We can use joint PMFs and PDFs to obtain expected values of functions of random variables.

  1. If X and Y are discrete

\[ E[h(X, Y )] = \sum_{x\in S_X}\sum_{y\in S_Y}h(x, y)p(x, y) \]

  2. If X and Y are continuous

\[ E[h(X, Y )] =\int^{\infty}_{-\infty}\int^{\infty}_{-\infty}h(x,y)f(x,y)dx~dy \]

Example: Suppose X and Y have the joint PMF below. Find \(E(XY )\).

| x \ y | -1  | 3   | 8   |
|-------|-----|-----|-----|
| -1    | 0.1 | 0   | 0.4 |
| 2     | 0.2 | 0.3 | 0   |

Note that the support of X depends on Y: when Y = 3, X cannot be −1, since \(P(X=-1\mid Y=3)=0\) while \(P(X=-1)>0\), so X and Y are not independent.

\[ E(XY)=\sum_{x\in S_X}\sum_{y\in S_Y}xy\,p(x,y)=(-1)(-1)(0.1)+\cdots+(2)(8)(0)=-1.7 \]
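The same double sum written in code (a short sketch using the PMF table above):

```python
import numpy as np

xs = [-1, 2]                            # support of X (rows of the table)
ys = [-1, 3, 8]                         # support of Y (columns of the table)
p = np.array([[0.1, 0.0, 0.4],
              [0.2, 0.3, 0.0]])         # joint PMF p(x, y)

E_XY = sum(x * y * p[i, j] for i, x in enumerate(xs) for j, y in enumerate(ys))
print(E_XY)                             # -1.7 (up to floating-point rounding)
```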

Example: Suppose X and Y have the joint PDF below. Find E(XY ).

\[ f(x, y) = \begin{cases}{1\over 2}, & 0 \leq x \leq 2,~0 \leq y \leq 1 \\ 0, & \text{elsewhere}\end{cases} \]

\[ \newcommand{\indep}{\perp \!\!\! \perp} X\indep Y\\ E(XY)=\int^1_0\int^2_0 xy\cdot{1\over 2}\,dx~dy={1\over 2} \]
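A numerical check of this integral (a sketch using SciPy's dblquad, which takes the integrand as func(y, x) with the inner variable first):

```python
from scipy.integrate import dblquad

# E(XY) = ∫_0^2 ∫_0^1 x*y*(1/2) dy dx over the rectangle [0,2] x [0,1]
val, err = dblquad(lambda y, x: x * y * 0.5, 0, 2, lambda x: 0, lambda x: 1)
print(val)   # 0.5
```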