Recall: \(P(B | A)\) is the probability that event B occurs given that event A has occurred.
\[ P(B|A)={P(B\cap A)\over P(A)} \]
A and B are independent if \(P(B\cap A)=P(B)\times P(A)\).
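For example, with hypothetical values \(P(A)=0.5\) and \(P(B\cap A)=0.2\),
\[ P(B|A)={0.2\over 0.5}=0.4, \]
and A and B are independent in this case only if \(P(B)=0.4\), so that \(P(B)\times P(A)=0.4\times 0.5=0.2=P(B\cap A)\).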
We can extend the concept of conditional probability to PMFs and PDFs.
For jointly discrete random variables X and Y, the conditional PMF of Y given X = x (i.e., given a fixed value of X) is
\[ p_{Y|X=x}(y)=P(Y=y|X=x)={p(x,y)\over p_X(x)} \]
The conditional PMF is a proper PMF:
\(0\leq p_{Y|X=x}(y)\leq 1\) for every y
\(\sum_y p_{Y|X=x}(y)=1\)
We can obtain conditional means and variances.
Example: CD example with joint PMF
| x\ y | 120 | 121 | \(p_X(x)\) |
|---|---|---|---|
| 129 | 0.12 | 0.08 | 0.2 |
| 130 | 0.42 | 0.28 | 0.7 |
| 131 | 0.06 | 0.04 | 0.1 |
First find the marginal distribution of X; the conditional PMF of Y given X = 130 is then:
| y | 120 | 121 |
|---|---|---|
| \(p_{Y|X=130}(y)\) | 0.42/0.7 = 0.6 | 0.28/0.7 = 0.4 |
\[ p_{Y|X=130}(120)={P(X=130,Y=120)\over P(X=130)}={0.42\over 0.7}=0.6 \]
These conditional probabilities form a valid PMF: none of them are negative and they sum to one.
\[ E(Y|X=130)=\sum y\times p_{Y|X=130}(y)=120\times 0.6+121\times 0.4=120.4 \]
\[ E(Y^2|X=130)=\sum_y y^2\, p_{Y|X=130}(y)=120^2\times 0.6+121^2\times 0.4=14496.4 \]
\[ Var(Y|X=130)=E(Y^2|X=130)-[E(Y|X=130)]^2=14496.4-120.4^2=0.24 \]
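As a quick numerical check (not part of the original notes), here is a minimal Python sketch that recomputes the conditional PMF, conditional mean, and conditional variance directly from the joint PMF table above.

```python
# Joint PMF from the CD example: keys are (x, y) pairs.
joint = {
    (129, 120): 0.12, (129, 121): 0.08,
    (130, 120): 0.42, (130, 121): 0.28,
    (131, 120): 0.06, (131, 121): 0.04,
}

# Marginal probability P(X = 130).
p_x_130 = sum(p for (x, y), p in joint.items() if x == 130)

# Conditional PMF of Y given X = 130.
cond = {y: p / p_x_130 for (x, y), p in joint.items() if x == 130}

# Conditional mean, second moment, and variance.
mean = sum(y * p for y, p in cond.items())
second_moment = sum(y ** 2 * p for y, p in cond.items())
var = second_moment - mean ** 2

print({y: round(p, 2) for y, p in cond.items()})  # {120: 0.6, 121: 0.4}
print(round(mean, 1))                             # 120.4
print(round(var, 2))                              # 0.24
```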
Note: We can compute the marginal PMF using the conditional PMF.
\[ p_Y(y)=\sum_x p_{Y|X=x}(y)\,p_X(x) \]
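In the CD example, for instance,
\[ p_Y(120)=p_{Y|X=129}(120)\,p_X(129)+p_{Y|X=130}(120)\,p_X(130)+p_{Y|X=131}(120)\,p_X(131)={0.12\over 0.2}(0.2)+{0.42\over 0.7}(0.7)+{0.06\over 0.1}(0.1)=0.6 \]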
For jointly continuous random variables X and Y, the conditional PDF of Y given X = x is
\[ f_{Y |X=x}(y) = {f(x, y)\over f_X(x)} \]
The conditional PDF is a proper PDF:
\(\int_{-\infty}^{\infty} f_{Y |X=x}(y) dy = 1\)
\(P(a < Y < b | X = x) = \int_{a}^{b} f_{Y |X=x}(y) dy\), which is the area under the conditional PDF curve, \(f_{Y|X=x}(y)\) from a to b.
We can obtain conditional means and variances.
Example: Let (X, Y) be jointly continuous random variables with joint PDF
\[ f(x,y)=\begin{cases}{12\over 7}(x^2+xy), & 0\leq x\leq1,\ 0\leq y\leq1\\ 0, & \text{elsewhere}\end{cases} \]
The marginal PDF of X is
\[ f_X(x)=\int_0^1 {12\over 7}(x^2+xy)\,dy={12\over 7}\left(x^2+{1\over 2}x\right),\quad 0\leq x\leq 1 \]
\[ f_{Y|X=0.5}(y)={f(0.5,y)\over f_X(0.5)}={{12\over 7}(0.5^2+0.5y)\over {12\over 7}\left(0.5^2+{1\over 2}(0.5)\right)}={0.25+0.5y\over 0.25+0.25}={1\over 2}+y,\quad 0\leq y\leq1 \]
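As a check that this is a proper PDF, it is nonnegative on \([0,1]\) and integrates to one:
\[ \int_0^1\left({1\over 2}+y\right)dy={1\over 2}+{1\over 2}=1 \]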
For example, \(P(0<Y<0.5\mid X=0.5)\) can be written as a difference of conditional CDF values,
\[ F_{Y|X=0.5}(0.5)-F_{Y|X=0.5}(0), \]
or computed directly from the conditional PDF:
\[ P(0<Y<0.5\mid X=0.5)=\int_0^{0.5}f_{Y|X=0.5}(y)\,dy=\int_0^{0.5}\left({1\over 2}+y\right)dy=0.375 \]
The conditional mean is
\[ E(Y|X=0.5)=\int^{\infty}_{-\infty}y\, f_{Y|X=0.5}(y)\,dy=\int ^1_0 y\left({1\over 2}+y\right)dy\approx 0.583 \]
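The two integrals above can also be checked numerically. Here is a minimal Python sketch (using `scipy.integrate.quad`; the script is an illustration, not part of the notes) that recomputes \(P(0<Y<0.5\mid X=0.5)\) and \(E(Y\mid X=0.5)\).

```python
from scipy.integrate import quad

# Conditional PDF of Y given X = 0.5, valid for 0 <= y <= 1.
def f_cond(y):
    return 0.5 + y

# P(0 < Y < 0.5 | X = 0.5): area under the conditional PDF from 0 to 0.5.
prob, _ = quad(f_cond, 0, 0.5)

# E(Y | X = 0.5): integrate y * f_cond(y) over the support [0, 1].
mean, _ = quad(lambda y: y * f_cond(y), 0, 1)

print(round(prob, 3))  # 0.375
print(round(mean, 3))  # 0.583
```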
Note: We can compute the marginal PDF using the conditional PDF:
\[ f_Y(y)=\int^{\infty}_{-\infty}f_{Y|X=x}(y)f_X(x)~dx \]
This is a weighted average of \(f_{Y|X=x}(y)\), with weights given by the marginal PDF \(f_X(x)\).
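For the joint PDF in the example above, since \(f_{Y|X=x}(y)\,f_X(x)=f(x,y)\), this gives
\[ f_Y(y)=\int_0^1 {12\over 7}(x^2+xy)\,dx={12\over 7}\left({1\over 3}+{1\over 2}y\right),\quad 0\leq y\leq 1 \]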
Recall:
The joint PMF for two jointly discrete random variables X and Y is \(p(x, y)\).
The joint PDF for two jointly continuous random variables X and Y is \(f(x, y)\).
Definition: Two random variables X and Y are independent if
\[ P(X\in A, Y\in B)=P(X\in A)P(Y\in B) \] for any sets A and B.
Result: X and Y are independent if and only if
\[ p(x, y) = p_X(x)\,p_Y(y)\quad\text{(discrete case)} \]
\[ f(x, y) = f_X(x)\,f_Y(y)\quad\text{(continuous case)} \]
for all x and y.
Example: Let X and Y be discrete random variables with the joint PMF below. Are X and Y independent?
| x\ y | 1 | 2 | 3 |
|---|---|---|---|
| 0 | 0.10 | 0.20 | 0.15 |
| 1 | 0.15 | 0.15 | 0.25 |
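One way to answer this is to check the product condition cell by cell. Here is a minimal Python sketch (an illustration, not part of the notes) that compares \(p(x,y)\) with \(p_X(x)p_Y(y)\) for every cell of the table above.

```python
# Joint PMF from the table: x in {0, 1}, y in {1, 2, 3}.
joint = {
    (0, 1): 0.10, (0, 2): 0.20, (0, 3): 0.15,
    (1, 1): 0.15, (1, 2): 0.15, (1, 3): 0.25,
}

# Marginal PMFs of X and Y.
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (1, 2, 3)}

# Independence requires p(x, y) = p_X(x) * p_Y(y) for every cell.
independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-9 for (x, y) in joint
)
print(independent)  # False: e.g. p(0, 1) = 0.10 while p_X(0) * p_Y(1) = 0.45 * 0.25 = 0.1125
```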