A university uses an admissions index to rank applicants. The admissions index is a numerical score calculated by multiplying the ACT score by 10, the high school GPA by 100, and then adding the two. Suppose that for this university the mean ACT score is 29 and the mean high school GPA is 3.72. What is the expected value of the admissions index?

Let \(X_1\) denote the ACT score and \(X_2\) denote the GPA.

\[ E(X_1)=29,E(X_2)=3.72\\ I=10X_1+100X_2\\ E(I)=10E(X_1)+100E(X_2)=10\times 29+100\times 3.72=662 \]
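The linearity computation above can be checked by simulation. The normal distributions and their standard deviations below are illustrative assumptions (only the means 29 and 3.72 come from the problem):

```python
# Check E(I) = 10*E(X1) + 100*E(X2) by simulation.
# The distributions and SDs are hypothetical; only the means are given.
import random

random.seed(0)
n = 100_000
act = [random.gauss(29, 3) for _ in range(n)]     # assumed SD = 3
gpa = [random.gauss(3.72, 0.3) for _ in range(n)]  # assumed SD = 0.3
index = [10 * a + 100 * g for a, g in zip(act, gpa)]

mean_index = sum(index) / n
print(mean_index)  # close to the exact value 10*29 + 100*3.72 = 662
```

Note that linearity of expectation needs no independence assumption, so the answer 662 holds regardless of how ACT and GPA are jointly distributed.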

Result 3 (continued from previous chapters):

Suppose \(X_1,\dots,X_n\) are iid Bern(p). Then \(E(\hat p)=p\):

\[ E(\hat p)=E\left({X_1+\dots+X_n\over n}\right)={1\over n}\sum_{i=1}^n E(X_i)={np\over n}=p, \] since each \(E(X_i)=\mu=p\).
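A quick empirical sketch of this unbiasedness (the values of p, n, and the number of replications are arbitrary illustrative choices):

```python
# Average many sample proportions p_hat from iid Bernoulli(p) samples;
# the average should land near p, illustrating E(p_hat) = p.
import random

random.seed(1)
p, n, reps = 0.3, 50, 20_000
p_hats = []
for _ in range(reps):
    sample = [1 if random.random() < p else 0 for _ in range(n)]
    p_hats.append(sum(sample) / n)

avg = sum(p_hats) / reps
print(avg)  # near p = 0.3
```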

Covariance

Question: Is \(Var(X+Y)=Var(X)+Var(Y)\) ?

Not true in general; it holds only when X and Y are uncorrelated (in particular, when they are independent).

When random variables X and Y are dependent, computing Var(X + Y ) involves the covariance.

\[ Cov(X,Y)=\sigma_{X,Y}=E[(X-\mu_X)(Y-\mu_Y)]=E(XY)-\mu_X\mu_Y\\ Var(X)=E[(X-\mu_X)(X-\mu_X)]\\ Cov(X,X)=Var(X) \]

Positive covariance: greater values of X mainly correspond to greater values of Y.

Negative covariance: greater values of X mainly correspond to smaller values of Y.

Properties:

  1. \(Cov(X, Y ) = Cov(Y, X)\)

  2. \(Cov(X, X) = Var(X)\)

  3. If X and Y are independent, then \(Cov(X, Y ) = 0\).

\[ Cov(X,Y)=E(XY)-\mu_X\mu_Y=E(X)E(Y)-\mu_X\mu_Y=0 \]

However, \(Cov(X,Y)=0\) does not imply that X and Y are independent. For example:

\[ P(X=-1)=P(X=1)={1\over2},\quad Y=X^2 \]

Here \(E(X)=0\) and \(E(XY)=E(X^3)=0\), so \(Cov(X,Y)=0\), yet Y is completely determined by X.
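The counterexample can be verified directly from the PMF:

```python
# X is -1 or +1 with probability 1/2 each, and Y = X^2.
# Y is completely determined by X, yet Cov(X, Y) = 0.
support = [(-1, 0.5), (1, 0.5)]  # (value of X, probability)

ex = sum(x * p for x, p in support)          # E(X) = 0
ey = sum(x**2 * p for x, p in support)       # E(Y) = E(X^2) = 1
exy = sum(x * x**2 * p for x, p in support)  # E(XY) = E(X^3) = 0
cov = exy - ex * ey
print(cov)  # 0.0
```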

  4. \(Cov(aX + b, cY + d) = ac\,Cov(X, Y)\) for any real numbers a, b, c, and d.

\[ Cov(aX+b,cY+d)=E[(aX+b-aE(X)-b)(cY+d-cE(Y)-d)]\\ =E[a(X-E(X))\cdot c(Y-E(Y))]=ac\,Cov(X,Y) \]

Example: Suppose X and Y have the joint PMF below. Find Cov(X, Y ).

| x\y | -1  | 3   | 8   |
|-----|-----|-----|-----|
| -1  | 0.1 | 0   | 0.4 |
| 2   | 0.2 | 0.3 | 0   |

\[ E(X)=-1\times 0.5+2\times 0.5=0.5\\ E(Y)=-1\times 0.3+3\times 0.3+8\times0.4=3.8\\ E(XY)=(-1)(-1)(0.1)+(-1)(8)(0.4)+(2)(-1)(0.2)+(2)(3)(0.3)=-1.7\\ Cov(X,Y)=E(XY)-E(X)E(Y)=-1.7-0.5\times 3.8=-3.6 \]
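The same computation can be carried out mechanically from the joint PMF:

```python
# Covariance from a joint PMF: sum x*p, y*p, and x*y*p over the table,
# then apply Cov(X, Y) = E(XY) - E(X)E(Y).
pmf = {(-1, -1): 0.1, (-1, 3): 0.0, (-1, 8): 0.4,
       (2, -1): 0.2, (2, 3): 0.3, (2, 8): 0.0}

ex = sum(x * p for (x, y), p in pmf.items())       # E(X)
ey = sum(y * p for (x, y), p in pmf.items())       # E(Y)
exy = sum(x * y * p for (x, y), p in pmf.items())  # E(XY)
cov = exy - ex * ey
print(ex, ey, exy, cov)  # matches the hand computation above
```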

Variance of Sums of Random Variables

Let \(\sigma_1^2\) and \(\sigma_2^2\) be the variances of \(X_1\) and \(X_2\), respectively.

  1. If \(X_1\) and \(X_2\) are independent,

\[ Var(X_1 + X_2) = \sigma^2_1 + \sigma^2_2\\ Var(X_1 - X_2) = \sigma^2_1 + \sigma^2_2 \]

  2. If \(X_1\) and \(X_2\) are dependent,

\[ Var(X_1 + X_2) = \sigma^2_1 + \sigma^2_2 + 2Cov(X_1, X_2)\\ Var(X_1 - X_2) = \sigma^2_1 + \sigma^2_2 - 2Cov(X_1, X_2) \]

  3. If \(X_1, \dots, X_n\) are random variables with variances \(\sigma_1^2,\dots,\sigma_n^2\), then

\[ Var(a_1X_1+...+a_nX_n)=a^2_1\sigma_1^2+...+a^2_n\sigma^2_n+\sum_i\sum_{j\neq i}a_ia_jCov(X_i,X_j) \]

for constants \(a_1, . . . a_n\).

If \(X_1,\dots,X_n\) are independent, \(Var(\sum^n_{i=1} a_iX_i)=\sum^n_{i=1}a_i^2\sigma_i^2\).

If \(X_1,\dots,X_n\) are iid with common variance \(\sigma^2\), \(Var(\sum^n_{i=1} a_iX_i)=\sigma^2\sum^n_{i=1}a_i^2\); in particular, \(Var(\bar X)={\sigma^2\over n}\).
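A simulation sketch of the iid case \(Var(\bar X)=\sigma^2/n\). The uniform distribution on \([0,1]\) (so \(\sigma^2 = 1/12\)) and the sample size are arbitrary illustrative choices:

```python
# Estimate Var(X_bar) from many replications of the sample mean of
# n iid Uniform(0, 1) draws, and compare with sigma^2 / n = (1/12)/n.
import random

random.seed(2)
n, reps = 25, 40_000
means = []
for _ in range(reps):
    xs = [random.random() for _ in range(n)]
    means.append(sum(xs) / n)

m = sum(means) / reps
var_xbar = sum((v - m) ** 2 for v in means) / reps
print(var_xbar)  # theory predicts (1/12)/25 = 1/300, about 0.00333
```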

Example: Suppose X and Y have the joint PMF below. Find Var(2X + Y ).

| x\y | -1  | 3   | 8   |
|-----|-----|-----|-----|
| -1  | 0.1 | 0   | 0.4 |
| 2   | 0.2 | 0.3 | 0   |

From the table, \(Var(X)=E(X^2)-E(X)^2=2.5-0.25=2.25\) and \(Var(Y)=28.6-3.8^2=14.16\), and \(Cov(X,Y)=-3.6\) from the previous example. Then

\[ Var(2X+Y)=Var(2X)+Var(Y)+2Cov(2X,Y)\\ =4Var(X)+Var(Y)+4Cov(X,Y)\\ =4\times 2.25+14.16+4\times(-3.6)=8.76 \]
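The same answer falls out of computing the distribution of \(2X+Y\) directly from the joint PMF, which cross-checks the variance decomposition:

```python
# Var(2X + Y) computed directly: E(2X + Y) and E((2X + Y)^2)
# from the joint PMF, then Var = E(W^2) - E(W)^2 with W = 2X + Y.
pmf = {(-1, -1): 0.1, (-1, 3): 0.0, (-1, 8): 0.4,
       (2, -1): 0.2, (2, 3): 0.3, (2, 8): 0.0}

e1 = sum((2 * x + y) * p for (x, y), p in pmf.items())
e2 = sum((2 * x + y) ** 2 * p for (x, y), p in pmf.items())
var = e2 - e1 ** 2
print(var)  # matches 4 Var(X) + Var(Y) + 4 Cov(X, Y) = 8.76
```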

Result 1: Let \(X_1,\dots,X_n\) be iid with common variance \(\sigma^2\). Then

\[ Var(\sum^n_{i=1}X_i)=n\sigma^2\\ Var(\bar X)=Var({1\over n}\sum^n_{i=1}X_i)={\sigma^2\over n} \]

Result 2: Suppose \(X_1,\dots,X_n\) are iid Bern(p). Then

\[ Var(\hat p)=Var(\bar X)={Var(X_1)\over n}={p(1-p)\over n} \]

Pearson’s Correlation Coefficient

The correlation coefficient of X and Y is

\[ \rho_{X,Y}={Cov(X,Y)\over \sigma_X\sigma_Y} \]
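As a sketch, applying this definition to the joint-PMF example above (all quantities are computed from that table):

```python
# Pearson's correlation rho = Cov(X, Y) / (sigma_X * sigma_Y)
# for the running joint-PMF example; rho always lies in [-1, 1].
import math

pmf = {(-1, -1): 0.1, (-1, 3): 0.0, (-1, 8): 0.4,
       (2, -1): 0.2, (2, 3): 0.3, (2, 8): 0.0}

ex = sum(x * p for (x, y), p in pmf.items())
ey = sum(y * p for (x, y), p in pmf.items())
var_x = sum((x - ex) ** 2 * p for (x, y), p in pmf.items())
var_y = sum((y - ey) ** 2 * p for (x, y), p in pmf.items())
cov = sum((x - ex) * (y - ey) * p for (x, y), p in pmf.items())

rho = cov / math.sqrt(var_x * var_y)
print(round(rho, 4))  # about -0.64, a moderately strong negative association
```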