Estimation

Goal: to estimate unknown population parameters

We will focus first on point estimation.

Notation: \(\theta\) denotes an unknown population parameter, and \(\hat\theta=\hat\theta(X_1,\dots,X_n)\) denotes an estimator of \(\theta\) computed from the sample.

Unbiased Estimators

Often there are multiple ways of estimating a parameter.

  • How do we choose one estimator over another?

  • What are some properties of “good” estimators?

Definition: An estimator \(\hat \theta\) of \(\theta\) is unbiased for \(\theta\) if

\[ E_{\theta}(\hat\theta)=\theta \]

Definition: The bias of \(\hat \theta\) is

\[ bias(\hat\theta)=E_{\theta}(\hat\theta)-\theta \]

Examples:

  1. \(\bar X\) is an unbiased estimator of \(\mu=E(X_i)\).

  2. The sample proportion \(\hat p\) is an unbiased estimator of \(p\).

  3. Let \(X_1,\dots,X_n\) be iid with common variance \(\sigma^2\). Then

\[ S^2={1\over n-1}\sum^n_{i=1}(X_i-\bar X)^2 \]

is unbiased for \(\sigma^2\).

Note: \(S\) is a biased estimator of \(\sigma\): since the square root is strictly concave, Jensen's inequality gives \(E(S)<\sqrt{E(S^2)}=\sigma\).
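A quick simulation makes both facts concrete. The sketch below (illustrative, not part of the notes; the population parameters \(\sigma=2\), \(n=5\) are arbitrary choices) averages \(S^2\) and \(S\) over many repeated samples: the average of \(S^2\) lands on \(\sigma^2\), while the average of \(S\) falls below \(\sigma\).

```python
import random
import statistics

# Illustrative simulation: repeatedly sample from a Normal(0, sigma)
# population and average the estimators S^2 and S.
random.seed(1)
sigma = 2.0   # true population standard deviation (arbitrary choice)
n = 5         # sample size (arbitrary choice)
reps = 200_000

s2_vals, s_vals = [], []
for _ in range(reps):
    x = [random.gauss(0.0, sigma) for _ in range(n)]
    s2 = statistics.variance(x)  # divides by n - 1, matching the notes
    s2_vals.append(s2)
    s_vals.append(s2 ** 0.5)

mean_s2 = sum(s2_vals) / reps  # close to sigma^2 = 4  (S^2 is unbiased)
mean_s = sum(s_vals) / reps    # noticeably below sigma = 2  (S is biased)
```

The simulated mean of \(S^2\) is close to \(\sigma^2=4\), while the simulated mean of \(S\) sits visibly below \(\sigma=2\), consistent with \(E(S)<\sigma\).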

Definition: The standard error of an estimator \(\hat \theta\) is

\[ \sigma_{\hat \theta}=\sqrt{Var_{\theta}(\hat \theta)} \]

The estimated standard error, obtained by replacing the unknown parameters in \(\sigma_{\hat\theta}\) with their estimates, is denoted by \(S_{\hat\theta}\).
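For example, for \(\bar X\) based on an iid sample of size \(n\), \(Var(\bar X)=\sigma^2/n\), so

\[ \sigma_{\bar X}=\frac{\sigma}{\sqrt{n}}, \qquad S_{\bar X}=\frac{S}{\sqrt{n}} \]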

Example: Let \(\bar X_1\) and \(S_1^2\) be the sample mean and variance of a simple random sample of size \(n_1\) from a population with mean \(\mu_1\) and variance \(\sigma_1^2\). Let \(\bar X_2\) and \(S_2^2\) be the sample mean and variance of a simple random sample of size \(n_2\) from a population with mean \(\mu_2\) and variance \(\sigma^2_2\). Assume the two samples are independent.

  1. Show that \(\bar X_1-\bar X_2\) is an unbiased estimator of \(\mu_1-\mu_2\).
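A sketch of the argument, using linearity of expectation together with Example 1:

\[ E(\bar X_1-\bar X_2)=E(\bar X_1)-E(\bar X_2)=\mu_1-\mu_2 \]

so the bias is zero. Note that independence of the two samples is not needed for this step; it matters only when computing the variance of \(\bar X_1-\bar X_2\).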