Unbiasedness of an Estimator

Unbiasedness is perhaps the most important property that a good estimator should possess. According to this property, if the statistic \widehat \alpha is an estimator of \alpha , then \widehat \alpha is an unbiased estimator if the expected value of \widehat \alpha equals the true value of the parameter \alpha ,

i.e.

E\left( {\widehat \alpha } \right) = \alpha
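Before turning to a formal proof, this defining property can be seen numerically. Below is a minimal simulation sketch in Python (the normal population, the parameter values, and the competing shrunken estimator are illustrative assumptions, not part of the definition): it compares the sample mean with a deliberately biased estimator \sum X /\left( {n + 1} \right), whose long-run average falls short of the true parameter.

import numpy as np

rng = np.random.default_rng(42)

mu = 10.0       # true parameter (assumed population mean)
n = 25          # sample size (illustrative)
reps = 200_000  # number of simulated samples

# Draw many independent samples from an assumed normal population with mean mu.
samples = rng.normal(loc=mu, scale=3.0, size=(reps, n))

# Estimator 1: the sample mean, sum(X) / n.
mean_hat = samples.mean(axis=1)

# Estimator 2 (hypothetical, for contrast): sum(X) / (n + 1), a shrunken mean.
shrunk_hat = samples.sum(axis=1) / (n + 1)

# Averaging each estimator over many samples approximates its expected value.
print(mean_hat.mean())    # close to 10.0: E(sample mean) = mu, so it is unbiased
print(shrunk_hat.mean())  # close to n * mu / (n + 1), about 9.62: systematically low

The first average settles near the true value, while the second settles below it, which is exactly the distinction the definition draws.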

Consider the following working example.

Example:

Show that the sample mean \overline X is an unbiased estimator of the population mean \mu .

Solution:

In order to show that \overline X is an unbiased estimator, we need to prove that

E\left( {\overline X } \right) = \mu

We have

\begin{aligned} \overline X &= \frac{{\sum X}}{n} = \frac{{{X_1} + {X_2} + {X_3} + \cdots + {X_n}}}{n} \\ &= \frac{{{X_1}}}{n} + \frac{{{X_2}}}{n} + \frac{{{X_3}}}{n} + \cdots + \frac{{{X_n}}}{n} \end{aligned}

Therefore,

E\left( {\overline X } \right) = E\left( {\frac{{{X_1}}}{n} + \frac{{{X_2}}}{n} + \frac{{{X_3}}}{n} + \cdots + \frac{{{X_n}}}{n}} \right)

By the linearity of expectation, the expected value of a sum of random variables equals the sum of their expected values, and constant factors can be taken outside the expectation. Applying this rule gives:

E\left( {\overline X } \right) = \frac{1}{n}E\left( {{X_1}} \right) + \frac{1}{n}E\left( {{X_2}} \right) + \frac{1}{n}E\left( {{X_3}} \right) + \cdots + \frac{1}{n}E\left( {{X_n}} \right)

Since {X_1},{X_2},{X_3}, \ldots ,{X_n} are random observations drawn from the same population, each of them has the same expected value, namely the population mean: E\left( {{X_i}} \right) = \mu . Substituting this into each term,

\begin{aligned} E\left( {\overline X } \right) &= \frac{1}{n}\mu + \frac{1}{n}\mu + \frac{1}{n}\mu + \cdots + \frac{1}{n}\mu \\ &= \frac{{n\mu }}{n} = \mu \end{aligned}

Therefore, E\left( {\overline X } \right) = \mu .

Hence \overline X is an unbiased estimator of the population mean \mu .
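As a quick numerical check on this result, here is a small simulation sketch (the exponential population, its mean, and the sample size are arbitrary illustrative choices, not part of the proof). Because the argument used only the fact that E\left( {{X_i}} \right) = \mu , the sample mean should average out to \mu even when the population is heavily skewed.

import numpy as np

rng = np.random.default_rng(7)

mu = 2.0        # population mean of the assumed exponential distribution
n = 10          # sample size (illustrative)
reps = 500_000  # number of simulated samples

# An exponential population is strongly skewed, but its mean is still mu.
samples = rng.exponential(scale=mu, size=(reps, n))

# Compute the sample mean of each simulated sample.
xbar = samples.mean(axis=1)

# The long-run average of the sample means approximates E(X-bar).
print(xbar.mean())  # close to 2.0, matching E(X-bar) = mu

Note that unbiasedness only guarantees that the estimator is correct on average over repeated sampling; any single \overline X can still differ noticeably from \mu .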