
The unbiased estimator for the variance of the distribution of a random variable $X$, given a random sample $X_1,\ldots,X_n$, is

$\frac{\displaystyle\sum_{i=1}^n\left(X_i-\overline{X}\right)^2}{n-1}$

That $n-1$ rather than $n$ appears in the denominator is counterintuitive and confuses many new students. Here it is proven that this form is the unbiased estimator for variance, i.e., that its expected value is equal to the variance itself.

Let $S^2$ be

$S^2=\frac{\displaystyle\sum_{i=1}^n\left(X_i-\overline{X}\right)^2}{n-1}$

the estimator for the variance of some random variable $X$, where

$\overline{X}=\frac{1}{n}\displaystyle\sum_{i=1}^n X_i$

is the sample mean of $X_1,\ldots,X_n$. Then

$\displaystyle\sum_{i=1}^n\left(X_i-\overline{X}\right)^2=\displaystyle\sum_{i=1}^n\left(X_i^2-2X_i\overline{X}+\overline{X}^2\right)$
$=\displaystyle\sum_{i=1}^n X_i^2-2\overline{X}\displaystyle\sum_{i=1}^n X_i+n\overline{X}^2$
$=\displaystyle\sum_{i=1}^n X_i^2-2n\overline{X}^2+n\overline{X}^2$
$=\displaystyle\sum_{i=1}^n X_i^2-n\overline{X}^2$
And so

$E\left[S^2\right]=\frac{1}{n-1}\,E\left[\displaystyle\sum_{i=1}^n X_i^2-n\overline{X}^2\right]=\frac{1}{n-1}\left(\displaystyle\sum_{i=1}^n E\left[X_i^2\right]-n\,E\left[\overline{X}^2\right]\right)$
The expectations on the right-hand side are not known immediately. However, from the definition of variance

$E\left[X_i^2\right]=\operatorname{Var}\left(X_i\right)+E\left[X_i\right]^2=\sigma^2+\mu^2$

where $\sigma^2$ and $\mu$ are the variance and mean of $X$, respectively.
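This identity $E\left[X^2\right]=\sigma^2+\mu^2$ is easy to confirm numerically. A minimal sketch, assuming NumPy is available; the distribution and parameters ($\mu=3$, $\sigma^2=4$) are chosen purely for illustration:

```python
import numpy as np

# Check E[X^2] = sigma^2 + mu^2 by simulation, using an illustrative
# normal variable with mean mu = 3 and variance sigma^2 = 4.
rng = np.random.default_rng(0)
mu, sigma2 = 3.0, 4.0
x = rng.normal(mu, np.sqrt(sigma2), size=1_000_000)

empirical = np.mean(x ** 2)      # Monte Carlo estimate of E[X^2]
theoretical = sigma2 + mu ** 2   # the identity predicts 4 + 9 = 13

print(empirical, theoretical)
```

With a million draws the two values agree to roughly two decimal places.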

The other term is handled the same way. Because the $X_i$ are independent and identically distributed, $E\left[\overline{X}\right]=\mu$ and $\operatorname{Var}\left(\overline{X}\right)=\frac{\sigma^2}{n}$, so again by the definition of variance

$E\left[\overline{X}^2\right]=\operatorname{Var}\left(\overline{X}\right)+E\left[\overline{X}\right]^2$
$=\frac{\sigma^2}{n}+\mu^2$

where $\frac{\sigma^2}{n}$ and $\mu$ are the variance and mean of $\overline{X}$, respectively.
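That the variance of the sample mean scales as $\frac{\sigma^2}{n}$ can likewise be checked by simulation. A sketch assuming NumPy; the sample size $n=10$, variance $\sigma^2=4$, and replicate count are illustrative:

```python
import numpy as np

# Check Var(X-bar) = sigma^2 / n: draw many samples of size n,
# take each sample's mean, and look at the variance of those means.
rng = np.random.default_rng(1)
n, sigma2 = 10, 4.0
samples = rng.normal(0.0, np.sqrt(sigma2), size=(100_000, n))

means = samples.mean(axis=1)
print(means.var())   # close to sigma^2 / n = 0.4
```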

These expressions are then substituted back into the equation for $E\left[S^2\right]$:

$E\left[S^2\right]=\frac{1}{n-1}\left(n\left(\sigma^2+\mu^2\right)-n\left(\frac{\sigma^2}{n}+\mu^2\right)\right)$
$=\frac{1}{n-1}\left(n\sigma^2+n\mu^2-\sigma^2-n\mu^2\right)$
$=\frac{n-1}{n-1}\,\sigma^2=\sigma^2$
Thus the expected value of the estimator is equal to the variance, and the estimator is therefore unbiased.

QED
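The whole result can be demonstrated with a Monte Carlo sketch, assuming NumPy is available: averaging the $n-1$ estimator over many samples recovers the true variance, while dividing by $n$ systematically underestimates it. All parameters ($n=5$, $\sigma^2=4$, 200,000 replicates) are illustrative.

```python
import numpy as np

# Compare the n-1 (unbiased) and n (biased) variance estimators
# across many independent samples of size n.
rng = np.random.default_rng(2)
n, sigma2 = 5, 4.0
samples = rng.normal(0.0, np.sqrt(sigma2), size=(200_000, n))

# Sum of squared deviations from each sample's own mean.
ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

unbiased = (ss / (n - 1)).mean()   # close to sigma^2 = 4
biased = (ss / n).mean()           # close to sigma^2 * (n-1)/n = 3.2

print(unbiased, biased)
```

The biased estimator's shortfall by the factor $\frac{n-1}{n}$ is exactly what the proof predicts.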

