Mean squared error

In statistics, the mean squared error (MSE) of an estimator T of an unobservable parameter θ is
<math>\operatorname{MSE}(T)=\operatorname{E}((T-\theta)^2),</math>
i.e., it is the expected value of the square of the "error". The "error" is the amount by which the estimator differs from the quantity to be estimated. The mean squared error satisfies the identity
<math>\operatorname{MSE}(T)=\operatorname{var}(T)+(\operatorname{bias}(T))^2</math>
where
<math>\operatorname{bias}(T)=\operatorname{E}(T)-\theta,</math>
i.e., the bias is the amount by which the expected value of the estimator differs from the unobservable quantity to be estimated.
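This identity follows by adding and subtracting E(T) inside the square; a short derivation (a sketch using only the linearity of expectation) is
<math>\operatorname{MSE}(T)=\operatorname{E}\left(\left((T-\operatorname{E}(T))+(\operatorname{E}(T)-\theta)\right)^2\right)=\operatorname{E}\left((T-\operatorname{E}(T))^2\right)+2\,(\operatorname{E}(T)-\theta)\,\operatorname{E}(T-\operatorname{E}(T))+(\operatorname{E}(T)-\theta)^2,</math>
and since the middle term vanishes (because E(T − E(T)) = 0), what remains is var(T) plus the square of the bias.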

Here is a concrete example. Suppose

<math>X_1,\dots,X_n\sim\operatorname{N}(\mu,\sigma^2),</math>
i.e., this is a random sample of size n from a normally distributed population. Two estimators of σ² are sometimes used (as are others):
<math>\frac{1}{n}\sum_{i=1}^n\left(X_i-\overline{X}\,\right)^2\ {\rm and}\ \frac{1}{n-1}\sum_{i=1}^n\left(X_i-\overline{X}\,\right)^2 </math>
where
<math>\overline{X}=(X_1+\cdots+X_n)/n</math>
is the "sample mean". The first of these estimators is the maximum likelihood estimator, and is biased, i.e., its bias is not zero, but has a smaller variance than the second, which is unbiased. The smaller variance compensates somewhat for the bias, so that the mean squared error of the biased estimator is slightly smaller than that of the unbiased estimator.
