
Variance of the estimator

The variance $\sigma^2(d^*) = E[(d^* - E[d^*])^2]$ can be derived as follows. Starting from the asymptotic value of $E[d^*]$, we expand $(d^* - E[d^*])^2$ as a Taylor series in powers of $(w/n - S_E(d))$. We then take expected values, replacing each power $(w/n - S_E(d))^k$ by the central moments described in equations 8, 9 and 10. Truncating after two terms we obtain

\begin{displaymath}\sigma^2(d^*) = \frac{\mu_2}{S_E'(d)^2\, n} - \frac{S_E''(d)\left(\mu_3 S_E'(d)^2 - \frac{1}{2} S_E''(d)\, \mu_2^2\right)}{S_E'(d)^6\, n^2} + O(\frac{1}{n^3}) \end{displaymath}
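A sketch of the intermediate step, assuming equations 8, 9 and 10 give the usual scalings $E[\varepsilon^2] = \mu_2/n$, $E[\varepsilon^3] = \mu_3/n^2$ and $E[\varepsilon^4] = 3\mu_2^2/n^2 + O(1/n^3)$ for $\varepsilon = w/n - S_E(d)$: the truncated series reads

\begin{displaymath}d^* - E[d^*] = \frac{\varepsilon}{S_E'(d)} - \frac{S_E''(d)}{2 S_E'(d)^3}\left(\varepsilon^2 - \frac{\mu_2}{n}\right) + O(\varepsilon^3) \end{displaymath}

Squaring this expression, taking expectations term by term and collecting powers of $1/n$ reproduces the two displayed terms.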

The most efficient estimator among all possible estimators is the one with minimal variance. The variance is of the form

\begin{displaymath}\sigma^2(d^*) = \frac{F(d)}{n} + O(\frac{1}{n^2}) \end{displaymath}

where $F(d) = \frac{\mu_2}{S_E'(d)^2}$. We can choose $E$ so that it minimizes $F(d)$; this gives estimators that are asymptotically (in $n$) optimal. We have several choices for the minimization, since the best $E$ can be derived for several different cases. There is no hope of finding a closed formula for this optimal $E$, but the first and second cases can be computed numerically without much difficulty.
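As a minimal illustration of such a numerical minimization (a sketch only: the two-state rate matrix, the scoring parametrization and the pinned score values below are illustrative assumptions, not the model analysed here), one can fix a distance $d$, compute $S_E(d)$, $S_E'(d)$ and $\mu_2$ for a candidate $E$, and minimize $F(d)$ with an off-the-shelf optimizer:

import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# Hypothetical two-state substitution model (an illustrative assumption):
# rate matrix Q with stationary distribution f (f @ Q = 0).
Q = np.array([[-0.3,  0.3],
              [ 0.7, -0.7]])
f = np.array([0.7, 0.3])

def score_stats(E, d, h=1e-5):
    """S_E(d), S_E'(d) (central difference) and mu_2 for a scoring matrix E."""
    def moments(t):
        P = expm(Q * t)            # transition probabilities after distance t
        J = f[:, None] * P         # joint distribution of an aligned position
        return (J * E).sum(), (J * E**2).sum()
    s, s2 = moments(d)
    sp, _ = moments(d + h)
    sm, _ = moments(d - h)
    return s, (sp - sm) / (2.0 * h), s2 - s**2

def F(E, d):
    """F(d) = mu_2 / S_E'(d)^2, the quantity to be minimized over E."""
    _, dS, mu2 = score_stats(E, d)
    return mu2 / dS**2

def objective(p, d):
    # F is invariant under E -> a*E + b, so pin two scores and vary the third.
    E = np.array([[ 1.0, -1.0],
                  [-1.0, p[0]]])
    return F(E, d)

d0 = 0.5                           # fixed evolutionary distance
res = minimize(objective, x0=[1.0], args=(d0,), method="Nelder-Mead")
print("optimal free score: %.4f  minimal F(d0): %.4f" % (res.x[0], res.fun))

Because $F$ is unchanged by affine rescalings of the scores, two entries of $E$ are pinned and only the remaining one is optimized; to derive a best $E$ over a range of distances one would, for example, minimize a weighted integral of $F(d)$ instead of its value at a single $d$.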