\[
\begin{aligned}
M_n &= \frac{X_1 + \cdots + X_n}{n} & \text{where the } X_i \text{ are i.i.d. with mean } \mu \text{ and variance } \sigma^2 \\
\end{aligned}
\]
We have the familiar CLT as \(n \rightarrow \infty\):
\[
\begin{aligned}
\mathbf{P}\left( \frac{M_n - \mu}{\sigma/\sqrt{n}} \le \frac{\epsilon}{\sigma/\sqrt{n}} \right) &= \mathbf{P}\left( Z_n \le \frac{\epsilon \sqrt{n}}{\sigma} \right)
\approx \Phi\left(\frac{\epsilon \sqrt{n}}{\sigma}\right) & \text{where } Z_n = \frac{\sqrt{n}\,(M_n - \mu)}{\sigma} \\
\end{aligned}
\]
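As a quick sanity check of this approximation (a hypothetical example, not from the book), the following sketch simulates sample means of i.i.d. \(\mathrm{Uniform}(0,1)\) draws, for which \(\mu = 1/2\) and \(\sigma = 1/\sqrt{12}\), and compares the empirical frequency of \(\{M_n - \mu \le \epsilon\}\) against \(\Phi(\epsilon\sqrt{n}/\sigma)\), computing the standard normal CDF from `math.erf`:

```python
import math
import random

def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)

n = 100          # sample size per trial
trials = 20000   # Monte Carlo repetitions
eps = 0.05       # threshold in the event {M_n - mu <= eps}

# X_i ~ Uniform(0, 1): mu = 1/2, sigma = 1/sqrt(12)
mu = 0.5
sigma = 1.0 / math.sqrt(12.0)

hits = 0
for _ in range(trials):
    m_n = sum(random.random() for _ in range(n)) / n
    if m_n - mu <= eps:
        hits += 1

empirical = hits / trials
approx = phi(eps * math.sqrt(n) / sigma)  # CLT approximation

print(f"empirical = {empirical:.4f}, CLT approx = {approx:.4f}")
```

With these parameters the two numbers agree to roughly two decimal places, as the CLT predicts for moderately large \(n\).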
But what about the absolute value of the standardized mean?
\[
\begin{aligned}
\mathbf{P}\left( \left|\frac{M_n - \mu}{\sigma/\sqrt{n}}\right| \le \frac{\epsilon \sqrt{n}}{\sigma} \right) &= \mathbf{P}\left( |Z_n| \le \frac{\epsilon \sqrt{n}}{\sigma} \right)
\approx 1 - 2\bigg[1 - \Phi\left(\frac{\epsilon \sqrt{n}}{\sigma}\right)\bigg] = 2\,\Phi\left(\frac{\epsilon \sqrt{n}}{\sigma}\right) - 1 \\
\end{aligned}
\]
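The intermediate step uses the symmetry of the standard normal distribution. Writing \(c\) for the threshold on the right-hand side and \(Z\) for a standard normal variable:

\[
\begin{aligned}
\mathbf{P}(|Z| \le c) &= \mathbf{P}(Z \le c) - \mathbf{P}(Z < -c) \\
&= \Phi(c) - \Phi(-c) = \Phi(c) - \big[1 - \Phi(c)\big] = 2\,\Phi(c) - 1 \\
\end{aligned}
\]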
If we know the range \([a, b]\) of \(X\) but not its variance, we can use an upper bound on the variance. That is,
\[
\begin{aligned}
\displaystyle \sigma^2 \le \frac{(b-a)^2}{4} \\
\end{aligned}
\]
which follows directly from the definition of variance and is typically used in conjunction with Chebyshev's inequality. A proof can be found on p. 268 of
Introduction to Probability, 2nd Edition (Bertsekas and Tsitsiklis).
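As an illustration (a hypothetical example with made-up endpoints), the sketch below checks the bound \(\sigma^2 \le (b-a)^2/4\) empirically for a \(\mathrm{Uniform}(a, b)\) variable, whose true variance \((b-a)^2/12\) sits well under the bound, and also verifies that the bound is attained by the two-point distribution placing probability \(1/2\) on each endpoint:

```python
import random

# Sanity check of  sigma^2 <= (b - a)^2 / 4  for a variable
# with known range [a, b]; here X ~ Uniform(a, b), whose true
# variance is (b - a)^2 / 12.
random.seed(1)
a, b = 2.0, 5.0
samples = [random.uniform(a, b) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
bound = (b - a) ** 2 / 4

# The bound is tight: it is attained by the distribution that
# puts probability 1/2 on each endpoint a and b.
mid = (a + b) / 2
two_point_var = 0.5 * (a - mid) ** 2 + 0.5 * (b - mid) ** 2

print(f"sample variance = {var:.3f}, "
      f"two-point variance = {two_point_var:.3f}, bound = {bound:.3f}")
```

The two-point case shows the bound cannot be improved without further assumptions on the distribution.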
# posted by rot13(Unafba Pune) @ 10:30 AM