Saturday, April 12, 2014
Inequalities
Markov inequality
If X≥0 and a>0, then
\mathbb{P}(X \ge a) \le \frac{\mathbb{E}[X]}{a}
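\quad A minimal simulation sketch of the bound (not from the lecture; NumPy, an Exp(1) sample, and a = 3 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=100_000)  # X >= 0 with E[X] = 1
a = 3.0

# empirical P(X >= a) vs the Markov bound E[X]/a
print(np.mean(X >= a), "<=", X.mean() / a)  # ~0.05 <= ~0.33: holds, but loose
```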
Chebyshev inequality
\mathbb{P}(|X - \mu| \ge c) \le \frac{\sigma^2}{c^2} \quad \text{for } c \gt 0
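\quad The same kind of sanity check for Chebyshev (the N(5, 4) sample and c = 3 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, c = 5.0, 2.0, 3.0
X = rng.normal(loc=mu, scale=sigma, size=100_000)

lhs = np.mean(np.abs(X - mu) >= c)  # empirical P(|X - mu| >= c)
rhs = sigma**2 / c**2               # Chebyshev bound sigma^2 / c^2
print(lhs, "<=", rhs)               # ~0.13 <= ~0.44
```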
Weak law of large numbers (WLLN)
\quad X_1, X_2, \cdots i.i.d. with finite mean \mu and variance \sigma^2.
\begin{aligned} M_n &= \frac{X_1 + \cdots + X_n}{n} & \text{sample mean} \\ \mathbb{E}[M_n] &= \mu & var(M_n) = \frac{\sigma^2}{n} \\ \mathbb{P}(|M_n - \mu| \ge \epsilon) &\le \frac{\sigma^2}{n\epsilon^2} \rightarrow 0 & \text{as } n \rightarrow \infty \text{ for } \epsilon \gt 0 \\ \end{aligned}
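\quad A small simulation sketch of the WLLN (Uniform(0,1) summands are an arbitrary choice, giving \mu = 1/2 and \sigma^2 = 1/12; \epsilon = 0.05 is also arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, var, eps = 0.5, 1/12, 0.05  # Uniform(0,1): mean 1/2, variance 1/12

for n in (10, 100, 1_000, 10_000):
    M = rng.uniform(size=(5_000, n)).mean(axis=1)  # 5000 draws of the sample mean M_n
    empirical = np.mean(np.abs(M - mu) >= eps)     # empirical P(|M_n - mu| >= eps)
    bound = min(var / (n * eps**2), 1.0)           # Chebyshev bound sigma^2 / (n eps^2)
    print(n, empirical, bound)                     # both shrink toward 0 as n grows
```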
Convergence in probability
A sequence Yn, not necessarily independent, converges in probability to a number a if:
\lim_{n \rightarrow \infty} \mathbb{P}(|Y_n - a| \ge \epsilon) = 0 \quad \text{for every } \epsilon \gt 0
\quad How to check for convergence? Guess a, express the tail probability \mathbb{P}(|Y_n - a| \ge \epsilon) as above, and then check whether the limit goes to zero.
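\quad For example (a standard illustration, not taken from the slide itself): let Y_n = \min(X_1,\cdots,X_n) with X_i i.i.d. uniform on [0,1]. Guess a = 0; then for any \epsilon \in (0,1),

\begin{aligned} \mathbb{P}(|Y_n - 0| \ge \epsilon) &= \mathbb{P}(\text{all } X_i \ge \epsilon) = (1-\epsilon)^n \rightarrow 0 & \text{as } n \rightarrow \infty \\ \end{aligned}

so Y_n converges in probability to 0.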
Central Limit Theorem
\quad X_1,\cdots,X_n i.i.d. with finite mean \mu and variance \sigma^2.
\begin{aligned} S_n &= X_1 + \cdots + X_n & \text{variance: } n\sigma^2 \\ \frac{S_n}{\sqrt{n}} &= \frac{X_1 + \cdots + X_n}{\sqrt{n}} & \text{variance: } \sigma^2 \\ \color{red}{Z_n} &\color{red}{=} \color{red}{\frac{S_n - n\mu}{\sqrt{n}\sigma}} & S_n \thicksim N(n\mu, n\sigma^2) \\ M_n &= \frac{S_n}{n} & \mathbb{E}[M_n] = \mu \quad var(M_n) = \frac{\sigma^2}{n} \\ \color{red}{Z_n} &\color{red}{=} \color{red}{\frac{M_n - \mu}{\sigma/\sqrt{n}}} & M_n \thicksim N(\mu, \frac{\sigma^2}{n}) \\ \color{red}{\lim_{n \rightarrow \infty} \mathbb{P}(Z_n \le z)} &\color{red}{=} \color{red}{\mathbb{P}(Z \le z)} = \Phi(z) & \text{for every }z, \text{ where } Z \thicksim N(0,1) \\ \end{aligned}
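\quad A quick simulation sketch of the statement (Uniform(0,1) summands and n = 100 are arbitrary choices; \Phi is computed from math.erf):

```python
import numpy as np
from math import erf, sqrt

def Phi(z):  # standard normal CDF
    return 0.5 * (1 + erf(z / sqrt(2)))

rng = np.random.default_rng(0)
n, mu, sigma = 100, 0.5, sqrt(1/12)            # Uniform(0,1) summands

S = rng.uniform(size=(50_000, n)).sum(axis=1)  # 50k draws of S_n
Z = (S - n*mu) / (sqrt(n)*sigma)               # standardized Z_n

for z in (-1.0, 0.0, 1.0):
    print(z, np.mean(Z <= z), Phi(z))  # empirical P(Z_n <= z) vs Phi(z)
```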
De Moivre-Laplace 1/2 correction
\quad When S_n is discrete (integer-valued) and a is an integer, approximating \mathbb{P}(S_n \le a) with the CLT:
\begin{aligned} \mathbb{P}(S_n \le a) &\approx \Phi\left(\frac{a - n\mu}{\sqrt{n}\sigma}\right) & \text{underestimate} \\ \mathbb{P}(S_n \le a) = \mathbb{P}(S_n \lt a+1) &\approx \Phi\left(\frac{a + 1 - n\mu}{\sqrt{n}\sigma}\right) & \text{overestimate} \\ \mathbb{P}(S_n \le a) &\approx \Phi\left(\frac{a + 0.5 - n\mu}{\sqrt{n}\sigma}\right) & \text{1/2 correction} \\ \end{aligned}
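\quad A stdlib-only check, assuming the classic Binomial(36, 1/2) example with a = 21 (so n\mu = 18 and \sqrt{n}\sigma = 3); the numbers in the comments are approximate:

```python
from math import erf, sqrt, comb

def Phi(z):  # standard normal CDF
    return 0.5 * (1 + erf(z / sqrt(2)))

n, a = 36, 21                    # S_n ~ Binomial(36, 1/2)
mu, sd = n * 0.5, sqrt(n) * 0.5  # n*mu = 18, sqrt(n)*sigma = 3

exact = sum(comb(n, k) for k in range(a + 1)) / 2**n
print("exact          ", exact)                      # ~0.8785
print("underestimate  ", Phi((a - mu) / sd))         # ~0.8413
print("overestimate   ", Phi((a + 1 - mu) / sd))     # ~0.9088
print("1/2 correction ", Phi((a + 0.5 - mu) / sd))   # ~0.8784, closest
```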
Jensen's inequality
\quad Let g be a convex function (i.e., shaped like a smiley face), so that g''(x) \ge 0.
\begin{aligned} \color{blue}{g(x)} &\color{blue}{\ge g(c) + g'(c) (x-c)} & \text{1st order Taylor expansion} \\ g(X) &\ge g(\mathbb{E}[X]) + g'(\mathbb{E}[X]) (X-\mathbb{E}[X]) & \text{let } c = \mathbb{E}[X] \\ \mathbb{E}[g(X)] &\ge \mathbb{E}[g(\mathbb{E}[X])] + \mathbb{E}\left[g'\big(\mathbb{E}[X]\big) \big(X-\mathbb{E}[X]\big)\right] & \text{note }\mathbb{E}[X] \text{ is a constant} \\ \mathbb{E}[g(X)] &\ge g(\mathbb{E}[X]) + g'(\mathbb{E}[X]) (\mathbb{E}[X]-\mathbb{E}[X]) \\ \color{blue}{\mathbb{E}[g(X)]} &\color{blue}{\ge g(\mathbb{E}[X])} \\ \end{aligned}
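\quad A one-line numerical illustration (g(x) = x^2 and an Exp(1) sample are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=100_000)

# g(x) = x^2 is convex (g''(x) = 2 >= 0), so E[g(X)] >= g(E[X])
print(np.mean(X**2), ">=", X.mean()**2)  # ~2.0 >= ~1.0 for Exp(1)
```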
Hoeffding's inequality
\quad X_i are i.i.d. random variables, each equally likely to be -1 or 1. Let a \gt 0.
\begin{aligned} \mathbb{P}(X_1 + \cdots + X_n \ge na) &\le e^{-na^2/2} & \text{a much stronger bound than the Chebyshev inequality} \\ \end{aligned}
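\quad A simulation sketch comparing the two bounds (n = 100 and a = 0.3 are arbitrary choices; the Chebyshev line uses var(S_n) = n for these \pm 1 variables):

```python
import numpy as np

rng = np.random.default_rng(0)
n, a = 100, 0.3
X = rng.choice([-1, 1], size=(100_000, n))   # i.i.d., +/-1 with probability 1/2 each

empirical = np.mean(X.sum(axis=1) >= n * a)  # empirical P(X_1 + ... + X_n >= na)
hoeffding = np.exp(-n * a**2 / 2)            # e^{-na^2/2} ~ 0.011
chebyshev = 1 / (n * a**2)                   # n/(na)^2 = 1/(na^2) ~ 0.111
print(empirical, "<=", hoeffding, "<", chebyshev)
```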
Source: MITx 6.041x, Lecture 18.