
Saturday, April 12, 2014

 

Inequalities

Markov inequality

\(\quad\) If \(X \ge 0\) and \(a > 0\), then
  \[ \begin{aligned} \mathbb{P}(X \ge a) &\le \frac{\mathbb{E}[X]}{a} \\ \end{aligned} \]
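
\(\quad\) A quick numerical sanity check (a minimal sketch; the exponential distribution, threshold, and sample size are arbitrary choices, not from the lecture):

```python
import numpy as np

# Markov sketch: P(X >= a) <= E[X]/a for a nonnegative X.
# Exponential(1) samples: E[X] = 1, and P(X >= a) = e^{-a} exactly.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

a = 3.0
empirical = np.mean(x >= a)   # estimate of P(X >= a)
bound = x.mean() / a          # estimate of E[X]/a

print(f"P(X >= {a}) ~ {empirical:.4f} <= Markov bound {bound:.4f}")
```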

Chebyshev inequality

\(\quad\) For a random variable \(X\) with mean \(\mu\) and variance \(\sigma^2\):
  \[ \begin{aligned} \mathbb{P}(|X-\mu| \ge c) &\le \frac{\sigma^2}{c^2} & c > 0 \\ \end{aligned} \]
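
\(\quad\) Where does this come from? Chebyshev is just Markov applied to the nonnegative random variable \((X-\mu)^2\):
  \[ \begin{aligned} \mathbb{P}(|X-\mu| \ge c) = \mathbb{P}\big((X-\mu)^2 \ge c^2\big) &\le \frac{\mathbb{E}[(X-\mu)^2]}{c^2} = \frac{\sigma^2}{c^2} \\ \end{aligned} \]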

Weak law of large numbers (WLLN)

\(\quad X_1, X_2, \cdots \) i.i.d. with finite mean \(\mu\) and variance \(\sigma^2\)
  \[ \begin{aligned} M_n &= \frac{X_1 + \cdots + X_n}{n} & \text{sample mean} \\ \mathbb{E}[M_n] &= \mu \\ var(M_n) &= \frac{\sigma^2}{n} \\ \mathbb{P}(|M_n - \mu| \ge \epsilon) &\le \frac{\sigma^2}{n\epsilon^2} \rightarrow 0 \quad \text{as } n \rightarrow \infty &\text{ for } \epsilon > 0 \\ \end{aligned} \]
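
\(\quad\) A small simulation sketch of the tail probability shrinking (the uniform distribution, \(\epsilon = 0.05\), and sample sizes are arbitrary choices):

```python
import numpy as np

# WLLN sketch: P(|M_n - mu| >= eps) -> 0 as n grows.
# Uniform(0,1) samples: mu = 0.5, sigma^2 = 1/12.
rng = np.random.default_rng(0)
mu, var, eps, trials = 0.5, 1 / 12, 0.05, 10_000

for n in (10, 100, 1000):
    m_n = rng.random((trials, n)).mean(axis=1)   # sample means M_n
    tail = np.mean(np.abs(m_n - mu) >= eps)      # empirical tail probability
    bound = var / (n * eps**2)                   # Chebyshev bound sigma^2/(n eps^2)
    print(f"n={n:4d}  P(|M_n - mu| >= eps) ~ {tail:.4f}  bound {bound:.3f}")
```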

Convergence in probability

\(\quad\) A sequence \(Y_n\), not necessarily independent, converges in probability to a number \(a\) if:
  \[ \begin{aligned} \lim_{n \rightarrow \infty}\mathbb{P}(|Y_n - a| \ge \epsilon) &= 0 & \text{for every } \epsilon > 0 \\ \end{aligned} \]
\(\quad\) Caveat: even if \(X_n \rightarrow a\) in probability, \(\mathbb{E}[X_n]\) need not converge to \(a\)!
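
\(\quad\) For example, let \(X_n = n^2\) with probability \(1/n\) and \(0\) otherwise; then for any fixed \(\epsilon > 0\):
  \[ \begin{aligned} \mathbb{P}(|X_n - 0| \ge \epsilon) = \frac{1}{n} \rightarrow 0 \qquad \text{yet} \qquad \mathbb{E}[X_n] = n^2 \cdot \frac{1}{n} = n \rightarrow \infty \\ \end{aligned} \]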
\(\quad\) How to check for convergence? Guess the limit \(a\), write out \(\mathbb{P}(|Y_n - a| \ge \epsilon)\) as above, and check whether it goes to zero as \(n \rightarrow \infty\).
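
\(\quad\) As a worked check: let \(Y_n = \min(X_1, \cdots, X_n)\) with \(X_i\) i.i.d. uniform on \([0, 1]\), and guess \(a = 0\):
  \[ \begin{aligned} \mathbb{P}(|Y_n - 0| \ge \epsilon) = \mathbb{P}(X_1 \ge \epsilon, \cdots, X_n \ge \epsilon) = (1-\epsilon)^n \rightarrow 0 & & \text{so } Y_n \rightarrow 0 \text{ in probability} \\ \end{aligned} \]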

Central Limit Theorem

\(\quad X_1,\cdots,X_n\) i.i.d. with finite mean \(\mu\) and variance \(\sigma^2\).
  \[ \begin{aligned} S_n &= X_1 + \cdots + X_n & \text{variance: } n\sigma^2 \\ \frac{S_n}{\sqrt{n}} &= \frac{X_1 + \cdots + X_n}{\sqrt{n}} & \text{variance: } \sigma^2 \\ \color{red}{Z_n} &\color{red}{=} \color{red}{\frac{S_n - n\mu}{\sqrt{n}\sigma}} & S_n \thicksim N(n\mu, n\sigma^2) \text{ approximately} \\ M_n &= \frac{S_n}{n} & \mathbb{E}[M_n] = \mu \quad var(M_n) = \frac{\sigma^2}{n} \\ \color{red}{Z_n} &\color{red}{=} \color{red}{\frac{M_n - \mu}{\sigma/\sqrt{n}}} & M_n \thicksim N(\mu, \frac{\sigma^2}{n}) \text{ approximately} \\ \color{red}{\lim_{n \rightarrow \infty} \mathbb{P}(Z_n \le z)} &\color{red}{=} \color{red}{\mathbb{P}(Z \le z)} = \Phi(z) & \text{for every } z, \text{ where } Z \thicksim N(0,1) \\ \end{aligned} \]
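
\(\quad\) A simulation sketch comparing \(\mathbb{P}(Z_n \le z)\) with \(\Phi(z)\) (the exponential summands, \(n\), and grid of \(z\) values are arbitrary choices):

```python
import numpy as np
from scipy.stats import norm

# CLT sketch: standardized sums of exponential(1) variables
# (mu = 1, sigma = 1) should be approximately N(0, 1).
rng = np.random.default_rng(0)
n, trials = 100, 50_000

s_n = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)
z_n = (s_n - n * 1.0) / (np.sqrt(n) * 1.0)   # Z_n = (S_n - n mu)/(sqrt(n) sigma)

for z in (-1.0, 0.0, 1.0, 2.0):
    print(f"z={z:+.1f}  P(Z_n <= z) ~ {np.mean(z_n <= z):.4f}  Phi(z) = {norm.cdf(z):.4f}")
```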

De Moivre-Laplace 1/2 correction

\(\quad\)When \(S_n\) is integer-valued (e.g. binomial), applying the CLT approximation at different thresholds gives:
  \[ \begin{aligned} \mathbb{P}(S_n \le a) &\approx \Phi\Big(\frac{a - n\mu}{\sqrt{n}\sigma}\Big) & \text{underestimate} \\ \mathbb{P}(S_n \le a) = \mathbb{P}(S_n \lt a + 1) &\approx \Phi\Big(\frac{a + 1 - n\mu}{\sqrt{n}\sigma}\Big) & \text{overestimate} \\ \mathbb{P}(S_n \le a) &\approx \Phi\Big(\frac{a + 0.5 - n\mu}{\sqrt{n}\sigma}\Big) & \text{1/2 correction} \\ \end{aligned} \]
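
\(\quad\) A sketch with a binomial example (the parameters \(n = 36\), \(p = 0.5\), \(a = 21\) are arbitrary choices), comparing the exact probability against the three approximations:

```python
import numpy as np
from scipy.stats import binom, norm

# 1/2-correction sketch: S_n ~ Binomial(n, p), so per summand mu = p and
# sigma^2 = p(1-p), giving mean n*p and standard deviation sqrt(n*p*(1-p)).
n, p, a = 36, 0.5, 21
mean, sd = n * p, np.sqrt(n * p * (1 - p))

exact = binom.cdf(a, n, p)   # P(S_n <= a), exact
for shift, label in ((0.0, "underestimate"), (1.0, "overestimate"), (0.5, "1/2 correction")):
    approx = norm.cdf((a + shift - mean) / sd)
    print(f"{label:14s} {approx:.4f}   (exact {exact:.4f})")
```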

Jensen's inequality

\(\quad\)If \(g\) is a convex function (i.e. shaped like a smiley face, so that \(g''(x) \ge 0\)):
  \[ \begin{aligned} \color{blue}{g(x)} &\color{blue}{\ge g(c) + g'(c) (x-c)} & \text{1st order Taylor expansion} \\ g(X) &\ge g(\mathbb{E}[X]) + g'(\mathbb{E}[X]) (X-\mathbb{E}[X]) & \text{let } c = \mathbb{E}[X] \\ \mathbb{E}[g(X)] &\ge \mathbb{E}[g(\mathbb{E}[X])] + \mathbb{E}\left[g'\big(\mathbb{E}[X]\big) \big(X-\mathbb{E}[X]\big)\right] & \text{take expectations; note }\mathbb{E}[X] \text{ is a constant} \\ \mathbb{E}[g(X)] &\ge g(\mathbb{E}[X]) + g'(\mathbb{E}[X]) (\mathbb{E}[X]-\mathbb{E}[X]) \\ \color{blue}{\mathbb{E}[g(X)]} &\color{blue}{\ge g(\mathbb{E}[X])} \\ \end{aligned} \]
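
\(\quad\) For example, taking the convex function \(g(x) = x^2\), Jensen recovers the familiar fact that variance is nonnegative:
  \[ \begin{aligned} \mathbb{E}[X^2] \ge (\mathbb{E}[X])^2 \quad \Longleftrightarrow \quad var(X) \ge 0 \\ \end{aligned} \]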

Hoeffding's inequality

\(\quad X_i\) are i.i.d. random variables, each equally likely to be \(-1\) or \(1\). Let \(a \gt 0\)
  \[ \begin{aligned} \mathbb{P}(X_1 + \cdots + X_n \ge na) &\le e^{-na^2/2} & \text{a much stronger bound than Chebyshev inequality} \\ \end{aligned} \]
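
\(\quad\) A simulation sketch (the choices \(n = 100\), \(a = 0.3\), and trial count are arbitrary) comparing the empirical tail with the Hoeffding bound and the Chebyshev bound \(\frac{1}{na^2}\):

```python
import numpy as np

# Hoeffding sketch: X_i = +/-1 with equal probability, so the sum of n of
# them has the same distribution as 2*Binomial(n, 1/2) - n.
rng = np.random.default_rng(0)
n, a, trials = 100, 0.3, 1_000_000

sums = 2 * rng.binomial(n, 0.5, size=trials) - n

empirical = np.mean(sums >= n * a)   # P(X_1 + ... + X_n >= na)
hoeffding = np.exp(-n * a**2 / 2)    # e^{-na^2/2}
chebyshev = 1 / (n * a**2)           # var(S_n)/(na)^2 = 1/(na^2)

print(f"empirical {empirical:.5f}  Hoeffding {hoeffding:.5f}  Chebyshev {chebyshev:.5f}")
```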

Source: MITx 6.041x, Lecture 18.

