Limit theorems#

Law of large numbers#

The sample average \(\overline X_n = \frac 1n \sum\limits_{k=1}^n X_k\) of an i.i.d. sample \(X_1, \ldots, X_n\) with finite mean \(\mathbb EX_1 = \mu\) converges to \(\mu\) in probability:

\[ \overline X_n \stackrel{P}{\to} \mu, \quad n\to \infty, \]

i.e., for all \(\varepsilon > 0\)

\[ \lim\limits_{n\to\infty}\mathbb P(\vert \overline X_n - \mu \vert > \varepsilon) = 0. \]
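The convergence is easy to observe numerically. A minimal NumPy sketch, assuming exponential summands with \(\mu = 1\) (the distribution is an illustrative choice, not fixed by the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. Exp(1) samples, so mu = E[X_1] = 1 (illustrative choice).
mu = 1.0
n = 100_000
x = rng.exponential(scale=mu, size=n)

# Running average X̄_k for k = 1, ..., n.
running_mean = np.cumsum(x) / np.arange(1, n + 1)

# The deviation |X̄_n - mu| shrinks as n grows (roughly like 1/sqrt(n)).
print(abs(running_mean[99] - mu))   # after 100 samples
print(abs(running_mean[-1] - mu))   # after 100 000 samples
```

Rerunning with a different seed changes the path of `running_mean` but not its limit.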

Central limit theorem#

If i.i.d. samples \(X_1, \ldots, X_n\) come from a distribution with finite variance \(\mathbb VX_1 = \sigma ^2\), then

\[ \mathbb E \overline{X}_n = \frac 1n \sum\limits_{k=1}^n \mathbb EX_k = \mu, \quad \mathbb V \overline{X}_n = \frac 1{n^2} \sum\limits_{k=1}^n \mathbb VX_k = \frac{\sigma^2}n. \]
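These two moment formulas can be checked by simulating many independent copies of \(\overline X_n\). A sketch assuming Uniform\([0, 1]\) summands, so \(\mu = 1/2\) and \(\sigma^2 = 1/12\) (again an illustrative distribution):

```python
import numpy as np

rng = np.random.default_rng(1)

# Uniform[0, 1] summands: mu = 1/2, sigma^2 = 1/12 (illustrative choice).
n, reps = 50, 20_000
means = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)

# Empirical moments of X̄_n across the repetitions:
print(means.mean())  # should be near mu = 0.5
print(means.var())   # should be near sigma^2 / n = 1/600
```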

The central limit theorem states that \(\overline{X}_n\) is approximately \(\mathcal N\big(\mu, \frac{\sigma^2}n\big)\) for large \(n\):

\[ Z_n := \frac{\sqrt n(\overline X_n - \mu)}{\sigma} \approx \mathcal N(0, 1) \text{ for } n \gg 1. \]

More precisely, \(Z_n\) converges to \(\mathcal N(0,1)\) in distribution, i.e.,

\[ \lim\limits_{n\to\infty}\mathbb P(Z_n \leqslant z) = \Phi(z), \quad \Phi(z) = \frac 1{\sqrt{2\pi}}\int\limits_{-\infty}^z e^{-\frac{t^2}2}dt. \]
*(Figures: histograms of the standardized sample average for increasing \(n\), approaching the standard normal curve.)*
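Convergence in distribution can be checked by comparing empirical probabilities \(\mathbb P(Z_n \leqslant z)\) with \(\Phi(z)\). A sketch assuming deliberately skewed Exp(1) summands, for which \(\mu = \sigma = 1\):

```python
import math

import numpy as np


def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


rng = np.random.default_rng(2)

# Skewed summands: Exp(1), so mu = sigma = 1 (illustrative choice).
mu, sigma = 1.0, 1.0
n, reps = 500, 20_000
samples = rng.exponential(mu, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - mu) / sigma

# Empirical P(Z_n <= z) vs. the limiting value Phi(z) at a few points.
for point in (-1.0, 0.0, 1.0):
    print((z <= point).mean(), Phi(point))
```

Even though the summands are far from normal, the empirical fractions already sit close to \(\Phi(z)\) at \(n = 500\).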

Exercises#

  1. Show that the sample variance \(\overline S_n = \frac 1n\sum\limits_{i=1}^n \big(X_i - \overline X_n\big)^2\) satisfies \(\overline S_n = \overline{X^2}_n - \big(\overline X_n\big)^2\), where \(\overline{X^2}_n = \frac 1n\sum\limits_{i=1}^n X_i^2\).

  2. The CLT and the histograms above show that \(\overline X_n \sim \mathcal N\big(\mu, \frac {\sigma^2}n\big)\) approximately. What about the sample variance? What would the distribution of \(\overline S_n\) look like for large \(n\)?