Independence and random vectors#
Independent events#
Events \(A\) and \(B\) are called independent if

\[
\mathbb P(A \cap B) = \mathbb P(A) \cdot \mathbb P(B).
\]
Conditional probability#
The probability of the event \(A\) given that another event \(B\) has occurred is called the conditional probability \(\mathbb P(A \vert B)\). If \(\mathbb P(B) > 0\) then

\[
\mathbb P(A \vert B) = \frac{\mathbb P(A \cap B)}{\mathbb P(B)}.
\]

Note that \(\mathbb P(A\vert B) = \mathbb P(A)\) if the events \(A\) and \(B\) are independent. This is quite natural: for independent events, knowing whether \(B\) has occurred does not affect \(\mathbb P(A)\).
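Here is a minimal sketch in Python (the fair-die example is my own, not from the text): for a fair die the events \(A\) = "even outcome" and \(B\) = "outcome does not exceed \(4\)" turn out to be independent, and the conditional probability coincides with the unconditional one.

```python
from fractions import Fraction

omega = set(range(1, 7))                 # sample space of a fair die

def P(event):
    """Uniform probability measure on omega."""
    return Fraction(len(event), len(omega))

A = {x for x in omega if x % 2 == 0}     # even outcome
B = {x for x in omega if x <= 4}         # outcome does not exceed 4

print(P(A), P(A & B) / P(B))             # P(A) = 1/2 and P(A|B) = 1/2 coincide
print(P(A & B) == P(A) * P(B))           # True: A and B are independent
```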
Law of total probability#
If \(\Omega = A_1 \bigsqcup A_2 \bigsqcup \ldots \bigsqcup A_n\) then

\[
\mathbb P(B) = \sum\limits_{k=1}^n \mathbb P(B \vert A_k)\, \mathbb P(A_k).
\]

This law of total probability also holds after conditioning on some event \(C\):

\[
\mathbb P(B \vert C) = \sum\limits_{k=1}^n \mathbb P(B \vert A_k \cap C)\, \mathbb P(A_k \vert C).
\]
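As an illustration (the numbers below are made up, not from the text): three machines produce \(50\%\), \(30\%\) and \(20\%\) of all items with defect rates \(1\%\), \(2\%\) and \(3\%\); the overall defect probability is the weighted sum of the conditional defect rates.

```python
# Law of total probability on a hypothetical "three machines" example.
p_machine = [0.5, 0.3, 0.2]          # P(A_k): which machine produced the item
p_defect_given = [0.01, 0.02, 0.03]  # P(B | A_k): defect rate of each machine

p_defect = sum(b * a for b, a in zip(p_defect_given, p_machine))
print(p_defect)                      # P(B) = 0.017
```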
Bayes’ rule#
From the equality

\[
\mathbb P(A \cap B) = \mathbb P(A \vert B)\,\mathbb P(B) = \mathbb P(B \vert A)\,\mathbb P(A)
\]

one can deduce Bayes’ theorem:

\[
\mathbb P(A \vert B) = \frac{\mathbb P(B \vert A)\,\mathbb P(A)}{\mathbb P(B)}.
\]

Bayes’ theorem is often combined with the law of total probability:

\[
\mathbb P(A_k \vert B) = \frac{\mathbb P(B \vert A_k)\,\mathbb P(A_k)}{\sum\limits_{j=1}^n \mathbb P(B \vert A_j)\,\mathbb P(A_j)}.
\]
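Continuing the same hypothetical machines example, Bayes’ theorem with the law of total probability in the denominator gives the posterior probability of each machine given that the item is defective:

```python
# Bayes' theorem combined with the law of total probability
# (same made-up "three machines" numbers as above).
p_machine = [0.5, 0.3, 0.2]          # priors P(A_k)
p_defect_given = [0.01, 0.02, 0.03]  # likelihoods P(B | A_k)

p_defect = sum(b * a for b, a in zip(p_defect_given, p_machine))       # P(B)
posterior = [b * a / p_defect for b, a in zip(p_defect_given, p_machine)]
print(posterior)                     # P(A_k | B): [0.294..., 0.352..., 0.352...]
```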
Random vectors#
A random vector \(\boldsymbol \xi = (\xi_1, \ldots, \xi_n) \in \mathbb R^n\) is just a vector of random variables. It can be either discrete or continuous depending on the types of \(\xi_k\). The pmf of a discrete random vector is a tensor

\[
p_{\boldsymbol \xi}(x_1, \ldots, x_n) = \mathbb P(\xi_1 = x_1, \ldots, \xi_n = x_n)
\]

with properties

\[
p_{\boldsymbol \xi}(x_1, \ldots, x_n) \geqslant 0, \qquad \sum\limits_{x_1, \ldots, x_n} p_{\boldsymbol \xi}(x_1, \ldots, x_n) = 1.
\]
The pdf \(p_{\boldsymbol \xi}(x_1, \ldots, x_n)\) of a continuous random vector is often called the joint density of the random vector \(\boldsymbol \xi\):

\[
\mathbb P(\boldsymbol \xi \in A) = \int\limits_A p_{\boldsymbol \xi}(x_1, \ldots, x_n)\, dx_1 \ldots dx_n, \quad A \subseteq \mathbb R^n.
\]
Joint density is a nonnegative function which integrates to \(1\).
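A small sketch of a discrete joint pmf as a two-dimensional tensor (the numbers are arbitrary, chosen only for illustration): the entries are nonnegative, sum to \(1\), and probabilities of events are sums of the corresponding entries.

```python
import numpy as np

# joint pmf p(x_i, y_j) of a pair (xi, eta) taking 2 x 3 possible values
pmf = np.array([[0.1, 0.2, 0.1],
                [0.2, 0.3, 0.1]])

print(pmf.sum())         # 1.0: the whole tensor sums to one
print(pmf[0, :].sum())   # P(xi = x_0): the probability of an event is a sum of entries
```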
Expectation of a random vector is calculated elementwise:

\[
\mathbb E \boldsymbol \xi = (\mathbb E \xi_1, \ldots, \mathbb E \xi_n).
\]
Covariance matrix of a random vector \(\boldsymbol \xi = (\xi_1, \ldots, \xi_n)\) is an \(n\times n\) matrix \(\mathrm{cov}(\boldsymbol \xi , \boldsymbol \xi) = \mathbb E(\boldsymbol\xi - \mathbb E \boldsymbol \xi)(\boldsymbol\xi - \mathbb E \boldsymbol \xi)^\mathsf{T}\). In other words,

\[
\mathrm{cov}(\boldsymbol \xi , \boldsymbol \xi)_{ij} = \mathbb E\big((\xi_i - \mathbb E \xi_i)(\xi_j - \mathbb E \xi_j)\big) = \mathrm{cov}(\xi_i, \xi_j), \quad 1 \leqslant i, j \leqslant n.
\]
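A minimal NumPy sketch (the correlated two-dimensional sample below is my own example, not from the text): the sample mean and sample covariance matrix estimate \(\mathbb E \boldsymbol \xi\) and \(\mathrm{cov}(\boldsymbol \xi, \boldsymbol \xi)\).

```python
import numpy as np

rng = np.random.default_rng(0)
xi = rng.normal(size=(10_000, 2))   # 10000 samples of a 2-dimensional vector
xi[:, 1] += 0.5 * xi[:, 0]          # make the coordinates correlated

print(xi.mean(axis=0))              # elementwise expectation, close to (0, 0)
print(np.cov(xi, rowvar=False))     # 2x2 matrix, roughly [[1, 0.5], [0.5, 1.25]]
```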
Independent random variables#
Discrete random variables \(\xi\) and \(\eta\) are called independent if

\[
\mathbb P(\xi = x, \eta = y) = \mathbb P(\xi = x)\, \mathbb P(\eta = y) \quad \text{for all } x, y.
\]
Similarly, two continuous random variables \(\xi\) and \(\eta\) are independent if their joint density \(p(x, y)\) is equal to the product of their densities:

\[
p(x, y) = p_\xi(x)\, p_\eta(y).
\]
If \(\xi\) and \(\eta\) are independent, their covariance is \(0\), and \(\mathbb V(\xi + \eta) = \mathbb V \xi + \mathbb V \eta\).
Random variables \(\xi_1, \ldots, \xi_n\) are called mutually independent if their joint pmf or pdf equals the product of the one-dimensional ones. In this case the random vector \(\boldsymbol \xi = (\xi_1, \ldots, \xi_n)\) has mutually independent coordinates. The covariance matrix of such a random vector is diagonal:

\[
\mathrm{cov}(\boldsymbol \xi, \boldsymbol \xi) = \mathrm{diag}(\mathbb V \xi_1, \ldots, \mathbb V \xi_n).
\]
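A quick simulation check of these facts (the particular distributions are chosen arbitrarily): for independently generated samples the empirical covariance is close to \(0\), and the variance of the sum is close to the sum of the variances.

```python
import numpy as np

rng = np.random.default_rng(42)
xi = rng.uniform(0, 1, size=100_000)       # samples of xi, independent of eta
eta = rng.exponential(2.0, size=100_000)   # samples of eta

print(np.cov(xi, eta)[0, 1])               # close to 0
print(np.var(xi + eta), np.var(xi) + np.var(eta))   # close to each other
```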
Multivariate normal distribution#
Multivariate normal (gaussian) distribution \(\mathcal{N}(\boldsymbol\mu, \boldsymbol\Sigma)\) is specified by its joint density

\[
p_{\boldsymbol \xi}(\boldsymbol x) = \frac{1}{(2\pi)^{n/2} \sqrt{\det \boldsymbol\Sigma}} \exp\Big(-\frac 12 (\boldsymbol x - \boldsymbol\mu)^\mathsf{T} \boldsymbol\Sigma^{-1} (\boldsymbol x - \boldsymbol\mu)\Big),
\]

where \(\boldsymbol x, \boldsymbol \mu\in\mathbb{R}^n\) and \(\boldsymbol\Sigma\) is a symmetric positive definite matrix of shape \(n\times n\).
If a random vector \(\boldsymbol \xi \sim \mathcal{N}(\boldsymbol\mu, \boldsymbol\Sigma)\) then

\[
\mathbb E \boldsymbol \xi = \boldsymbol \mu, \qquad \mathrm{cov}(\boldsymbol \xi, \boldsymbol \xi) = \boldsymbol\Sigma.
\]
Any linear transformation of a gaussian random vector is also gaussian: if \(\boldsymbol \xi \sim \mathcal{N}(\boldsymbol\mu, \boldsymbol\Sigma)\) and \(\boldsymbol \eta = \boldsymbol{A\xi} + \boldsymbol b\), then

\[
\boldsymbol \eta \sim \mathcal{N}(\boldsymbol{A\mu} + \boldsymbol b, \boldsymbol{A\Sigma A}^\mathsf{T}).
\]
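A sketch of both properties with NumPy (the particular \(\boldsymbol \mu\), \(\boldsymbol \Sigma\), \(\boldsymbol A\), \(\boldsymbol b\) below are arbitrary examples): sample from \(\mathcal N(\boldsymbol\mu, \boldsymbol\Sigma)\), apply a linear map, and check the mean and covariance of the result against \(\boldsymbol{A\mu} + \boldsymbol b\) and \(\boldsymbol{A\Sigma A}^\mathsf{T}\).

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

xi = rng.multivariate_normal(mu, Sigma, size=100_000)   # samples, shape (100000, 2)

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
b = np.array([3.0, 0.0])
eta = xi @ A.T + b                                       # eta = A xi + b for every sample

print(eta.mean(axis=0), A @ mu + b)                      # both close to A mu + b
print(np.cov(eta, rowvar=False), A @ Sigma @ A.T)        # both close to A Sigma A^T
```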
Exercises#
A family has two children, and it is known that at least one of them is a boy. What is the probability that both children are boys?
In the population, \(33.7\%\) of people have blood type I, \(37.5\%\) have type II, \(20.9\%\) have type III, and \(7.9\%\) have type IV. When transfusing blood, the blood types of the donor and the recipient must be taken into account:
a recipient with blood type IV can receive blood of any type;
recipients with blood types II and III can receive blood of the same type or of type I;
a recipient with blood type I can only receive blood of type I.
What is the probability that a transfusion is admissible for a randomly chosen donor–recipient pair?
Suppose that \(5\) men out of \(100\) and \(25\) women out of \(10000\) are colorblind. A colorblind person is chosen at random. What is the probability of them being male?
Show that the covariance matrix of any random vector is symmetric and positive semi-definite.
Find the pdf of a gaussian random vector \(\boldsymbol \xi \in \mathbb R^n\) with mutually independent coordinates.