Definition of probability (Kolmogorov axioms)

\[P(A) \in \mathbb{R}, \quad P(A) \geq 0 \quad \forall A \in F\]

where \(F\) is the event space.


\[P(\Omega) = 1 \]

where \(\Omega\) is the sample space.


For any countable sequence of pairwise disjoint events \(A_1, A_2, \dotsc\) (that is, \(A_i \cap A_j = \emptyset\) for \(i \neq j\)),

\[ P(A_1 \cup A_2 \cup \dotsb) = P(A_1) + P(A_2) + \dotsb\]
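For example (a hypothetical fair six-sided die, introduced here only for illustration), take \(\Omega = \{1, 2, \dotsc, 6\}\) with \(P(\{k\}) = \tfrac{1}{6}\) for each outcome \(k\). Every event then has nonnegative probability, \(P(\Omega) = 6 \cdot \tfrac{1}{6} = 1\), and for the disjoint events \(\{1\}\) and \(\{2\}\)

\[P(\{1\} \cup \{2\}) = P(\{1\}) + P(\{2\}) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}\]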

Consequences


\[\text{if } A \subseteq B \text{ then } P(A) \leq P(B)\]


\[ 0 \leq P(E) \leq 1 \quad \forall E \in F\]

\[P(A^c) = 1 - P(A)\]

\[P(A \cup B) = P(A) + P(B) - P(A \cap B)\]
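The complement rule, for example, follows from the second and third axioms: \(A\) and \(A^c\) are disjoint and \(A \cup A^c = \Omega\), so

\[1 = P(\Omega) = P(A \cup A^c) = P(A) + P(A^c)\]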

Definition of independence

We say that \(A\) and \(B\) are independent if

\[P(A \cap B) = P(A) P(B)\]
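For a concrete check (again using the hypothetical fair die from above), let \(A = \{2, 4, 6\}\) and \(B = \{1, 2, 3, 4\}\), so that \(P(A) = \tfrac{1}{2}\) and \(P(B) = \tfrac{2}{3}\). Then

\[P(A \cap B) = P(\{2, 4\}) = \tfrac{1}{3} = P(A) P(B)\]

so \(A\) and \(B\) are independent.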

Definition of conditional probability and Bayes' rule

We define the probability of \(A\) given \(B\), for \(P(B) > 0\), as

\[P(A \mid B) \equiv \frac{P(A \cap B)}{P(B)}\]

and similarly, for \(P(A) > 0\),

\[P(B \mid A) \equiv \frac{P(A \cap B)}{P(A)}\]
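As a quick illustration with the same hypothetical die, let \(A = \{2\}\) and \(B = \{2, 4, 6\}\):

\[P(A \mid B) = \frac{P(\{2\})}{P(\{2, 4, 6\})} = \frac{1/6}{1/2} = \frac{1}{3}\]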

Bayes' rule follows immediately from the definition:

\[P(A \mid B) P(B) = P(B \mid A) P(A)\]

\[P(A \mid B) = \frac{P(B \mid A) P(A)}{P(B)}\]

If \(A_1, A_2, \dotsc, A_n\) is a partition of \(\Omega\), then by the law of total probability \(P(B) = \sum_{j=1}^{n} P(B \mid A_j) P(A_j)\), so

\[P(A_i \mid B) = \frac{P(B \mid A_i) P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j) P(A_j)}\]
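As a worked example (the numbers are hypothetical, chosen only to illustrate the formula), suppose a condition \(D\) has prior probability \(P(D) = 0.01\), a test detects it with \(P(T \mid D) = 0.99\), and gives a false positive with \(P(T \mid D^c) = 0.05\). Using the partition \(\{D, D^c\}\),

\[P(D \mid T) = \frac{0.99 \cdot 0.01}{0.99 \cdot 0.01 + 0.05 \cdot 0.99} = \frac{0.0099}{0.0594} = \frac{1}{6} \approx 0.17\]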

Independence and conditional probabilities

If \(A\) and \(B\) are independent and \(P(B) > 0\), then \[P(A \mid B) = P(A)\]
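This follows by substituting the independence condition into the definition of conditional probability:

\[P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(A) P(B)}{P(B)} = P(A)\]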