Definition of probability (Kolmogorov axioms)

• Probability of an event $$A$$ is a non-negative real number

$P(A) \in \mathbb{R}, \quad P(A) \geq 0 \quad \forall A \in F$

where $$F$$ is the event space.

• Unitarity

$P(\Omega) = 1$

where $$\Omega$$ is the sample space.

• Countable additivity

For mutually exclusive events $$A_1, A_2, \dotsc$$ (i.e., $A_i \cap A_j = \varnothing \quad \forall i \neq j$),

$P(A_1 \cup A_2 \cup \dotsb) = P(A_1) + P(A_2) + \dotsb$
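As a sanity check, the three axioms can be verified on a small finite sample space. Below is a minimal sketch (not part of the original notes) using a fair six-sided die with the uniform measure as an illustrative assumption:

```python
from fractions import Fraction

# Fair six-sided die; the uniform measure assigns 1/6 to each outcome.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event & omega), len(omega))

# Axiom 1: non-negativity
assert all(P({o}) >= 0 for o in omega)

# Axiom 2: unitarity, P(Omega) = 1
assert P(omega) == 1

# Axiom 3: additivity for the mutually exclusive events {1, 2} and {5}
A, B = {1, 2}, {5}
assert A & B == set()
assert P(A | B) == P(A) + P(B)
```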

Consequences

• Probability of the empty set $P(\varnothing) = 0$

• Monotonicity

If $$A \subseteq B$$, then $P(A) \leq P(B)$

• Numeric bound

$0 \leq P(E) \leq 1 \, \forall E \in F$

• Complement

$P(A^c) = 1 - P(A)$

• Union

$P(A \cup B) = P(A) + P(B) - P(A \cap B)$
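Each of these consequences can likewise be checked numerically; a short sketch on a fair-die sample space (an illustrative assumption, not from the original notes):

```python
from fractions import Fraction

# Fair six-sided die with the uniform measure (illustrative assumption).
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    return Fraction(len(event & omega), len(omega))

A, B = {1, 2, 3}, {2, 3, 4, 5}

# Probability of the empty set
assert P(set()) == 0

# Monotonicity: {2, 3} is a subset of B
assert P({2, 3}) <= P(B)

# Numeric bound
assert 0 <= P(A) <= 1

# Complement
assert P(omega - A) == 1 - P(A)

# Union (inclusion-exclusion)
assert P(A | B) == P(A) + P(B) - P(A & B)
```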

Definition of independence

We say that $$A$$ and $$B$$ are independent if

$P(A \cap B) = P(A) P(B)$
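For a concrete instance, two events that depend on different dice in a two-dice experiment satisfy this product rule; a minimal sketch (the setup is assumed for illustration):

```python
from fractions import Fraction
from itertools import product

# Two fair dice: 36 equally likely ordered pairs (illustrative assumption).
omega = set(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 6}       # first die shows 6
B = {o for o in omega if o[1] % 2 == 0}   # second die is even

# A and B are independent: P(A ∩ B) = P(A) P(B)
assert P(A & B) == P(A) * P(B)
```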

Definition of conditional probabilities and Bayes rule

We define the probability of $$A$$ given $$B$$ (assuming $P(B) > 0$) as

$P(A \mid B) \equiv \frac{P(A \cap B)}{P(B)}$

and similarly

$P(B \mid A) \equiv \frac{P(A \cap B)}{P(A)}$

Bayes' rule immediately follows from the definition:

$P(A \mid B) P(B) = P(B \mid A) P(A)$

$P(A \mid B) = \frac{P(B \mid A) P(A)}{P(B)}$
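A worked numeric instance of Bayes' rule, with hypothetical numbers chosen purely for illustration (a rare condition $$A$$ and a positive test result $$B$$):

```python
from fractions import Fraction

# Hypothetical numbers for illustration only.
P_A = Fraction(1, 100)             # prior P(A)
P_B_given_A = Fraction(9, 10)      # P(B | A)
P_B_given_notA = Fraction(5, 100)  # P(B | A^c)

# P(B) via the partition {A, A^c}
P_B = P_B_given_A * P_A + P_B_given_notA * (1 - P_A)

# Bayes' rule: P(A | B) = P(B | A) P(A) / P(B)
P_A_given_B = P_B_given_A * P_A / P_B

print(P_A_given_B)  # 2/13: the posterior stays small despite P(B | A) = 9/10
```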

If $$A_1, A_2, \dotsc, A_n$$ is a partition of $$\Omega$$ then

$P(A_i \mid B) = \frac{P(B \mid A_i) P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j) P(A_j)}$

where the denominator equals $P(B)$ by the law of total probability.
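A sketch of the partition form with a hypothetical three-event partition (numbers chosen for illustration); note that the posteriors over a partition always sum to 1:

```python
from fractions import Fraction

# Hypothetical priors P(A_j) for a partition A_1, A_2, A_3 of Omega.
priors = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]
assert sum(priors) == 1

# Hypothetical likelihoods P(B | A_j).
likelihoods = [Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)]

# Denominator: P(B) by the law of total probability.
P_B = sum(l * p for l, p in zip(likelihoods, priors))

# Posteriors P(A_i | B) from Bayes' rule.
posteriors = [l * p / P_B for l, p in zip(likelihoods, priors)]

assert sum(posteriors) == 1
```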

Independence and conditional probabilities

If $$A$$ and $$B$$ are independent and $P(B) > 0$, then

$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(A) P(B)}{P(B)} = P(A)$
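A quick numeric check of this fact on a two-dice sample space (an illustrative setup, not from the original notes):

```python
from fractions import Fraction
from itertools import product

# Two fair dice: 36 equally likely ordered pairs (illustrative setup).
omega = set(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 6}       # first die shows 6
B = {o for o in omega if o[1] % 2 == 0}   # second die is even

# Conditioning on the independent event B does not change P(A).
P_A_given_B = P(A & B) / P(B)
assert P_A_given_B == P(A)
```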