Independence

If A, B \subset S and P(A) > 0, P(B) > 0, then in general it is not the case that P(A | B) = P(A). It is natural to ask when this equality does hold. With some thought we can see that it holds precisely when B does not affect A; in this case we say A is independent of B.

We can also see that if A is independent of B then we have

    \begin{align*} P(A | B) = \frac{P(A \cap B)}{P(B)} = P(A) \end{align*}

and multiplying both sides by P(B) gives P(A \cap B) = P(A)P(B). In fact we take this as the definition:

Definition: Two events A and B are called independent if

    \[P(A \cap B) = P(A)P(B)\]

and called dependent otherwise.
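As a quick sanity check of the definition, we can enumerate a small sample space and verify the product rule directly. The events below (first die even, dice sum to 7) are chosen for illustration and are not from the text:

```python
from fractions import Fraction

# Sample space: ordered outcomes of two fair dice.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # Uniform probability: |event| / |S|.
    return Fraction(len(event), len(S))

A = {(i, j) for (i, j) in S if i % 2 == 0}  # first die is even
B = {(i, j) for (i, j) in S if i + j == 7}  # the dice sum to 7

print(prob(A))                            # 1/2
print(prob(B))                            # 1/6
print(prob(A & B))                        # 1/12
print(prob(A & B) == prob(A) * prob(B))   # True: A and B are independent
```

Here P(A \cap B) = 1/12 = (1/2)(1/6), so A and B are independent even though neither determines the other.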


Exercise: Prove that if A is independent of itself then P(A) = 0 or P(A) = 1.


Theorem: If A and B are independent, then A and B^c are independent as well.

Proof: Write A as the disjoint union A = (A \cap B) \cup (A \cap B^c). Then

    \begin{align*} P(A \cap B^c) &= P(A) - P(A \cap B) \\ &= P(A) - P(A)P(B) \\ &= P(A)(1 - P(B)) = P(A)P(B^c), \end{align*}

so A and B^c are independent.

Definition: Events A, B, C are called independent if:

    \begin{align*}  P(A \cap B) &= P(A)P(B) \\  P(A \cap C) &= P(A)P(C) \\  P(B \cap C) &= P(B)P(C) \\  P(A \cap B \cap C) &= P(A)P(B)P(C).  \end{align*}
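The final condition on the triple intersection is not redundant: pairwise independence does not imply it. A standard illustration (not from the text) uses two fair coin flips, with C the event that the flips match:

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips.
S = list(product("HT", repeat=2))

def prob(event):
    # Uniform probability: |event| / |S|.
    return Fraction(len(event), len(S))

A = {s for s in S if s[0] == "H"}   # first flip is heads
B = {s for s in S if s[1] == "H"}   # second flip is heads
C = {s for s in S if s[0] == s[1]}  # the two flips match

# Each pair satisfies the product rule...
print(prob(A & B) == prob(A) * prob(B))  # True
print(prob(A & C) == prob(A) * prob(C))  # True
print(prob(B & C) == prob(B) * prob(C))  # True

# ...but the triple intersection does not.
print(prob(A & B & C))              # 1/4
print(prob(A) * prob(B) * prob(C))  # 1/8
```

Here A, B, C are pairwise independent, yet P(A \cap B \cap C) = 1/4 \neq 1/8, so they are not independent as a triple.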