# 8 Independent Events
## 8.1 Independent Events
Two events \(A\) and \(B\) are independent if \[P(A|B) = P(A)\] This means that the occurrence of event \(B\) does not alter the probability that event \(A\) occurs.
Example: A person rolls a pair of fair dice. We define three events:

- \(A\): “the sum of the two dice is 7”
- \(B\): “the sum of the two dice is 6”
- \(C\): “the first die shows 3”

Show that \(A\) and \(C\) are independent, while \(B\) and \(C\) are not.
Solution: The probabilities of events \(A\) and \(B\) are \[P(A) = \frac{6}{36} = \frac{1}{6}, \qquad P(B) = \frac{5}{36}\] Given that the first die shows 3, the sum is 7 exactly when the second die shows 4, and the sum is 6 exactly when the second die shows 3. Hence \[P(A|C) = P(B|C) = \frac{1}{6}\] Since \(P(A|C) = P(A)\), the events \(A\) and \(C\) are independent; since \(P(B|C) \ne P(B)\), the events \(B\) and \(C\) are dependent.

## Multiplication Rule for Independent Events

Theorem: If \(A\) and \(B\) are independent, then the multiplication rule is \[P(A\cap B) = P(A) P(B)\]
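Both the dice example above and the multiplication rule can be checked by brute-force enumeration of the 36 equally likely outcomes. The following Python sketch is an addition for illustration (the helper names `prob` and `cond_prob` are ad hoc, not from the text):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) under the uniform model, as an exact fraction."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

def cond_prob(event, given):
    """P(event | given), computed by restricting the sample space."""
    restricted = [o for o in outcomes if given(o)]
    return Fraction(sum(1 for o in restricted if event(o)), len(restricted))

A = lambda o: o[0] + o[1] == 7   # sum is 7
B = lambda o: o[0] + o[1] == 6   # sum is 6
C = lambda o: o[0] == 3          # first die shows 3

print(prob(A), cond_prob(A, C))        # 1/6 and 1/6  -> A and C independent
print(prob(B), cond_prob(B, C))        # 5/36 and 1/6 -> B and C dependent
print(prob(lambda o: A(o) and C(o)),   # P(A ∩ C) = 1/36 ...
      prob(A) * prob(C))               # ... equals P(A)P(C) = 1/36
```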
Example: A fair coin is tossed twice. What is the probability of tossing two heads?
Solution: Since the two tosses are independent, the multiplication rule yields \(P(HH) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}\).
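As a sanity check, the same probability can be estimated by simulation. The short sketch below is an addition (not part of the original solution) and should print a value close to 0.25:

```python
import random

random.seed(0)                   # fixed seed so the run is reproducible
trials = 100_000
hits = 0
for _ in range(trials):
    toss1 = random.choice("HT")  # first independent fair toss
    toss2 = random.choice("HT")  # second independent fair toss
    if toss1 == "H" and toss2 == "H":
        hits += 1
print(hits / trials)             # close to 1/4
```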
Definition: The events \(A_1, A_2, \ldots, A_n\) are independent if \[P(A_1 \cap A_2 \cap \ldots \cap A_n) = P(A_1)\times P(A_2)\times \ldots \times P(A_n)\] The events are pairwise independent if \[P(A_i \cap A_j) = P(A_i)\times P(A_j)\] for all \(1 \le i < j \le n\). The events are mutually independent if \[P(A_{i_1} \cap A_{i_2} \cap \ldots \cap A_{i_k}) = P(A_{i_1}) \times P(A_{i_2}) \times \ldots \times P(A_{i_k})\] for every choice of indices \(1 \le i_1 < i_2 < \ldots < i_k \le n\).
Remark: It is possible to have three events \(A, B, C\) such that each pair of events is independent, yet the three events are not mutually independent. Conversely, it is possible that \(P(A\cap B\cap C) = P(A)P(B)P(C)\) holds while the events are not pairwise independent.
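A standard illustration of the first claim (pairwise independence without mutual independence), added here and not taken from the text, uses two fair coin tosses with \(C\) the event that the two tosses agree. A minimal enumeration sketch:

```python
from fractions import Fraction
from itertools import product

# All 4 equally likely outcomes of two fair coin tosses.
outcomes = list(product("HT", repeat=2))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == "H"     # first toss is heads
B = lambda o: o[1] == "H"     # second toss is heads
C = lambda o: o[0] == o[1]    # the two tosses agree

# Every pair multiplies: P(X ∩ Y) = 1/4 = P(X) P(Y).
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(lambda o: X(o) and Y(o)) == prob(X) * prob(Y) == Fraction(1, 4)

# The triple does not: P(A ∩ B ∩ C) = 1/4, but P(A) P(B) P(C) = 1/8.
print(prob(lambda o: A(o) and B(o) and C(o)), prob(A) * prob(B) * prob(C))
```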
Example: A coal exploration company is set to look for coal mines in two states, Virginia and New Mexico. Let \(A\) be the event that a coal mine is found in Virginia and \(B\) the event that a coal mine is found in New Mexico. Suppose that \(A\) and \(B\) are independent events with \(P(A) = 0.4\) and \(P(B) = 0.7\). What is the probability that a coal mine is found in at least one of the two states?
Solution: Since \(A\) and \(B\) are independent, \(P(A\cap B) = P(A)P(B)\), so \[P(A\cup B) = P(A) + P(B) - P(A\cap B) = 0.4 + 0.7 - 0.4 \times 0.7 = 0.82\]

## Conditional Independence

Let \(A, B, C\) be events. \(A\) and \(B\) are said to be conditionally independent given \(C\) if and only if \(P(C) > 0\) and \[P(A \cap B | C) = P(A | C)P(B|C)\] When \(P(B \cap C) > 0\), this is equivalent to \[P(A|B \cap C) = P(A|C)\]
Example: A person rolls two fair dice. He sees that the first die shows 3, and someone tells him that the sum of the two results is even (so he can deduce that the second die is odd). Show that the results of the two dice are not conditionally independent given that the sum is even.
Solution: Let \(A\) and \(B\) denote the results of the first and second die, and let \(C\) be the event that the sum is even. As a counterexample, \[P(A = 3 \cap B = 4 | C) = 0\] since \(3 + 4 = 7\) is odd, while \[P(A = 3 | C) = \frac{1}{6}, \qquad P(B = 4 | C) = \frac{1}{6}\] Hence \(P(A = 3 \cap B = 4 | C) \ne P(A = 3 | C)P(B = 4 | C)\), so the two results are not conditionally independent given that the sum of the two results is even.
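The counterexample can again be verified by enumeration. The sketch below is an addition (the helper `cond_prob` is ad hoc) that conditions on the event that the sum is even:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def cond_prob(event, given):
    """P(event | given), computed by restricting the sample space."""
    restricted = [o for o in outcomes if given(o)]
    return Fraction(sum(1 for o in restricted if event(o)), len(restricted))

C = lambda o: (o[0] + o[1]) % 2 == 0    # the sum of the two dice is even
first_is_3 = lambda o: o[0] == 3
second_is_4 = lambda o: o[1] == 4

lhs = cond_prob(lambda o: first_is_3(o) and second_is_4(o), C)  # 0, since 3 + 4 is odd
rhs = cond_prob(first_is_3, C) * cond_prob(second_is_4, C)      # (1/6)(1/6) = 1/36
print(lhs, rhs)   # 0 and 1/36 -> not conditionally independent given C
```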