9  Bayes’ Theorem

9.1 Law of Total Probability

Definition (Partition of the Sample Space): The events \(A_1, A_2, ..., A_n\) partition the sample space \(\Omega\) if \(A_1 \cup A_2 \cup ... \cup A_n = \Omega\) and \(A_i \cap A_j = \emptyset\) for \(i \ne j\).

Theorem (Law of Total Probability): Let \(E\) be an event. If \(A_1, A_2, ..., A_n\) partition the sample space, then \[P(E) = P(A_1 \cap E) + P(A_2 \cap E) + ... + P(A_n \cap E)\]

This follows from the decomposition \(E = (A_1 \cap E) \cup (A_2 \cap E) \cup ... \cup (A_n \cap E)\), whose terms are mutually exclusive. Expanding each term into conditional probability form gives \[P(E) = \sum_{i=1}^n P(E | A_i) P(A_i)\]

9.2 Bayes’ Theorem

Theorem (Bayes’ theorem): Let \(A, B\) be events with \(P(B) \ne 0\). Then \[P(A|B) = \frac{P(B|A)P(A)}{P(B)}\] where

* \(P(A|B)\) is called the posterior probability of \(A\) given \(B\)
* \(P(B|A)\) is called the likelihood of \(A\) given a fixed \(B\)
* \(P(A)\) is the prior probability
* \(P(B)\) is the marginal probability
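As a quick numerical check, Bayes' theorem can be worked through with the two-event partition \(\{A, A^c\}\), using the law of total probability for the denominator. The probabilities below are hypothetical, chosen only for illustration:

```python
# Bayes' theorem with the two-event partition {A, A^c}.
# All numbers here are hypothetical, for illustration only.
p_a = 0.3                      # prior P(A)
p_b_given_a = 0.8              # likelihood P(B|A)
p_b_given_not_a = 0.2          # P(B|A^c)

# Marginal P(B) via the law of total probability over {A, A^c}.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Posterior P(A|B) = P(B|A) P(A) / P(B).
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.6316
```

Observing \(B\) raises the probability of \(A\) from the prior 0.3 to roughly 0.63, because \(B\) is four times more likely under \(A\) than under \(A^c\).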

Combined with the law of total probability, we have the below theorem.
Theorem: Suppose that the sample space \(\Omega\) is partitioned by mutually exclusive events \(A_1, ..., A_n\), with \(P(A_i) > 0\). Then for any event \(E\) with \(P(E) > 0\) and each \(1 \le i \le n\), \[\begin{align}P(A_i | E) &= \frac{P(E \cap A_i)}{P(E)} = \frac{P(E | A_i) P(A_i)}{P(E)} \\ &= \frac{P(E|A_i)P(A_i)}{\sum_{j=1}^n P(A_j)P(E|A_j)} \end{align}\]
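This partition form translates directly into code. The sketch below (the function name and the priors/likelihoods are ours, chosen for illustration) computes the posterior \(P(A_i|E)\) for every cell of the partition and checks that the posteriors sum to 1, as they must:

```python
def bayes_posterior(priors, likelihoods):
    """Posterior P(A_i | E) over a partition A_1, ..., A_n.

    priors[i] = P(A_i), likelihoods[i] = P(E | A_i).
    The denominator P(E) comes from the law of total probability.
    """
    p_e = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / p_e for p, l in zip(priors, likelihoods)]

# Hypothetical three-cell partition.
post = bayes_posterior([0.3, 0.5, 0.2], [0.1, 0.4, 0.25])

# Posteriors over a partition always sum to 1.
assert abs(sum(post) - 1) < 1e-12
```

Note that only the relative sizes of the products \(P(E|A_i)P(A_i)\) matter; the denominator is just the normalizing constant that makes the posteriors sum to 1.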

Example: A survey about a measure to legalize medical marijuana is taken in three states: Kentucky, Maine, and Arkansas. In Kentucky, 50% of voters support the measure; in Maine, 60% of the voters support the measure; and in Arkansas, 35% of the voters support the measure. Of the total population of the three states, 40% live in Kentucky, 25% live in Maine, and 35% live in Arkansas.
Given that a voter supports the measure, what is the probability that he/she lives in Maine?

Solution: Let \(L_K, L_M, L_A\) denote the events that a voter lives in Kentucky, Maine, or Arkansas, respectively. Let \(S\) denote the event that a voter supports the measure. We want to find \(P(L_M | S)\). By Bayes’ formula we have: \[\begin{align}P(L_M|S) &= \frac{P(S|L_M)P(L_M) }{P(S|L_K)P(L_K) + P(S|L_M)P(L_M) + P(S|L_A)P(L_A)} \\ &= \frac{(0.6)(0.25)}{(0.5)(0.4)+(0.6)(0.25)+(0.35)(0.35)} \approx 0.3175 \end{align}\]
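The arithmetic above can be verified in a few lines. Here the state populations play the role of the priors and the support rates the likelihoods (the labels `KY`, `ME`, `AR` are just dictionary keys introduced for this sketch):

```python
# Survey example: priors = share of population per state,
# likelihoods = support rate for the measure in that state.
priors = {"KY": 0.40, "ME": 0.25, "AR": 0.35}
support = {"KY": 0.50, "ME": 0.60, "AR": 0.35}

# P(S) by the law of total probability over the three states.
p_s = sum(priors[st] * support[st] for st in priors)

# P(L_M | S) by Bayes' theorem.
p_me_given_s = support["ME"] * priors["ME"] / p_s
print(round(p_me_given_s, 4))  # 0.3175
```

Even though Maine has the highest support rate, the posterior is below 0.5 because Maine holds only 25% of the combined population.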

Example: A student takes an examination consisting of 20 true-false questions. The student knows the answer to \(N\) of the questions, which are answered correctly, and guesses the answers to the rest. The conditional probability that the student knows the answer to a question, given that the student answered it correctly, is \(0.824\). Calculate \(N\).

Solution: Let \(K\) be the event that the student knows the answer to a particular question, and \(C\) be the event that the student answers the question correctly. The student answers correctly every question he or she knows, so \(P(C|K) = 1\), and a guess on a true-false question is correct with probability \(P(C|K^c) = 0.5\). Then \[ P(K|C) = \frac{P(C|K)P(K)}{P(C)} = \frac{P(K)}{P(C|K)P(K) + P(C|K^c)P(K^c)} = \frac{N/20}{N/20 + 0.5(1 - N/20)} = 0.824\] Solving for \(N\), we obtain \(N = 14\).
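Since \(N\) must be an integer between 0 and 20, one way to check this answer is to scan every candidate value and keep the one whose posterior is closest to 0.824 (the helper name below is ours, introduced for this sketch):

```python
# P(K|C) as a function of N, per the derivation above:
# P(C|K) = 1 for known questions, P(C|K^c) = 0.5 for guesses.
def p_know_given_correct(n, total=20):
    p_k = n / total
    return p_k / (p_k + 0.5 * (1 - p_k))

# Pick the integer N whose posterior is closest to 0.824.
best = min(range(21), key=lambda n: abs(p_know_given_correct(n) - 0.824))
print(best)  # 14
```

With \(N = 14\) the posterior is \(0.7 / 0.85 \approx 0.8235\), which rounds to the stated 0.824.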