3.11 Bayes’ rule

Suppose we have a probability assigned to an event \(A\) and we then observe an event \(E\) that is relevant to \(A\). How should we update the probability assigned to \(A\) in light of the relevant information from \(E\)? An example is a personal probability that it will rain next Saturday. Then on Friday, it rains throughout the day and into the evening. Your personal probability might increase now that you have observed rain on the day just before the one in question. But by how much exactly should it increase? This is a useful question, and it is answered using Bayes’ rule.

Bayes’ rule is a mathematical rule for updating probability assignments in light of new information. For an event of interest \(A\) and new information from an event \(E\), the probability version of Bayes’ rule is: \[ \text{probability of }A\text{ conditioned on }E = \frac{\text{probability of }E\text{ conditioned on }A \times \text{probability of }A}{\text{probability of }E}.\] It provides a means of obtaining a conditional probability for the event of interest based upon the new information (the left-hand side of the equation) by using probabilities that may already be assigned (the right-hand side of the equation).
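To illustrate, return to the rain example with some purely assumed values: suppose the prior probability of rain next Saturday is \(0.3\), the probability of rain on Friday conditioned on rain next Saturday is \(0.6\), and the overall probability of rain on Friday is \(0.4\). Bayes’ rule then gives \[\text{probability of rain Saturday conditioned on rain Friday} = \frac{0.6 \times 0.3}{0.4} = 0.45,\] so under these assumed values the observation of rain on Friday raises the probability from \(0.3\) to \(0.45\).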

For the odds of two events \(A\) and \(B\) in light of \(E\), Bayes’ rule can also be expressed as: \[ \frac{\text{probability of }A\text{ conditioned on }E}{\text{probability of }B\text{ conditioned on }E}=\frac{\text{probability of }E\text{ conditioned on }A}{\text{probability of }E\text{ conditioned on }B}\times \frac{\text{probability of }A}{\text{probability of }B}.\] The ratio on the left-hand side of this equation is the posterior odds of \(A\) to \(B\) conditioned on \(E\). The ratio on the far right-hand side is the prior odds of \(A\) to \(B\). When expressed like this, Bayes’ rule provides a link between the prior and posterior odds:
\[\text{posterior odds of }A\text{ to } B=\frac{\text{probability of }E\text{ conditioned on }A}{\text{probability of }E\text{ conditioned on }B}\times \text{prior odds of }A\text{ to }B.\]

This rule states that the posterior odds of \(A\) to \(B\) (conditioned on \(E\)) are the product of the prior odds of \(A\) to \(B\) and the ratio of probabilities for \(E\) conditioned on \(A\) and on \(B\), respectively. This ratio is known as the Bayes factor or, in this context, the likelihood ratio (LR). The LR acts as the updating factor applied to the prior odds in light of the event \(E\). It describes how much more (or less) probable event \(E\) is when conditioned on \(A\) than when conditioned on \(B\). We focus on the LR in Chapter 5.
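As a sketch with assumed values: suppose the prior odds of \(A\) to \(B\) are \(0.3/0.7 \approx 0.43\), the probability of \(E\) conditioned on \(A\) is \(0.6\), and the probability of \(E\) conditioned on \(B\) is \(0.2\). The LR is then \(0.6/0.2 = 3\), and \[\text{posterior odds of }A\text{ to }B = 3 \times \frac{0.3}{0.7} \approx 1.29,\] so observing \(E\) moves the odds from favouring \(B\) (odds below one) to favouring \(A\) (odds above one).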

Bayes’ rule gives us mathematical expressions that probability assignments must obey. This is another means of ensuring that conditional probability assignments are logically coherent. It does not remove subjectivity from the probability or odds assignment, but it does remove subjectivity from how that assignment should be updated in light of new information.

Bayes’ rule switches the conditioning information as we move from the LR to the posterior odds. Probabilities for \(E\) conditioned on \(A\) and \(B\) in the LR are switched to probabilities for \(A\) and \(B\) conditioned on \(E\) in the posterior odds. This is known as transposing the conditional, and Bayes’ rule gives the correct way to transpose it using the logic of probability. The legal domain presents a tempting trap of incorrectly transposing the conditional, known as the prosecutor’s fallacy. We show this in Chapter 4.
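A small numerical example (again with assumed values) shows why the two conditional probabilities cannot simply be equated. Suppose the probability of \(A\) is \(0.01\), the probability of \(E\) conditioned on \(A\) is \(0.99\), and the probability of \(E\) conditioned on not-\(A\) is \(0.05\). The probability of \(E\) is then \(0.99 \times 0.01 + 0.05 \times 0.99 = 0.0594\), and Bayes’ rule gives \[\text{probability of }A\text{ conditioned on }E = \frac{0.99 \times 0.01}{0.0594} \approx 0.17,\] which is far from the \(0.99\) obtained by naively swapping the conditioning.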