3.10 Odds
Odds are another way of expressing probabilities. Odds are given as a ratio of probabilities so that it is clear how much more or less likely one event is compared to another. In this sense, probability is an absolute measure of uncertainty, whilst odds are a relative measure of uncertainty. Any two probabilities can be compared to create odds. One pair which is commonly used is the probability of an event occurring versus the probability of the same event not occurring.
For example, the coin toss was just as likely to result in a head as it was to result in a tail. This is expressed as odds written as 1:1, which is spoken as ‘1-to-1’ or more commonly ‘evens.’ Odds of 1:1 mean that we can separate the possible outcomes into a total of \(1+1=2\) parts which are equally probable. One part represents the probability of a head and the other part represents the probability of a tail.
We can convert from odds to probability for exhaustive events as follows. For the single coin toss we had odds of 1:1 for heads. The total probability, which must be 1, is made up of 2 equally sized parts. One of these parts represents a head, and so the probability of a head (and similarly for a tail) is \(\frac{1}{2}=0.5\). Suppose instead we have odds of 1:9 for an event occurring versus not occurring. This means that the total probability, which must be 1, is made up of 10 equally sized parts. One of these ten parts represents the event occurring, and so its probability is \(\frac{1}{10}=0.1\). The other nine parts represent the event not occurring, and so the probability of non-occurrence is \(\frac{9}{10}=0.9\). Odds which represent probabilities before any conditioning occurs (or which are conditioned only on background information) are known as prior odds. The conversion from odds to probabilities is more challenging when the events are not exhaustive, but we do not consider that here.
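This odds-to-probability conversion for exhaustive events can be sketched in a few lines of Python (the function name is illustrative, not from the text; exact fractions are used to avoid rounding):

```python
from fractions import Fraction

def odds_to_probability(a, b):
    """Convert odds a:b (event occurring vs. not occurring, with the
    two events exhaustive) into the probability of the event occurring:
    one of a+b equally sized parts."""
    return Fraction(a, a + b)

# Evens (1:1): one of two equal parts, so probability 1/2.
print(odds_to_probability(1, 1))   # 1/2
# Odds of 1:9: one of ten equal parts, so probability 1/10,
# with the remaining 9/10 for non-occurrence.
print(odds_to_probability(1, 9))   # 1/10
print(1 - odds_to_probability(1, 9))  # 9/10
```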
Converting from probabilities to odds is much simpler since we only measure how much bigger one probability is than the other. For example, the probability of player 1 winning the double coin toss game by getting double heads was 0.25. That meant that the probability of player 1 not winning before any coins had been tossed was 0.75. Since the probability of player 1 not winning (0.75) was three times larger than the probability of player 1 winning (0.25), the prior odds of player 1 winning were 1:3.
Prior odds can be updated using conditioning information. These updated odds are known as posterior odds. For example, as part of the double coin toss game we conditioned on the result of coin 1 being a head. Before the conditioning, the probability of player 1 winning was 0.25; the prior odds of player 1 winning were 1:3. Conditioning on coin 1 being a head resulted in the probability of player 1 winning rising to 0.5. This means that the posterior odds (i.e. after observing coin 1 being a head) of player 1 winning versus not winning were evens (1:1) for that round.
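The update from prior to posterior odds in the double coin toss game can be checked by enumerating the equally likely outcomes directly. This is a sketch of that counting argument, not a method from the text:

```python
from fractions import Fraction
from itertools import product

# The four equally likely outcomes of tossing two coins.
outcomes = list(product("HT", repeat=2))

def odds_of_winning(wins, total):
    """Odds of winning vs. not winning, as a reduced (for, against) ratio."""
    ratio = Fraction(wins, total - wins)
    return ratio.numerator, ratio.denominator

# Prior: player 1 wins only on double heads (1 outcome out of 4).
wins = [o for o in outcomes if o == ("H", "H")]
print(odds_of_winning(len(wins), len(outcomes)))  # (1, 3)

# Posterior: condition on coin 1 being a head (2 outcomes remain).
conditioned = [o for o in outcomes if o[0] == "H"]
wins_given_head = [o for o in conditioned if o == ("H", "H")]
print(odds_of_winning(len(wins_given_head), len(conditioned)))  # (1, 1)
```

Conditioning discards the outcomes where coin 1 was a tail, leaving one winning outcome out of two, i.e. evens.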
Another reason that odds are useful is because of the simplicity that they give to a very important mathematical result, which we discuss in the next section.