Chapter 3. A world of uncertainty – and how to measure it
1. Just how likely is that?
We noted in the Introduction that we live in an uncertain world. But not everything, we
believe, is equally uncertain. Some things are more likely than others, or so we think. It is,
for example, fortunately much more likely that the temperature tomorrow will be between
0 and 40 degrees Celsius than that it will be between 100 and 150 degrees. Very often it is
important to assess how much more likely certain things are to happen than others:
crossing a busy street, we have to assess how likely it is that we can get to the other side
before a car or bus gets too close; how likely it is that it will rain while we are out; how
likely it is that we will need additional insurance when renting a car or travelling abroad;
and so on. Important decisions that we make on a daily basis require some input of such
likelihoods, sometimes just qualitative, but for important decisions it is often helpful to use
actual number-estimates. National meteorological offices often tell people what the
numerical chance of rain is for today or tomorrow: if they are bold, or rash, enough, they
may extend their estimates even to the next week. Newspapers cull scientific journals to
tell their readers what the latest probabilities are reckoned to be of contracting heart
disease or cancer if you smoke or drink alcohol, and how much greater these probabilities
are in proportion to the amount you smoke or drink (or both).
I pointed out in the Introduction that a common alternative to probability-speak is odds-
speak. Betting odds were in any event first on the scene historically as the numerical
measures of uncertainty, because gambling is as old as the hills, gambling involves
numbers (gains and losses), and the odds a gambler is prepared to give represents his/her
degree of belief in terms of how much he/she is prepared to risk in order to gain a given
quantity if the prediction bet on comes true. Not only does the measurement of uncertainty
in terms of betting odds have an ancient pedigree, but it also offers an illuminating way of
approaching the theory of probability. It is the approach we shall adopt here, and the
starting point will be a look at the structure of simple bets and their associated odds. 2. Odds and bets.
Suppose we’re looking at a bet between two people, call them Esther and Lester, about the
truth of a factual statement A. If A turns out to be true one of the two bettors gains a sum of
money from the other, and if A is false they lose a sum to the other. We will assume that the
sums are non-negative amounts of money and are not both 0 (otherwise, for obvious
reasons, there is simply no bet). The person who gains if A is true is said to be betting on A,
and the person who gains if A is false is betting against A. Since ~A is true just in case A is
false, the person betting against A is actually betting on ~A. I shall come back to this point
in a moment.
In the bet depicted in Table 1 below, Esther is betting on A and Lester betting against A.
Esther’s stake in the bet is the amount she stands to lose if A turns out to be false, and
Lester’s stake is how much he stands to lose if A turns out to be true. In Table 1 Esther’s
stake is r and Lester’s is q. The currency can be assumed to be dollars but is otherwise
immaterial, so I will omit dollar signs and just write r and q.
A     Esther gains     Lester gains
t     q                -q
f     -r               r
Table 1
The odds on A in this bet are defined to be the ratio of the bettor-on’s stake, i.e. Esther’s, to
the bettor-against’s, i.e. Lester’s. Thus the odds on A in Table 1 are r:q. The odds against A
are ratio of Lester’s stake to Esther’s, i.e. q:r, and hence are the reciprocal of the odds on A.
Since odds are a ratio, odds of r:q can equally be expressed as kr:kq for any positive k. For example,
if Esther’s stake is 100 and Lester’s 20 the odds are 5:1. Where the odds are given as ratios
of whole numbers they are usually expressed in the form where the numbers are relatively
prime, i.e. have no factor in common other than 1. Thus odds of 100:20 would usually be
stated as 5:1.
But the same odds can also be stated in terms of fractional, or rational, numbers: thus odds
of m:n for whole numbers m and n are also equal to m/n:1, and to m/k:n/k for any positive
whole number k. For the sake of a smooth treatment, it is usual to go one step further and
suppose odds can be stated as ratios of any real numbers, i.e. any non-negative numbers
including those which may have non-terminating decimal expansions (though you won't
see people at race-tracks offering such odds).
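As a small illustration (a Python sketch of my own, not part of the text's formal development), reducing whole-number odds to relatively prime form is just a matter of dividing out the greatest common divisor:

```python
from math import gcd

def reduce_odds(a: int, b: int) -> tuple[int, int]:
    """Reduce whole-number odds a:b to relatively prime form."""
    g = gcd(a, b)
    return a // g, b // g

# Odds of 100:20 reduce to 5:1, as in the text.
print(reduce_odds(100, 20))  # (5, 1)
```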
Exercise 1. Suppose a bet is taking place at odds a:b on A.
i. If the stake of the person betting on A is r, what is the stake of their opponent in terms of
r, a and b?
ii. If the stake of the person betting against A is q, what is the stake of their opponent in terms
of q, a and b?
Cultural Note. In traditional horserace betting, odds are advertised by a bookmaker, and
any member of the public, known as the punter, can deposit a stake. Often the bookmaker’s
odds are against an event occurring. Once the punter’s stake is deposited the bet goes
ahead. In the present discussion we will ignore the conventional distinction between
bookmaker and punter and regard Esther and Lester simply as equal parties to a bet.
At this point we can make an interesting observation that we touched on earlier. Since A is
true just when ~A is false, the bet in Table 1 is identical to a bet on ~A, as depicted in Table
2 below:
~A    Lester gains    Esther gains
t     r               -r
f     -q              q
Table 2
Thus Lester's bet at odds of q:r against A is actually a bet on ~A at those odds. Of course,
Table 2 is only superficially different from Table 1, since the bets, considered simply as
payoffs depending on the truth-value of A, are identical. We shall come back to this
important point later.
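The zero-sum structure of Tables 1 and 2 can be sketched in a few lines of Python (the function names here are my own, purely illustrative):

```python
def esther_payoff(a_true: bool, q: float, r: float) -> float:
    """Esther bets on A: she gains q if A is true, loses r if A is false."""
    return q if a_true else -r

def lester_payoff(a_true: bool, q: float, r: float) -> float:
    """Lester bets against A, i.e. on ~A: he gains r if A is false, loses q if true."""
    return r if not a_true else -q

# Whatever the truth-value of A, one party gains exactly what the other loses.
for a_true in (True, False):
    assert esther_payoff(a_true, q=20, r=100) == -lester_payoff(a_true, q=20, r=100)
```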
Since Esther gains (loses) exactly what Lester loses (gains), there is no need to list more
than Esther's gains and losses, so from now on I will just list Esther's side of the payoff table.
3. Odds and betting quotients
Since the ratio of Esther’s stake to Lester’s is the odds on A, by definition, once we know
their respective stakes we automatically know the odds. But since the odds are only a ratio,
by themselves they don’t tell us what the stakes are. However, if we know both the odds
and the total stake, i.e. the sum of the individual stakes, we can calculate the individual
stakes r and q of Esther and Lester. For if S is the total stake and a:b the odds on A, then we
know that r = ka and q = kb for some k, and so S = r+q = ka+kb = k(a+b). Hence k =
S/(a+b) and so r = aS/(a+b) and q = bS/(a+b), giving the following equivalent
representation of the bet in Table 1:
A     Esther gains
t     bS/(a+b)
f     -aS/(a+b)
Table 3
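The stake formulas r = aS/(a+b) and q = bS/(a+b) are easy to check numerically; here is a minimal Python sketch (the function name is illustrative):

```python
def stakes(a: float, b: float, S: float) -> tuple[float, float]:
    """Split a total stake S according to odds a:b on A.
    Returns (r, q): Esther's stake r = aS/(a+b), Lester's q = bS/(a+b)."""
    return a * S / (a + b), b * S / (a + b)

r, q = stakes(5, 1, 60)
print(r, q)  # 50.0 10.0 -- the stakes sum to 60 and stand in the ratio 5:1
```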
Exercise 2. For the following sets of odds a:b on A and total stakes S, calculate the individual stakes of
Esther and Lester:
i. a:b = 19:1, S = 40
ii. a:b = 3:5, S = 16
iii. a:b = 7:3, S = 100.
If we put p = a/(a+b), then 1-p = b/(a+b), and we can write Table 3 more economically as
Table 4:
A     Esther gains
t     S - pS = S(1-p)
f     -pS
Table 4
The number p = a/(a+b) determined by the odds a:b will be of great importance in what
follows. Note that however the odds are expressed, i.e. as ka:kb for any k, p is always the
same number a/(a+b), because ka/(ka+kb) = a/(a+b). Hence p is independent of the
form in which the odds are expressed, and it is called the betting quotient on A determined
by those odds. Obviously, 0 ≤ p ≤ 1 since 0 ≤ a ≤ a+b. Given the odds a:b on A, we can always find
p since p = a/(a+b), and conversely given p we can always find the odds, since they are
equal to pS:(1-p)S = p:1-p.
We can sum these facts up as follows. The bet in Table 1 always has an equivalent
expression in terms of S and p, and conversely, every bet given in terms of p and S, as in
Table 4, has an equivalent expression in terms of the odds and the individual stakes, as in
Table 1. We shall use these facts a lot in what follows.
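The two conversions, odds to betting quotient and back, can be sketched in Python (an illustrative check, with function names of my own choosing):

```python
from math import isclose

def betting_quotient(a: float, b: float) -> float:
    """The betting quotient p = a/(a+b) determined by odds a:b on A."""
    return a / (a + b)

def odds_from_quotient(p: float) -> tuple[float, float]:
    """The odds p:(1-p) determined by a betting quotient p."""
    return p, 1 - p

# p is unchanged if the odds are rescaled by any positive k ...
assert isclose(betting_quotient(5, 1), betting_quotient(100, 20))
# ... and quotient -> odds -> quotient is a round trip.
a, b = odds_from_quotient(0.3)
assert isclose(betting_quotient(a, b), 0.3)
```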
Exercise 3.
i. What value of p corresponds to odds of 1:1 (so-called even-money odds, often just called 'evens')?
ii. What odds correspond to p = 1/3?
iii. What odds correspond to p = 1?
iv. What odds correspond to p = 0?
v. If p = 0.06, what are the odds a:b where a and b are relatively prime whole numbers?
[Optional. Since the odds ratio p:(1-p) is the same as x:1 where x = p/(1-p), we can in
principle regard the odds as represented by a single number x whose possible values are
the closed interval from 0 to ∞ (adding ∞ as the ‘number’ corresponding to p = 1 is
technically known as the ‘one-point compactification’). As we know, corresponding to each
value of x is the betting quotient p = x/(1+x) taking values between 0 and 1 inclusive. If we
graph p on the vertical axis against x on the horizontal axis (see figure ? below) we find that
p starts off near 0 at a gradient of approximately 1 and gradually flattens as it approaches 1
as x tends to infinity (1 is called the asymptote of the curve). Those with elementary
calculus can easily prove that the gradient of the curve does actually tend to 1 as x tends to
0, which means that small odds are approximately equal to small probabilities.]
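Those without calculus can confirm the gradient claim numerically; for instance, in Python (a rough finite-difference check of my own, not a proof):

```python
def p_of_x(x: float) -> float:
    """The betting quotient p = x/(1+x) corresponding to odds x:1."""
    return x / (1 + x)

# Finite-difference estimate of the gradient dp/dx near x = 0.
h = 1e-6
gradient_near_zero = (p_of_x(h) - p_of_x(0.0)) / h
print(gradient_near_zero)  # very close to 1: small odds ~ small probabilities
```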
4. Fair odds and fair betting quotients
When we offer or accept specific odds in a bet we usually do so because we feel that they
are advantageous to us. This means that they do not reflect our true opinion of the likely
truth of the proposition in question. Looking at Table 1 again, keep q fixed and let r vary.
There will almost certainly be some value of r such that, at it and all larger values, you
would think the bet definitely disadvantageous to Esther, because you think she would
be risking a disproportionately large sum in order to receive q if A is true – by which I
mean disproportionately relative to your evaluation of A’s probability of being true.
Conversely, the smaller r is the more advantageous the bet is to Esther, because you think
she’s risking a disproportionately small amount given your assessment of A’s probability.
At the extreme, suppose A is a logical falsehood. Then if Esther were willing to risk any
positive amount r in a bet on A she would be certain to lose it, and so the bet would
certainly be disadvantageous to her. If A were a logical truth and r any finite number,
then Esther risks nothing to gain q, and so the advantage is hers for any finite value of r. Now
let us idealise a little and suppose that there is a unique r such that, given your assessment
of A’s probability, all smaller values give Esther the advantage in the bet, and all larger give
Lester the advantage. Call this your fair value of r.
To sum up: your fair r, call it r*, increases with your assessment of the probability of A. This
unique r*, for given q, also fixes the value of p, call it p*, at p* = r*/(r*+q). But it is an easy
exercise to show that as r increases, for that q, so does p. Hence as your assessment of A's
probability increases, so does the value of p which balances the advantages of Esther and
Lester. The fact that 0 ≤ p ≤ 1, i.e. p lies in the numerical scale of probabilities, suggests using
p* as your numerical evaluation of how probable A is.
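The monotone relationship between r and p = r/(r+q), for fixed q, is easy to verify numerically; a short illustrative Python check:

```python
def quotient(r: float, q: float) -> float:
    """The betting quotient p = r/(r+q) fixed by stakes r and q."""
    return r / (r + q)

q = 10.0
values = [quotient(r, q) for r in (1, 5, 10, 50, 100)]
# As r grows with q held fixed, p increases strictly towards 1.
assert values == sorted(values)
print(values)
```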
But before we can do this we have to attend to the bothersome fact that all this is taking
place relative to a given value of q, so strictly we should write r* = f(q) to indicate that in
principle r*, and hence also p*, since p* = r*/(r*+q), depends on q. But we don’t want p* to
depend on some arbitrary value of Lester’s stake: your probability estimates don’t depend
on who’s staking what, but just – in this case - on how likely you think A is to be true, and
that certainly doesn’t depend on q. So we must make p* independent of q. In other words,
we want p* = f(q)/(f(q) + q) to be a constant. Dividing top and bottom by f(q) we get p* =
1/(1 + q/f(q)). So q/f(q) must itself be a constant. This means that f(q) must be
proportional to q, i.e. f(q) = kq for some k. In other words, we require that your fair r be
proportional to q.
But it seems at the very least question-begging that r* should increase proportionately to q.
What if q is trillions of dollars? There is something called the diminishing marginal utility of
money (the value of an incremental dollar depends on how much you already have). How
does anyone know, or why should they believe, that r* increases in proportion to q when
the latter, metaphorically, goes through the roof? Surely, as the popular saying goes, and
quite accurately in this case, all bets are off then. We want to use p* as a measure of a
quantity, your degree of belief, much like a thermometer is a measure of
temperature. But just as thermometers, whether mercury, alcohol or otherwise, are only reliable within
physically-imposed limits, so this measure can’t be expected to be reasonable, or even
meaningful, when enormous sums of money are used as stakes.
One standard escape route is to use utilities as stakes, because by construction utility is
linear, unlike money (though this is known to raise other difficulties which we
won't go into here).