In January 2018 the BBC News Web site announced that in the three months before November 2017, "UK unemployment fell by 3,000 to 1.44 million." The reason for this fall was debated, but nobody questioned whether the figure itself was accurate. Yet forensic scrutiny of the U.K. Office for National Statistics Web site revealed that the margin of error on this total was plus or minus 77,000—in other words, the true change could have been anywhere between a fall of 80,000 and a rise of 74,000, and a more honest headline would have been "UK unemployment may have gone up or gone down."

Although journalists and politicians appear to believe this claimed decline of 3,000 was a fixed, immutable tally of the entire country, it was in fact an imprecise estimate based on a survey of around 100,000 people. Similarly, when the U.S. Bureau of Labor Statistics reported a seasonally adjusted rise in civilian unemployment of 69,000 from December 2017 to January 2018, this was based on a sample of around 60,000 households and had a margin of error (again rather difficult to find) of plus or minus 300,000. Unemployment counts based on surveys of businesses can also be subject to substantial revisions, such as the announcement in August 2019 that there were 500,000 fewer jobs than previously thought.
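The arithmetic behind these headlines is simple: the true value lies somewhere within the point estimate plus or minus the margin of error. A minimal sketch, using the figures quoted above:

```python
def interval(estimate, margin):
    """Range implied by a point estimate and its margin of error."""
    return estimate - margin, estimate + margin

# UK: a reported fall of 3,000 with a margin of error of +/-77,000
uk_low, uk_high = interval(-3_000, 77_000)
# -> (-80000, 74000): anywhere from a fall of 80,000 to a rise of 74,000

# US: a reported rise of 69,000 with a margin of error of +/-300,000
us_low, us_high = interval(69_000, 300_000)
# -> (-231000, 369000): the sign of the change is entirely uncertain
```

In both cases the margin of error dwarfs the reported change, which is why the honest headline is "may have gone up or gone down."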

Although numbers are often treated as cold, hard facts, perhaps we need to be more willing to admit how uncertain they can be.

We are all (perhaps with the exception of some politicians) happy to acknowledge uncertainty about the future: nobody can know what is going to happen, and betting exchanges and Web sites such as FiveThirtyEight.com make their living from giving odds for future events. But we are not only uncertain about the future—bearing in mind the unemployment data, we may also not know what is happening now, or what went on in the past.

Suppose I have a fair coin, and I ask you for your opinion of the probability that it will come up heads. You will happily answer "50–50." Then I flip it, cover up the result before either of us sees it, and again ask for your probability that it is heads. If you are typical of my experience, you may, after a pause, rather grudgingly say "50–50" again. Then I take a quick look at the coin, without showing you, and repeat the question. Again, if you are like most people, you eventually mumble "50–50" once more.

This simple exercise reveals a major distinction between two types of uncertainty: what is known as "aleatory" uncertainty before I flip the coin—the chance of an unpredictable future event—and "epistemic" uncertainty after I flip—an expression of personal ignorance about an event that is fixed but unknown. The same difference exists between a lottery ticket (where the future, random outcome depends on chance) and a scratch card (where the outcome is already decided, but you don't know what it is).
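The point can be made with a small simulation: whether the coin has yet to be flipped (aleatory uncertainty) or has already been flipped but lies hidden (epistemic uncertainty), the rational betting odds from the observer's standpoint are identical. The code below is illustrative only; the two cases differ in interpretation, not in the numbers:

```python
import random

random.seed(42)
n = 100_000

# Aleatory: "lottery ticket" -- the flips have not happened yet.
future_heads = sum(random.random() < 0.5 for _ in range(n)) / n

# Epistemic: "scratch card" -- the outcomes are already fixed...
hidden = [random.random() < 0.5 for _ in range(n)]
# ...but until one is revealed, the observer's probability is still 0.5.
hidden_heads = sum(hidden) / n

# Both fractions come out close to 0.5.
```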

Our lives are full of such epistemic uncertainty. Gamblers bet on the next card to be dealt, we discuss the possible sex of a baby, we gossip over who might be having a furtive relationship, we puzzle over whodunits, we argue over the number of tigers left in the wild, and we are told estimates of the possible number of migrants or the unemployed. All these are facts or quantities that exist out there in the world; we just do not know what they are. Figure 1 displays the "fan chart" that the Bank of England has used for many years to communicate not only uncertainty about the future, but also epistemic uncertainty about what growth is at present and what it was in the past.

Figure 1: Bank of England’s fan chart (from the August 2019 Inflation Report), showing uncertainty about GDP growth in the future, the present and the past. The bands represent 30, 60 and 90 percent prediction intervals: the Bank is therefore assessing around a 20 percent probability of a recession in 2019/2020. Credit: UK Office for National Statistics

But we also live in an age of misinformation and skepticism about science, whether it concerns vaccines or climate. Officials and scientists can feel that they will not be trusted if they admit the limitations to their knowledge, and so they can be tempted to exaggerate their confidence in the safety of vaccines and the reasons for changes in our environment. In doing so, however, they risk being caught out.

We have an active research program investigating whether experts can acknowledge epistemic uncertainty without losing trust and credibility. Our experiments, conducted on thousands of participants, suggest that trust is not lost if communicators can be confident about their uncertainty. This is fine if we have a good survey, because then we can quantify our margins of error, as for unemployment. But surveys may not be reliable, and there may be extra sources of systematic bias that cast doubt on the calculated intervals.
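For a well-designed random survey, the quantifiable margin of error follows from a standard formula. The sketch below uses the normal approximation for an estimated proportion; the 4 percent unemployment rate is an illustrative assumption, not a figure from the text, and the formula assumes simple random sampling (real labour-force surveys have design effects and nonresponse that widen the true interval):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: a 4% rate estimated from a sample of 100,000 people
moe = margin_of_error(0.04, 100_000)
# roughly +/-0.12 percentage points on the rate itself
```

Note that this captures only the *sampling* uncertainty; systematic biases of the kind mentioned above are not reflected in such intervals.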

We are currently working with the U.K. Office for National Statistics on communicating uncertainty about current levels of migration into the U.K.—a somewhat delicate topic. Taking inspiration from the Bank of England, these estimates are now presented as "fuzzy fans," a format that we have found to be comprehensible and attractive.

Figure 2: Migration estimates, May 2019. The "fuzzy fan" represents only the quantifiable uncertainty arising from the survey design. Credit: UK Office for National Statistics

The note on the graphic indicating "known uncertainty in survey estimate" is crucial, since there are other sources of uncertainty arising from the limitations of the survey itself. Indeed, it has recently been reported that migration has been systematically underreported.

In a world in which strident voices dominate, open acknowledgment of what we don't know could be a small step toward humility and trustworthiness. Of course, just because we don't know everything does not mean we don't know anything. So I think the take-home message is that we should be clear about what we do know, and then proclaim our uncertainty with confidence.

More to Explore

The Art of Statistics: How to Learn from Data. David Spiegelhalter. Basic Books, 2019.

Communicating Uncertainty about Facts, Numbers and Science. Anne Marthe van der Bles, Sander van der Linden, Alexandra L. J. Freeman, James Mitchell, Ana B. Galvao, Lisa Zaval and David J. Spiegelhalter in Royal Society Open Science, Vol. 6, No. 5; May 8, 2019. https://doi.org/10.1098/rsos.181870