“Please, Let’s Stop the Epidemic of Armchair Epidemiology” writes Tim Requarth – arguing that non-experts, bolstered by the Dunning-Kruger effect, are endangering us all by sharing their ill-informed opinions.

Intuitively, the idea makes sense. In times of crisis we can ill afford the distractions of conspiracy mongers, and should instead listen to those who have spent decades studying the very problems we face today.

Otherwise we might get important things wrong, like the likelihood of human-to-human transmission.

Or the likelihood of a pandemic, the usefulness of masks and travel bans, and whether COVID-19 is worse than flu… the list goes on.

This is the immediate problem with Requarth’s argument. The “experts” – a trusted inner circle of academics, mainstream media and governmental institutions – have catastrophically failed in response to this crisis.

In fact there’s already been an unusual amount of reflection from the inner circle about their failure. Vox, for instance, wrote a post-mortem which argued that the media should have clarified the limits of its knowledge instead of mindlessly repeating the claims of experts. Scott Alexander, a non-expert, had the temerity to respond by pointing out that while experts in the inner circle failed to predict the pandemic, plenty of contrarian thinkers in the rationalist and tech community did better.

The explanation for this, according to him, lies in errors of probabilistic thinking and communication. Those who were calling for early action recognised that even if a deadly pandemic was unlikely, its potential costs were so high that they justified significant action.

In contrast, the inner circle framed the event as a binary: either a pandemic would happen, or it wouldn’t. To Alexander, if any party involved had stopped to think about the pandemic probabilistically, they would have realised that even a small chance of millions being infected by a disease deadlier than flu was something worthy of alarmism.

It’s a charitable explanation, but it strikes me that the average person already engages in the sort of probabilistic reasoning he discusses. Most people, if offered $100, wouldn’t play Russian roulette. They’d recognise that while the chances of being shot are low, the cost is very high.
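The intuition can be made explicit with a quick expected-value calculation. The figures below are purely illustrative toy numbers (the one-in-six odds come from the Russian-roulette example; the dollar value placed on dying is an arbitrary assumption, not anything from the original argument):

```python
# Toy expected-value comparison for the $100 Russian-roulette offer.
# All figures are illustrative assumptions, not real estimates.

P_BULLET = 1 / 6            # probability the chamber holds the bullet
PAYOFF = 100                # dollars offered for playing
VALUE_OF_LIFE = 10_000_000  # arbitrary dollar value placed on dying

expected_cost = P_BULLET * VALUE_OF_LIFE   # ~$1.67 million
expected_gain = (1 - P_BULLET) * PAYOFF    # ~$83

# Losing is "unlikely", but its cost dwarfs the payoff,
# so the rational move is to refuse the game.
play = expected_gain > expected_cost
print(play)  # False
```

The same structure applies to early pandemic action: multiply a small probability by a catastrophic cost and the expected loss can still swamp the certain, modest cost of acting early.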

In fact, I think that many ordinary people did do better at responding to COVID-19 than elite institutions. Not just the mavericks and eccentrics that Scott Alexander points to, but the citizens who began social distancing in advance of, and often against, government advice. Similarly, while panic buying groceries wasn’t the ideal response to coronavirus, at least those who did so correctly identified the threat of COVID-19 – even as the media sneered. Heck, there were even highly upvoted comments on Reddit as early as January that correctly identified that China was hiding data and that the WHO’s response was insufficient, and predicted a pandemic and recession.

To be clear: I’m not claiming that the general public was calculating probabilities with the sophistication of Tetlock’s superforecasters – or that every ordinary person predicted a pandemic. Rather, I’m pointing out that the rough, fear-based heuristics of a substantial minority of people outperformed the uniformly miscalibrated response of the inner circle.

Now part of the reason individuals did better is, as Scott points out, their willingness to accept heterodox evidence and reasoning. The comments I saw justified their position with videos from on the ground in China, and mental models based on a mistrust of international institutions.

But this is also a limited explanation for the performance of ordinary people over experts. By late January, there was publicly available hard data indicating that the R0 and case fatality rate were significantly worse than flu’s. There was no need to rely on anecdotes.

So both experts and ordinary people had access to the relevant information and heuristics required to know that action was needed. Yet while a significant portion of ordinary people followed through on their instincts, few institutions of the inner circle did.

The problem for experts, then, must lie not with their capacity to reason, or their knowledge, but rather with their incentives.

Specifically, I think that there is an important difference between being focused on the cost of an event occurring and being focused on the reputational harm of incorrectly predicting that the event will occur.

To illustrate: ordinary people were thinking about the pandemic as if they were playing Russian roulette with a gun pointed at their heads. Expert commentators, meanwhile, were paid to predict whether there was a bullet in the chamber. The former is concerned about the cost of dying, while the latter is concerned with making a correct prediction – and when there’s only a one-in-six chance there’s a bullet in the chamber, is it any surprise they didn’t predict the person’s death?

Scott Alexander frames this as simply a language problem, and suggests that experts should just be really clear that they’re speaking probabilistically. Yet as he himself inadvertently notes with the example of Nate Silver, no matter how good experts get at talking in probabilities, people are really bad at listening probabilistically. When 538 gave Trump a 29% chance of winning the election, they weren’t congratulated for giving him relatively better odds than other forecasters; they were pilloried for getting the result “wrong”.

So imagine what would happen if the media ran Alexander’s favoured headline:

“TEN PERCENT CHANCE THAT THERE IS ABOUT TO BE A PANDEMIC THAT DEVASTATES THE GLOBAL ECONOMY, KILLS HUNDREDS OF THOUSANDS OF PEOPLE, AND PREVENTS YOU FROM LEAVING YOUR HOUSE FOR MONTHS”?

Panic would ensue. People would stop showing up for work and would keep their kids home from school. Masks would become a common sight. The government, reasoning probabilistically (correctly), would close borders, infuriating China, and ban popular mass gatherings, offending citizens.

Which we know, with the benefit of hindsight, would have been the correct actions to take. But given the sacrifice required, if this were one of the nine in ten times where the virus came to nothing – how many people would buy the media’s lame excuse that “well, we did say there was only a 10% chance…”? How would the epidemiologists and bureaucrats explain to their supervisors that, while the data was speculative, they also saw some really concerning grainy videos of Wuhan on Twitter? Most relevantly, how would voters respond to a government that crashed a fragile economy to wage war against a threat that only ever existed in the statistical tables of so-called experts?

When the inner circle talked down the threat of the virus, and refused to act, they weren’t just accidentally failing to clarify uncertainty, or making basic errors of probabilistic reasoning. Rather, they were intelligently assessing which actions were in their rational self-interest. After all, governments are rewarded for how they respond to crises, not for preventing them. Similarly, the people who were appropriately alarmed by coronavirus weren’t necessarily geniuses; they just centered personal safety rather than reputational harm in their calculations (or they were contrarian thinkers with niche audiences who can listen probabilistically).

Maybe this reading is overly harsh, cynical, or conspiratorial. Remember however, that this is the worst crisis the world has faced in at least a generation, and it was totally preventable. Most of the articles I’ve linked were filled with quotes from epidemiologists who have dedicated their professional careers to modelling this stuff. Surely they should have been writing open letters demanding panic and action, instead of talking down the threat of the virus?

But to stick their necks out before they were absolutely certain we were facing a global pandemic would have been to risk humiliation – and by the time such certainty existed it was too late.

That’s the value of people outside the inner circle. They can risk being wrong in the pursuit of their own safety precisely because they don’t have a high status position to defend. The next time storm clouds gather on the horizon, instead of waiting for the inner circle to tell us what to think, we should listen to the amateurs, and decide for ourselves.

Postscript:

As far as I know, neither Tim Requarth nor the others who claim amateurs must remain silent hold postgraduate degrees specialising in public discourse analysis. Given this, it’s understandable that they so confidently demand we listen to experts – they probably suffer from the Dunning-Kruger effect.