“People are sick of experts.” These infamous and much-derided words uttered by UK Conservative parliamentarian Michael Gove express a sentiment with which we are now probably all familiar. It has come to represent a sign of the times—either an indictment or a celebration (depending on one’s political point of view) of our current age.

Certainly, the disdain for expertise and its promised consequences have been highly alarming for many people. They are woven through various controversial and destabilising phenomena from Trump, to Brexit, to fake news, to the generally ‘anti-elitist’ tone that characterises populist politics and much contemporary discourse. And this attitude stands in stark contrast to the unspoken but assumed Obama-era doctrine of “let the experts figure it out”; an idea that had a palpable End of History feeling about it, and that makes this abrupt reversion to ignorance all the more startling.

The majority of educated people are fairly unequivocal in their belief that this rebound is a bad thing, and as such many influential voices—Quillette’s included—have been doing their best to restore the value of expertise to our society. The nobility of this ambition is quite obvious. Why on earth would we not want to take decisions informed by the most qualified opinions? However, it is within this obviousness that the danger lies.

I want to propose that high expertise, whilst generally beneficial, also has the capacity in certain circumstances to be pathological as well—and that if we don’t recognise this and correct for it, then we will continue down our current path of drowning its benefits with its problems. In short, if you want to profit from expertise, you must tame it first.

To draw a line between when high expertise helps and when it potentially hinders, we first need to acknowledge that this anti-elitist trend did not emerge from nothing. ‘Experts’ as a group (if such a thing can be held to exist) have not exactly covered themselves in glory in the last few years, so the cynicism they now face is to some degree justifiable.

We need not get into the weeds here about the specific issues. But, suffice it to say, the complex manoeuvring of some extremely bright and learned people unwittingly triggered the financial crisis. Apocalyptic deadlines for climate change devastation came and went without fireworks. Election predictions on both sides of the Atlantic have been appalling, as have the predictions on the immediate consequences of those elections. Silicon Valley ‘geniuses’ plunge from one self-inflicted crisis to another. And, meanwhile, we have watched as what many people consider lunacy leaks out of the credentialed halls of academia and into the world at large.

In other words, smart people keep getting it wrong and scepticism about their competence has grown as a result. This seems to be a fairly straightforward story at first glance, and yet the public will only take their antipathy so far. Nobody says, “I want someone unqualified to be my president, therefore I also want someone unqualified to be my surgeon.” Nobody doubts the value of the expertise of an engineer or a pilot. This apparent inconsistency is what frustrates the anti-anti-elitists so much, not least because it seems to be unjustifiable.

However, it is worth drawing a distinction between these two types of expertise—the kind people question, and the kind people don’t. In short, people value expertise in closed systems, but are distrustful of expertise in open systems. A typical example of a closed system would be a car engine or a knee joint. These are semi-complex systems with ‘walls’—that is to say, they are self-contained and are relatively incubated from the chaos of the outside world. As such, human beings are generally capable of wrapping their heads around the possible variables within them, and can therefore control them to a largely predictable degree. Engineers, surgeons, pilots, all these kinds of ‘trusted’ experts operate in closed systems.

Open systems, on the other hand, are those that are ‘exposed to the elements,’ so to speak. They have no walls and are therefore essentially chaotic, with far more variables than any person could ever hope to grasp. The economy is an open system. So is climate. So are politics. No matter how much you know about these things, there is not only always more to know, but there is also an utterly unpredictable slide towards chaos as these things interact.

The erosion of trust in expertise has arisen exclusively from experts in open systems mistakenly believing that they know enough to either predict those systems or—worse—control them. This is an almost perfect definition of hubris, an idea as old as consciousness itself. Man cannot control nature, and open systems are by definition natural systems. No master of open systems has ever succeeded—they have only failed less catastrophically than their counterparts.

Every king, queen, pharaoh, emperor, president, prime minister, and dictator-for-life in history has tried to master statecraft, and every one of them has failed. If they had not, their formula would have calcified into knowledge and rumbled on successfully indefinitely. And wasn’t such a legacy the goal of every single one of them? The better ones only failed more gradually, less bloodily, than the rest. But slowly their ideas, too, unravelled in the face of chaos. Ultimately, history has shown this to be axiomatic: the more you seek to control nature, to control an open system, the more disastrous the results.

Knowing this, it’s a wonder that humility in the face of open systems is still such a rare commodity amongst those who know them. Perhaps it’s because the Enlightenment granted us so much mastery over closed systems that we forgot the distinction existed. One could argue that we have earned our arrogance when it comes to technological progress, for instance. But just because we invented smartphones, it does not follow that we can predict the future.

So what are we to do? The anti-elitist solution is to simply disregard the opinions of any expert in an open system. Given that accurate prediction and control are impossible, we might as well rely upon the layman’s word as on any other. Who cares what the so-called experts say? This would be the wrong conclusion to draw. Laymen can have big opinions too, and theirs are likely to be even more erratic.

Instead, we must continually encourage the interplay of diverse expert voices to help ease us into the future gradually, without any one of them gaining absolute authority. This variety is important, since all open systems, being fundamentally unknowable, are governed by competing theories as to how they work. There are no competing theories for being an auto-mechanic, or for flying an aeroplane. There is just one way to do those things.

But when it comes to open systems, multiple interpretations apply. In economics, for instance, you have Marxists, Keynesians, Hayekians, and so on. All are ‘experts,’ and yet they may hold completely opposing views on any given topic of economic debate. It is only the interplay of such opposing views, stretched over time, that mitigates the chaos of the open system. It enables society to be agile and reactive. It prevents us from ever trying to sculpt the future, and thereby fends off the disasters that inevitably occur if only one voice is allowed to become hegemonic. It is this that has shaped the society in which we live today—nobody designed it; we merely stumbled here in the dark, getting this far by avoiding terminal catastrophe with a combination of deliberation and good fortune.

Herein lies the beauty of democracy, and indeed all bipolar, yin-and-yang systems. Democracy doesn’t work because it gives people what they want—it works because it gives nobody what they want. And, as a result, nobody is ever able to fall victim to their belief that they control open systems. The troubles of experts in recent times can be interpreted as a continuation of this balancing act—provided that they are not usurped altogether.

For expertise to function properly and flourish it needs to be bound in the following three ways:

First, an expert or observer must always consider whether their recommendations pertain to an open or a closed system. This line won’t always be perfectly drawn (medicine is a good example of a semi-open system), but it can be approximated. This will determine the appropriate level of scepticism.

Second, experts who realise they are operating in open systems must take care to maintain a certain level of humility. No grand theory works; it only fails less spectacularly. This should guard against predictions being made in unhelpfully absolutist terms.

And third, we should always encourage the interplay between diverse viewpoints in open systems—for only such an interplay can pull us back from the inevitable excesses of hubris that attract us like moths to a flame.

Alex Smith is a strategist specialising in the underlying nature of complex systems and companies. He is founder of Basic Arts and you can follow him on Twitter @smithesq
