The 'Omnivore's Dilemma' is an extremely useful concept for understanding some of the paradoxes in human behaviour and psychology. Put simply, if a being can eat anything to obtain the energy and nutrients it needs to live, then it faces a dilemma not of survival but of choice. Rather than struggling merely to achieve its goal (survival), the omnivore must answer questions about how best to achieve that goal safely, efficiently and sustainably. Culture provides one way to find those answers: social learning makes decision-making more effective by offering proven solutions to questions about what's safe to eat, where the best food is found and how to prepare it efficiently.

Everyone who's shopped in a modern supermarket has had direct experience of the omnivore's dilemma: the paradox of choice we feel when selecting one breakfast cereal out of hundreds causes acute anxiety akin to that felt by our ancestors deciding on the day's hunt. In our daily lives we resolve these feelings by relying on a combination of innate biological preferences and learned behaviours - some of which may be adaptive and some of which may not be. Our taste buds tell us to indulge in sweet and fatty foods; our psychological openness to experience tilts the scale between trying a new brand and sticking with what we've had before; our upbringing nudges us towards the brands our parents trusted; or we imitate the choices of celebrities who appear on marketing material. If we're being very careful (perhaps because we're resource-constrained) we might even engage our System 2 reasoning and perform a cost-benefit calculation: which cereal will feed a family of four for the least dollars?
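That System 2 calculation is simple enough to sketch. The cereal names, prices and serving counts below are invented for illustration; the logic is just cost per serving scaled to a family's daily consumption.

```python
# Hypothetical cereals: (name, price in dollars, servings per box).
cereals = [
    ("SugarPuffs", 5.49, 8),
    ("BulkOats", 4.99, 20),
    ("BrandFlakes", 3.79, 12),
]

def daily_cost(price, servings, servings_per_day=4):
    """Cost of feeding a family of four for one day (one serving each)."""
    return price / servings * servings_per_day

# The System 2 answer: the box with the lowest cost per serving wins.
cheapest = min(cereals, key=lambda c: daily_cost(c[1], c[2]))
print(cheapest[0])  # prints "BulkOats"
```

A few seconds of arithmetic, in other words - which is precisely why we so rarely bother, and let our biases choose instead.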

The omnivore's dilemma is not just about food: humans are behavioural omnivores. Every action we take is the path of least resistance between the competing biases and impulses coded into our brains by biology and culture - and those psychological and cultural impulses are shaped by thousands, if not millions, of years of natural selection. As a result, our impulses embed assumptions about the physical environment - specifically, about the environment in which they became 'fixed' as part of our psyche. The Santa Barbara school of evolutionary psychologists speculates at length about the 'environment of evolutionary adaptedness' (EEA) - but in reality there's a different environment for every trait. For example, our preference for sugary and fatty foods is likely rooted deep in pre-agrarian history, at a time when such energy sources were rare. But your learned preference for cheap cereal may be adaptive only in the developmental environment of your childhood, when your family pinched pennies.

Signals and Behaviour

In terms of game theory, a behaviour is produced by a strategy, which in turn relies on a stable set of expectations about the state of the world. As behavioural omnivores, we are open to new information ('signals') about the state of the world and can adjust our strategies accordingly. In fact, humans as a species are remarkably adept at signal recognition: from birth, we are natural mimics with a preternatural talent for both pattern recognition and imputing causation. The canonical example is movement in tall grass: not only will we notice a sign of change in the state of the world, our first instinct is to attribute an agent or cause to that change. It's very likely, in fact, that these abilities are somewhat overtuned: agency bias may be one of the psychological underpinnings of belief in the supernatural, as well as of social, political and economic conspiracy-mongering: we see patterns that just aren't there.

But signals about the state of the world may or may not be accurate; indeed, they may be intentionally falsified by other actors. How then do we select between them, particularly when trusting one signal over another (i.e. changing our expectations about the world) may result in vastly different behaviour? Let's connect this back to real-world politics: the information age provides every individual with almost unlimited opinions on every conceivable topic. We face a paradox of information: given that we can find information supporting any conceivable state of the world, how do we choose between them? The answer is the same as when we choose our breakfast cereal: we let our biological and learned biases and preferences take over and go for the option that causes the least anxiety. Everyone is likely to prefer information that reinforces their pre-existing beliefs about the state of the world (confirmation bias); conservatives are likely to prefer information from sources they are already familiar with; authoritarians will preferentially imitate the behaviour of high-status individuals; and so on. Only rarely do we engage our rational mind and make a costly, independent assessment of the facts.
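One toy way to picture confirmation bias at work on a stream of signals: treat a belief as a probability, nudge it towards each incoming signal, but shrink the step for signals that contradict the current belief. Every number here is illustrative, not empirical.

```python
def update_belief(belief, signals, lr=0.3, bias=0.8):
    """Toy confirmation-bias model.

    belief and each signal are probabilities in [0, 1] that some claim
    is true. bias > 0.5 means confirming signals move us more than
    disconfirming ones; bias = 0.5 would weight them equally.
    """
    for s in signals:
        confirming = (s >= 0.5) == (belief >= 0.5)
        step = lr * (bias if confirming else 1 - bias)
        belief += step * (s - belief)
    return belief

# Identical mixed evidence, opposite priors: the two minds barely converge.
evidence = [0.9, 0.1, 0.9, 0.1]
print(round(update_belief(0.9, evidence), 2))  # stays high
print(round(update_belief(0.1, evidence), 2))  # stays low
```

Even in this crude sketch, the same evidence stream leaves a believer believing and a skeptic skeptical - the anxiety-minimising outcome for both.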

Social media makes all of this harder, of course. It strips away much of the context of information signals, removing information about the reputation and status of the sender that we might rely on to make such judgements. Bad-faith actors can intentionally manipulate our biases to spread 'fake news'. Some of these techniques are quite insidious: propagandists and marketeers delight in abusing our learned deference to the scientific method by deliberately misinterpreting research or associating themselves with high-status scientific professions. They attack the character or reputation of opposing sources (in areas unrelated to the quality of the information they are providing), knowing that this reduces the odds the experts will be listened to. They mimic the affectations and talking points of thought leaders: privileging 'open dialogue', the rhetorical style of varsity debate, and the cultural signifiers of wealth.

The anti-vaxxers' dilemma

Let's see how this might all work in practice. Imagine you're a skeptical cattle herder in a quasi-agrarian society. You have a short lifespan, in no small part because there's a one-in-three chance of dying from smallpox. One day, someone from a neighbouring village comes through and describes a practice in which people in his village take pustules from infected cows and rub them on the faces or wounds of their children. He swears they haven't had a smallpox outbreak in years. Do you imitate this behaviour, knowing that a sick cow will sometimes also make a child sick? Of course you wouldn't! You'd think the stranger and his village were mad. And you might be right: another village nearby sacrifices the elderly to the sky-god and claims the same results, and that's obviously just superstitious nonsense.

And yet the village that practices variolation is correct. Over the generations, its people will live longer, healthier lives: have more children, herd more successfully and eventually come to dominate the local economy. Your village of skeptics (and the nearby village of religious fundamentalists) can't compete. You either imitate their behaviour or go extinct. Those who are most comfortable with novelty adapt the quickest. Over time, the behaviour becomes fixed in the population: scientists investigate and confirm the germ theory of disease; institutions are established to subsidise the practice and punish those who don't comply. Ritualisation may even set in, such that compliance with the norm becomes a reliable signifier of group identity.
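The compounding logic can be sketched as a toy generational model. The one-in-three smallpox mortality comes from the scenario above; every other number (birth rate, the small cost of variolation itself, the generation count) is invented purely to show how a modest survival edge compounds.

```python
def grow(pop, survival, births_per_pair=3.0, generations=10):
    """Toy model: each generation, survivors pair up and have
    births_per_pair children; everyone else is lost to disease."""
    for _ in range(generations):
        pop = pop * survival * births_per_pair / 2
    return pop

skeptics = grow(100, survival=2 / 3)      # lose one in three to smallpox
variolators = grow(100, survival=0.95)    # small risk from the practice itself

print(variolators > skeptics)  # prints True: the practice dominates
```

At these (invented) rates the skeptics' village merely replaces itself while the variolators multiply some thirtyfold - which is the whole evolutionary argument in two lines of arithmetic.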

Now flip the script. You're a parent who lives in a society that practices widespread vaccination and regularly signals to you that vaccination is safe and effective. But one day you encounter a signal that tells you the opposite: somehow a crank theory or conspiracy, a bad scientific study or a new religious belief has penetrated the cultural fog and established an information paradox. What is the omnivore to do? Here's the thing: were the new information stating that vaccines are dangerous correct (it isn't, for the record), the fitness-increasing decision would be to accept the new signal, refuse to vaccinate your children despite the risks, and spread the new signal as widely as possible. Over a lifetime, your child would be statistically fitter and healthier and might achieve a higher social status. But of course, the opposite is true. The same openness to novelty which is adaptive in one set of conditions is maladaptive in the other.

But the individual doesn't have the benefit of seeing life as a multi-generational evolutionary simulation in which statistically significant differences in average outcomes are meaningful. They have to make a decision that reduces their individual anxiety in the moment. So their biases go to work. Most of us trust the information we learned as children about vaccines being safe; we attribute elite status to the medical profession and the advice it offers; we are at least partly responsive to the directives of government, so long as they don't directly affect our individual rights and interests. A tiny minority of individuals will react differently and accept the new signal: maybe their psychological sanctity trigger is more sensitive; maybe they're more libertarian than average and skeptical of 'received wisdom'; maybe their openness to new information is set a little looser than average. Overall, it's plausible that there's a correlation between 'progressive' traits and anti-vaxxer idiocy, because the same set of underlying biases drives both sets of behaviour.

Openness to new information and skepticism of authority are politically adaptive behaviours for many people, but mental toolkits that are adaptive in many scenarios are not guaranteed to be adaptive in all of them. We never know the state of the world with any certainty, and the adaptiveness or otherwise of our behaviour can only be known over extremely long timescales. Population-level behaviours, norms and institutions may help us resolve the paradox of information in many circumstances, but not all. We therefore remain behavioural omnivores - capable of considerable strategic flexibility at both the individual and social level. That flexibility is central to what makes progress possible, but it doesn't guarantee progress for either the individual or society as a whole.