This is Chapter 3 in a series. If you’re new to the series, visit the series home page for a full table of contents.

_________

Chapter 3: A Story of Stories


In the last chapter, we met the human giant.

We talked about emergence and how the giant is what humanity looks like a few floors up the tower from the individual.

Building giants was a necessity for ancient humans. A human tribe was more than the sum of its parts, in physical power, in productivity, and in knowledge.

Given the powers of emergence, large human giants would be forces to be reckoned with. But unlike ants, humans are more than just cells in competing giants—they’re competing individuals too. So as tribes grew in size, the benefits of strength and capability would be accompanied by the cost of increasing instability. A human tribe is held together by weaker glue than an ant colony, and the bigger the tribe, the harder it is for that glue to hold up. This is partly why complex animals like wolves, gorillas, elephants, and dolphins tend to roll in groups with under 100 members.

Early tribes of humans were probably similar to tribes of other apes—glued together mostly by family ties. Kinship is an obvious natural glue because animals are programmed to be interested in the immortality of those with genes most similar to them—so humans are more likely to cede individual self-interest to a group when that group is family. That’s why today, people are so willing to make huge sacrifices for family members.

Family glue is strongest between parents and children, because genes “know” that copies of themselves live in their container’s direct progeny. Genes also have us selfishly caring about the well-being of siblings and nieces and nephews because a very similar version of themselves lives in them—but we don’t care quite as much about these people as we do about our children. As the distance between blood relations grows, the glue thins. As the evolutionary biologist J.B.S. Haldane put it: “I would lay down my life for two brothers or eight cousins.”

With that in mind, let’s imagine a big extended family made up of 27 immediate families—the grandchildren and great-grandchildren of a single couple—living together as an ancient tribe.1

Say the red guy is the tribe’s chief. For the chief and his family, this is what the tribe feels like:

Pretty nice setup. The problem is that no one else views the tribe this way—because everyone is at the center of their own circle. Let’s focus in on the chief’s sister and her family.

To this yellow family, the tribe feels like this:

Not ideal, but not the end of the world. But how about the chief’s second cousins—like the orange family? Or the green family?

For these families—and all the other 16 families in that ring—the tribe feels like this:

And remember how the cousin system works. Your second cousin is equally related to you, your siblings, and your first cousins—to them, you’re all equivalent second cousins.

So if the chief is your second cousin, it may feel a bit like they’re part of a different clan from yours altogether.

And the way things are now, the head of one clan is the chief of all three clans—leaving his own clan with higher status and special privileges.

Now if all of you are immersed in a rivalry with your evil third-cousin tribe in the neighboring settlement, everyone will probably stay united, Bedouin proverb style,2 bonded together as a single life form by the threat of an equal-size rival life form.

But what if there is no evil third-cousin tribe? Without the binding force of a common enemy, if you’re the alpha character in your clan, you may decide you don’t like the status quo and either go to battle with the other clan or break off into your own tribe.

When a loose tribe held together by weak glue grows bigger and bigger, it also gets looser and looser until it can’t hold itself together anymore, and it splinters.

This imposes a natural ceiling on human giant size—and therefore on human power itself.

Except I’m currently sitting in an eight-million-person city that’s inside of a 325-million-person country.

So what changed?

___________

To help us answer that question, let’s bring in the Johnsons.

The Johnsons have problems. First there’s Moochie.

Moochie never comes when the Johnsons call him, and whenever they open the front door, Moochie jumps out the door and runs away.

Then there’s Lulu.

Every night after the Johnsons put Lulu to bed, she waits until they leave the room and then she crawls out the window to go riding around with the bad baby who lives down the block.

Not good. So the Johnsons come up with a plan.

They get a bag of Snausages, and every time Moochie comes when they call him, they give him a treat. And they install an electric fence around their house.

And Moochie shapes right up.

But how about Lulu?

The Johnsons could go for a similar strategy, giving Lulu candy for staying home at night and lining her window frame with live electrical wire.

But instead they tell her about Santa Claus. They tell Lulu that A) Santa Claus is omniscient—he knows when she’s been sleeping and he knows when she’s awake and he knows when she’s been bad or good; and B) when Santa breaks into their house next Christmas, he’ll leave presents for her if and only if she’s been good.

After hearing this, Lulu ends her fling with the bad baby.

Good for the Johnsons.

So let’s unpack this.

An animal’s behavior isn’t an independent entity. It’s the dependent variable in a simple equation: Nature + Environment = Behavior.

A dog’s core motivations are hardwired into it by its software. The software is the real animal trainer, using a variety of chemical treats and chemical electroshocks to steer the animal to its genes’ liking.

If an animal’s life is a game of chasing good feelings and avoiding bad ones, the animal’s environment is the obstacle course standing between it and all those delicious chemical rewards.

So Moochie’s behavior is just a reflection of his particular motivations and the environment around him. If you want to change his behavior, you have to change one of the equation’s independent variables—Moochie’s nature or his environment. If we had a brain-machine interface, we might be able to change his nature—rewiring Moochie’s software so that dopamine hits are triggered by, say, the high arts instead of by gorging on food.

But it’s far less of a hassle to just change his environment. By giving Moochie a Snausage every time he obeys their commands, or by casually electrocuting him whenever he tries to run away, the Johnsons can link a certain type of behavior that his software doesn’t care about to one that it does. Moochie the good boy is still being just as selfish as Moochie the bad boy. He still doesn’t like expending energy obeying boring-ass commands—but with the change of environmental conditions, the negative of the effort plus the positive of the Snausage yields a net positive, so he obeys. He still wants to run away just as badly as he did before, but between [not running away + not being electrocuted] and [running away + being electrocuted], he chooses the former.

In some ways, humans are just like Moochie.

They’re wired by primitive software to have certain motivations, and they live in an environment that stands between them and what they want—with their behavior as the dependent variable.

But with humans, things get more complicated.

First, their primal motivations are super complex. On top of all the standard animal desires, humans are incentivized by all kinds of weird Snausages and electric fences. They crave self-esteem and want to avoid shame. They yearn for praise and acceptance and detest loneliness or embarrassment. They pine for meaning and fulfillment and they fear regret. They’re gratified by helping others and guilty when they cause pain. They’re terrified of their own mortality.

With so many factors involved, human motivation often comes down to personal priorities and what matters most to people—i.e. their values. Humans have a complicated relationship with morality too, and their conception of what’s right and wrong has a hand in the equation as well.

Values and morals have the power to override a human’s innate drives. Where traits like honesty, integrity, generosity, propriety, respect, loyalty, or kindness are valued, people will behave differently than where they’re not. If three humans with identical sex drives value monogamy, polyamory, and celibacy respectively, they’ll behave in three different ways with regard to sex.

The “environment” circle is more complicated with humans too.

Dogs tend to be evidence-based thinkers. The Johnsons could have tried to tell Moochie that obeying their commands would yield a Snausage, but he wouldn’t care. They could promise it 100 times, and Moochie still wouldn’t care. He will 0% believe what they say until he sees it with his own eyes / tastes it with his own mouth. If you want a dog to change his mind about something, show him hard, concrete evidence.

Humans also learn via direct experience, but their advanced language and imagination capabilities offer them a second learning pathway.

Let’s go back to Lulu for a minute. One thing I haven’t told you about her is that she fucking loves berries. And one day she’s out doing her thing and comes across a berry bush.

Now let’s run four quick scenarios.

In scenario A, Lulu is alone when she encounters the berry bush. The enjoyment of berries ranks way high up in Lulu’s values hierarchy, so she eats one.

The berry is as delicious as expected, but five minutes later, Lulu feels nauseous, which she hates.

The next day, she encounters the same berry bush and pauses to consider the situation. She decides “not being nauseous” > “enjoying berries,” so she doesn’t eat any. She learned a lesson the hard way and adjusted her behavior accordingly.

In scenario B, Lulu is with her friend Mimi when they see a different berry bush. Lulu is reaching out to grab one when Mimi says:

Lulu pauses to assess the situation. Lulu’s perception of reality, based on her own life experience, would yield berry-eating behavior here. But according to Mimi’s depiction of reality, the optimal behavior would be to pass the berry up.

Staring hard at the berry, Lulu considers Mimi’s credibility. Her experience is that Mimi is generally trustworthy, so Lulu decides to incorporate Mimi’s reality, in this instance, into her own. She passes up the berry.

Scenario C is like scenario B except now, Lulu is with Kiki.

When Kiki warns her about the berry, Lulu thinks about her experience with Kiki and recalls the day Kiki told her that one time she slid down a rainbow—Lulu later relayed the story to her mom, who told her that you can’t slide down rainbows. Concluding that Kiki is a lying bitch—who probably just wants to keep all the berries for herself—Lulu scoffs and eats a berry. If she then proceeded to get sick, it would be reason to update her opinion of Kiki’s credibility. But she doesn’t get sick—which only hardens her view. Fuckin Kiki.

Scenario D is just like B and C except this time, Lulu is on a late-night ride with the bad baby who lives down the block when they come across the berry bush.

Lulu considers. She’s pretty convinced that her bad baby bf tends to tell the truth, but he is also known to be gullible. She digs further.

Aha. Lulu knows that being truthful is only one part of being trustworthy, and in typical bad baby form, he’d been duped. Lulu eats the berry.

In the first scenario, we saw Lulu learn new information about reality from personal experience. She gained knowledge directly and used it to make better future decisions.

In the latter three scenarios, we saw Lulu perform an incredible magic trick.

In each case, another person presented Lulu with a claim about reality, placing it into her imagination. Lulu, being no fool, treats her beliefs like an exclusive club, and she treats the claims of others like the line outside the door. The gatekeeper of her beliefs—the club’s bouncer—is Lulu’s sense of reason. In these three scenarios, Lulu’s “reason bouncer” admitted Mimi’s claim into the club but turned the other two claims away.

In scenario B, Lulu acquired knowledge indirectly—stealing it from someone else who learned the berry lesson the hard way, allowing Lulu to learn the same hard lesson the easy way. Without indirect knowledge, 100 people learn the berry lesson by way of 100 people getting sick. With it, 100 humans can learn the lesson from only one of them getting sick.

But the same superpower makes us vulnerable.

Indirect knowledge only works in your favor when it’s coupled with reason. Imagination is why you can become emotionally invested in a horror movie—reason is why you don’t scream and run out of the theater when a ghost appears on the screen. Imagination allows you to consider an outlandish conspiracy theory—reason allows you to reject it as truth.

But what happens if the bouncer makes a mistake?

Back to Santa Claus. Lulu’s parents figured that between the trust she has in them, the naïveté of her inexperienced reason bouncer, and a little confirmation bias nudge from her inevitable desire for this delightful story to be the truth, they could slip one by her. And it worked.

If you want to change someone’s behavior, it’s easier to alter their perception of reality than to alter their motivation or change their actual environment. This third way of manipulating a human is a shortcut—a cheat—made possible by one of human evolution’s best tricks:

Delusion.

Delusion is what happens when our reason bouncer fails as the gatekeeper to our beliefs—when our imagination is stronger than our judgment. It might be the most universal human quality. And it adds a whole other component to the environment portion of our behavior equation.

The Johnsons didn’t have too much to think about when they decided to change Moochie’s behavior. Moochie’s behavior equation presented a clear strategic winner—alter his environment, and his behavior will adapt to the changes.

With Lulu, the Johnsons had a whole array of options:

In a lot of ways, human history is just a bigger version of this story. The same toolkit the Johnsons had access to in changing Lulu’s behavior has turned out to be a breathtaking evolutionary innovation.

Picture ten different wolf packs of the same species, living in the same natural environment. They’d behave pretty similarly.

Over eons, animal nature and animal environment engage in a kind of life-or-death tango—the environment changes and animal gene pools either keep up with the dance steps by adapting to the changes or they die out. But on a lifetime-to-lifetime scale, a species’ core motivations and its general environment rarely change. They are more like constants than variables, making behavior pretty much a constant too.

Now consider ten human tribes, living, like the ten wolf packs, in a common natural environment. The human capability for delusion means that those ten tribes could vary widely in their perceptions of reality, and thus behave entirely differently from one another.

Couple that with the complexity, flexibility, and revisability of human value systems and moral codes—and you have a species whose behavioral output is the product of multiple axes of wild variability.

Imagine if wolves were like humans. You’d be on a trek through the woods on a Monday and come across a wolf pack, and you’d be scared for a minute, but then you’d realize that this one believed it was evil to be violent. They’d come give you a few licks and move on. On Tuesday, you’d run into a new pack whose members were convinced that human children cast spells that caused wolf packs to starve, and that the only way to ensure wolf sustenance was to destroy them. You’d pick up your child and run away, barely making the escape. On Wednesday, you’d come across two wolves who weren’t part of a pack at all because they were convinced that most of the problems in the wolf world stemmed from “pack supremacy.” On Thursday you’d run into the first pack again—the totally non-violent one from Monday—and they’d ruthlessly attack you and kill you. Because a wolf missionary who preached the gospel of violence visited the pack on Wednesday and changed their beliefs.

This is the power of human beliefs. Not only do they produce an endless array of behavioral varieties—a million little evolutionary experiments—they allow for the complete behavioral mutation of any one of them within a single generation. Sometimes within a single day.1

Variety is the source of all evolutionary innovation, and the flexibility of our beliefs made human evolution a creative paradise.

___________

Let’s return to the world of ancient human giants. As we discussed, the glue of raw tribalism is only so strong, which imposed a ceiling on tribe size for a long time.

This isn’t just a human problem—mass cooperation is rare anywhere in nature. Ant and bee colonies seem to pull it off, but they’re actually just using the same “glue via family ties” trick human tribes use: they’re all siblings in one huge immediate family. No human female can have thousands of children, so humans couldn’t use that trick to cooperate at mass scale.

But gluing together is a behavior. And human behavior lives in a magic laboratory of variety. Could that additional flexibility find a way to create a human beehive?

We’ve talked before about how each of us has a personal storyline—a story we believe about ourselves that tends to drive our behavior and become a self-fulfilling prophecy. Scientists and historians talk about the same kind of stories, but in a collective sense.

In his book Sapiens, Yuval Noah Harari writes about the “imagined realities” we all believe—not only mysteries like the supernatural or the meaning of life, but seemingly concrete things like a company or a nation or the value of money. Evolutionary biologist Bret Weinstein talks about what he calls a “metaphorical truth”—a belief that’s not true, but one that enhances its believers’ survival chances. One example he gives is the belief that porcupines can shoot their quills. In fact, they cannot—but those who believe they can are more likely to stay far away from porcupines and therefore less likely to end up hurt by one.2

Human history is a long progression of human behavior, and human behavior is largely driven by human beliefs. And as Harari, Weinstein, and others point out, what has mattered most in our past is not whether our beliefs were true but whether they drove the right behavior.

At some point between 150-person ancient tribes and New York City, human evolution jumped off of the “survival of the fittest biology” snail and onto the “survival of the fittest stories” rocket.

The story virus

A story, for our purposes, is the complete array of a human’s beliefs—their beliefs around values and morality, their beliefs about their environment and the broader world they live in, their beliefs about what happened in the past and what will happen in the future, their beliefs about the meaning of life and death.

In the game of survival of the fittest stories, who wins and who loses?

Well, a story is like a virus. It can’t exist on its own—it requires a host. In the case of story viruses, a human host. So the first prerequisite for a fit story is that it’s good at binding to its host. A virus can invade an animal, but if it can’t convert that animal into its long-term home, it won’t make it.

So that starts things off with a few necessary characteristics of a viable story virus:

Simplicity. The story has to be easily teachable and easily understandable.

Unfalsifiability. The story can’t be easy to disprove.

Conviction. For a story to take hold, its hosts can’t be wondering or hypothesizing or vaguely believing—the story needs to be specific and to posit itself as the absolute truth.

Contagiousness. Next, the story needs to spread. If a particular virus were great at binding only to a random man in Minnesota named Skip Walker, it might have a good run while Skip was alive, but it would die with Skip. Likewise, a story about a god that created only Skip Walker, was only concerned with Skip Walker, and had a place in heaven only for Skip Walker wouldn’t make it very far. Skip probably wouldn’t get a great reaction telling people about that story, and others would have no motivation to adopt it or share it with anybody else. To be spreadable, a story needs to be contagious—something people feel deeply compelled to share and that applies equally to many people.

The story, once believed, needs to be able to drive the behavior of its host. So it should include:

Incentives. Promises of treats for behaving the right way, promises of electroshocks for behaving the wrong way.

Accountability. The claim that your behavior will be known by the arbiter of the incentives—even, in some cases, where no one is around to see it.

Comprehensiveness. The story can dictate what’s true and false, virtuous and immoral, valuable and worthless, important and irrelevant, covering the full spectrum of human belief.

So far, you might notice, the story of Santa Claus is crushing it.

But now we have to consider exactly what behavior the story is driving. Santa Claus is a great story to generate discipline in children who want gifts. And if evolution favored ancient humans who were good about cleaning their room, it might have worked as a “fit” story. But that wasn’t the idea.

In the game of stories evolution, the long-term survivors will be those whose hosts fare best over time.

Like microorganisms in our bodies, some stories can be parasitic to their hosts.

For example, for a story to have a long shelf life, its believers need to be super into passing down their genes, because stories are mostly passed down via generational indoctrination—they’re heritable. So stories that override reproductive instincts won’t fly. I’m sure there were tribes along the way who came to believe that sex was disgusting or that babies were demons or that severe child abuse was a virtue or that baby circumcision should include the testicles—beliefs that drove their genes right to extinction. That today’s Catholic priests are celibate is a testament to the power of stories to override even the most fundamental tenets of our software.3 But that doesn’t make the story parasitic for the future of Catholicism, because only a few Catholic men are priests. Stories that made celibacy an obligation for everyone quickly disappeared.

A story also needs to preserve at least a reasonable degree of self-preservation instinct in its hosts. I’d bet that somewhere, at some time, some tribe became convinced that suicide at the age of 16 is the only way to enter heaven, while death at any other age sends you straight to hell. You’ve never heard about this tribe because it went the shit out of extinct.

Another parasitic story would be one that absolutely forbade any use of violence. A story like that would, on the ancient game board, be like HIV—disabling the host’s immune system—and wouldn’t last very long.

Long term successful stories would instead need to be symbiotic—making their hosts better at surviving. Kind of like Weinstein’s “porcupines fire their quills” story.

But does that necessarily mean making the individual humans who believed it better at surviving? No, because as we’ve discussed, the ancient human life form wasn’t just the human—it was also the human giant. So the right kind of story symbiosis would line up with the survival game humans had already been playing. It would need to make the giants who hosted it better survivors.

If natural selection was calling for bigger, stronger, meaner giants, then the stories that enhanced that trajectory would be the fittest of them all. Our biological evolution made us tribal to help glue us together. The right story would be our superglue.

Superglue stories

To make human superglue, here are some logical ingredients:

Ingredient 1: Tribal Values

In Chapter 2, we discussed some of the trademark values of tribalism. A superglue story goes all out on these.

There are Us > Me values like conformity and self-sacrifice, and a superglue story reinforces these instincts by painting a clear paragon of what a good, righteous, worthy person looks like—something believers will try to conform to, for social status and for the sake of their own self-esteem.

The story will center around something greater than individual people that all believers should serve. This idea helps explain why so many early human marvels were temples or other monuments dedicated to worship.4 The collective service of something greater was the force behind some of the earliest mass-scale human cooperation.

A superglue story also jacks up the Us > Them values. The story needs to be all about good guys and bad guys, with a crisp, clear distinction between the two. The good guys must be good in every way—in knowledge, talent, motivation, and virtue. They’re good now, they were always good in the past, and they’ll continue to be good in the future. The bad guys are the opposite—they are and always have been stupid, ignorant, malicious, and morally backwards. Strife between the good guys and bad guys is always the fault of the bad guys.

Most importantly, the bad guys are seen as a dangerous and immediate threat. Remember the Bedouin proverb. Humans are Emergence Tower hybrids whose mindset can move up and down the tower’s elevator—and nothing brings humans up to the “small piece of a larger organism” level better than a threat from a common enemy. The bigger the common enemy, the stronger the glue.

On the Us > Them front, there’s an obstacle the story must contend with—the nuisance Higher Mind and all of his irritating disapproval of plundering and raping and beheading people. Because of the Higher Mind, it’s hard for people to truly hate a real human. It’s hard to pillage a settlement where real humans live. It’s hard to commit heinous violence against a real human. But is it hard to do awful things to filthy vermin and vile cockroaches and revolting scum of the Earth and agents of the underworld? Not really. An effective superglue story goes further than painting the enemy as bad, dangerous people—it dehumanizes them.

Through the millennia, the dehumanization trick would morph into the notion that it is not only okay to kill “Them”—it’s the duty of a good person. Optimized geopolitical stories would turn everyday people into mass murderers by framing a soldier’s work as the noblest human calling, only topped by dying while doing it. Optimized religious stories would depict the killing of non-believers as the highest service to god, and dying in the act as an instant ticket to heaven.

The ability to dehumanize is another gift of the delusion trick. A tribe could worship the local mountain all they want, but if their delusion stopped there, their genes probably aren’t here with us in 2019. A giant needed to be big, but it also needed to be mean. The delusion that your enemies aren’t actually full three-dimensional people with full life stories like your own is the prime source of giant aggression.

Ingredient 2: A Queen Bee

If you want people to act like ants or bees, give them a queen. The queen bee can be a rightful ruler or a mythic figure or a natural wonder or a higher cause or a hallowed homeland. The important thing is that the queen bee is seen as more sacred than any form of primal fulfillment. Tribes split when they get too big for everyone in the tribe to have an intimate relationship with everybody else—but there’s no limit to the number of people who can have their own intimate relationship with the queen bee.

Usually, the story’s queen bee is seen as all-powerful. Defying a monarch or dictator was seen as a sure-fire death sentence—for you and perhaps your whole family. Religious queen bees, free of real-world constraints, took things to an even more intense place, ramping the incentives up to unfathomable heights, wielding Snausages and electroshocks that would have made Moochie faint. The human Primitive Mind comes hardwired with this as the full range of possibilities for rewards and penalties:

Optimized superglue stories, though, able to author reality, innovated with extensions to the range that were so tantalizing or terrifying to our Primitive Minds that they made everything else look trivial.

Extending the range like this overrides any care the Primitive Mind would otherwise have for what goes on within the normal range. If you’re sitting in hell when all is said and done, all of that food and friendship and sex and power you scored during your life does you no good. If you have to do some seemingly awful things in order to win a ticket to eternal heaven, you do them without a second thought.

Human rulers got in on the afterlife game by claiming a direct connection to the divine, or by offering eternal heaven or hell for a person’s identity—with statues, monuments, and long-to-be-remembered legacies.

Which brings us to the next ingredient.

Ingredient 3: Identity Attachment

A superglue story will almost always intertwine itself with the identity of its believers. You know a superglue story is linked to its believers’ identities when you hear them use the story as a noun to describe themselves—when they call themselves “a [story]an” or “a [story]ist” or something like that.

For like-minded believers, the story-based identity gave otherwise total strangers a way to trust each other, which helped foster cooperation and trade.5

And by latching on to the identity of its believers, a story becomes protected by the primal flame rooted deep in a human’s core. Rather than try to convince Moochie to behave differently, the Johnsons just let his existing drive for dog treats do the work by linking obedience to primal gratification. When a story is linked to our identity, the same phenomenon is happening. Why reinvent the wheel when you can just hop on the back of the human’s most deep-seated wiring?

When people see a story as an external object, then someone challenging the story is just making an intellectual argument. But when believers identify with a story, someone challenging the story is a personal threat. And since our brains are notoriously bad at distinguishing between our psychological identity and our physical body, the personal threat doesn’t feel like an insult—it feels like danger.

To double down on the identity trick, stories will also attach themselves to the identity of the entire human giant, as the group will use the story to define itself.

If a human giant is united by belief in a common story, that story can become synonymous with “Us” to its members. And for a culture with a tribal mindset, that makes the story a sacred object.

When a story becomes sacred to a group of people, you’ll hear lots of people spending lots of time talking about how true the story is—how great the story’s god is, how superior the story’s values are, or most commonly, how disgusting and vile the story’s bad guys are. Today we call it virtue signaling. It’s a common tribal practice, because doing this:

Is really doing this:

Which is really doing this:

Which is really just this:

Expressing allegiance to the story is the best way of expressing allegiance to a story-based tribe. And when someone does this, fellow tribe members, partially to express their own allegiance, will respond by saying stuff like:

Which sounds to the person like:

On the flip side, when a story is culturally sacred, a challenge to that story is culturally taboo. Sacredness and taboo are almost always opposite sides of the same coin—the sword and shield of uniformity. And violating a taboo is a risky thing to do. Because doing this:

Sounds to the rest of the tribe like this:

Which could quickly turn into this:

By attaching itself to believers’ identities on both the individual and group emergence levels, a superglue story becomes synonymous with “Us” and synonymous with “Me” in the minds of its believers. Via the transitive property, this makes Us and Me feel one and the same, bonding them together with the story’s glue.

___________

All three of these ingredients rely heavily on delusion. For someone to believe the kinds of claims made in a superglue story, their reason bouncer has to be pretty incompetent. This is where the smoke comes in. When the Primitive Mind is dominant in a person’s head, the room fogs up so much that the Higher Mind’s clarity, wisdom, and powers of consistent reason, universal empathy, and responsibly applied imagination become faded and weak. The Primitive Mind’s emotional manipulations gain much more influence over the person, and it has free rein to toggle the superpowers as it pleases.

Delusion isn’t the same as fogginess. Fogginess on its own is just confusion, disarray, forgetfulness. Delusion is fog plus the illusion of clarity. Delusion isn’t confusion about what’s true—it’s full belief in what’s not true. When the Primitive Mind is fully empowered, it can turn reason down while jacking imagination up to the max—which can leave a person vividly believing crazy things, including the belief that their Higher Mind is the one doing the thinking and that what they believe has been fully vetted by reason.

The ability to admit a Trojan horse superglue story into our beliefs, via a fogged-out consciousness, was a strong survival trait—so strong that every person on Earth today is susceptible to it. We all have an inclination to believe in superglue stories—and if you think you’re an exception, you may be…a little delusional.

But as always, humans have a lot going on. As susceptible as we are to the Primitive Mind’s tricks, we’re also each the home of a determined Higher Mind—and no matter how many people believe a superglue story, there will always be clear-headed people among them.

That’s why even the stickiest superglue story is up against the odds—because the same thing that makes a story an efficient way to influence human behavior also makes it a vulnerable one. A commonly believed story can build the strongest of strong giants—but strength dependent on certainty is also brittle. Belief is a remarkable but cheap trick for controlling behavior, and cheap tricks can break down. All it takes is a particularly charismatic person with a new, even more compelling story to convert people away from the sacred story and create a schism down the middle of the giant.

If a giant relies on glue to survive, and that glue is generated by common belief in a story, any threat to that belief is like cancer to the giant. It can spread, and if it spreads far enough, the giant will fall apart. Stories whose hosts weren’t good at fighting cancer didn’t survive. That’s why the final superglue ingredient is the critical cancer-fighting tool.

Ingredient 4: A Cudgel

Meet the cudgel:

If there’s a common theme to all of human history, all over the globe, it’s probably humans bullying other humans. Bullying is one of the primary ways the Primitive Mind does business—humans doing business in a primitive format: the Power Games.

The Power Games basically go like this: everyone acts fully selfish, and whenever there’s a conflict, whoever has the power to get their way, gets their way. Or, more succinctly:

Everyone can do whatever they want, if they have the power to pull it off.

There are no principles in the Power Games—only the cudgel. And whoever holds it makes the rules.

The animal world almost always does business this way. The bear and the bunny from the beginning of Chapter 1 found themselves in a conflict over the same resource—the bunny’s body. The bunny wanted to keep having his body to use for being alive and the bear wanted to eat his body to score a few energy points from his environment. A power struggle ensued between the two, which the bear won. A bear’s power comes in the form of being a big strong dick. But power isn’t the same as strength. A bunny’s power comes in the form of sensitive ears, quick reflexes, and running (bouncing?) speed—and if the bunny had been a little better at being a bunny, he might have escaped the bear and retained the important resource.

Humans have power in numbers. That’s why tribe glue was so important in the ancient world. More glue = bigger tribe = bigger cudgel. And in the Power Games, a bigger cudgel is the means to every important end: safety, resources, mates, peace of mind.

Just as important as the size of a tribe’s outward-facing cudgel (the giant’s “military”) is the size of the one it points inward at its own members (the giant’s “police force”). One fights external threats—the other fights cancer.

The first three ingredients have an internal cudgel embedded in them: Tribalism generates peer pressure to conform and a fear of being labeled a secret member of Them and ostracized (or worse). Fear of the queen bee translates to censorship for any dissenters without a death wish. Identification with the story causes people to protect the story like they’d protect their own children.

A superglue story will usually go even further and write a cudgel right into its pages—it’s a jealous story that expressly forbids belief in other stories.

I’m sure some ancient stories were chill about things, upholding tolerance for a variety of ideas and beliefs. Stories like these would probably encourage discussion and debate around true vs. false or right vs. wrong, and they’d probably emphasize that people are not their beliefs and that different people can believe different things and still be good people.

But you can’t build a tight beehive around a tolerant story—and even if you could, when the Power Games are all around you, it’s only a matter of time before tolerance is trampled over by intolerance.

A successful superglue story has intolerance as a central value—declaring, as part of the story, that dissenters from within should be obliterated. This is where concepts like heresy and blasphemy and treason and apostasy came from, along with consequences like imprisonment, execution, and eternal damnation.

Just like a giant’s outward-facing cudgel, the internal cudgel is all about numbers. If a critical mass of people in a tribe wants everybody in the tribe to behave a certain way, they can bully the dissenters into submission.

The “critical bully mass” phenomenon can turn a made-up story that some people live in into the actual environment that everyone lives in. When enough people believe that there’s a god who wants death for anyone who says X, those who say X will actually end up dead. When enough people think saying Y means you’re not a member of the tribe, saying Y actually gets you excommunicated. If a story could alter the behavior of enough people via indoctrination, the believers would alter the behavior of the rest via intimidation. This creates a loop that can keep a story, once implanted, in control of a tribe for centuries.

The self-perpetuating indoctrination-intimidation loop is the story virus’s promised land. It’s the reason why so many stories seem to get stuck in human beliefs for ages, even as the species continues to enhance its knowledge of reality.

From Perfect to Perfecter

As the centuries passed, super-optimized superglue stories competed to out-perfect each other in a game of rapidly growing giants.

As our stories evolved, so did their hosts. Evolution is only slow because environmental change is usually slow. Could the insta-mutation capability of stories also have sped up our psychological evolution?

In a Power Games environment, humans with a natural pull toward tribalism and conformity, with strong imaginations and questionable reasoning, and with an instinct to please powerful people rather than defy them, may have been the best survivors. It would explain a lot about the world around us today.

Meanwhile, people inclined to be manipulated by stories are also people inclined to be manipulated by other people—and clever profiteers caught on.

They realized that brainwashing offered the biggest cudgel of all. If you could brainwash, you could write the story. If you could write the story, you could write reality. You could write the values, the morals, and the customs. You could write who the good guys were and who the bad guys were. You could write the rules, dole out the rewards, and inflict the penalties. And if you could write all those things, you could write people’s behavior. If you could brainwash, you could play god.

As human giants grew larger, the most skilled manipulators competed with each other to control the stories that controlled the giants. Some would claim knowledge of the divine as a means of grabbing the strings. Some would stoke fear with stories of imminent danger or invoke rage with stories about injustice in order to gather an army of supporters. Some would write stories of their own ruthlessness or their own merit or their own rightful status as queen bee—aiming for that sweet critical mass of indoctrination at which point defying the imaginary queen bee gets you actually beheaded.

Over hundreds of centuries, hyper-optimized superglue stories came to cover all the bases so thoroughly, they were able to do something biological evolution never could—convince masses of unrelated human strangers to cooperate. Rather than repress the human primal flame, these stories harnessed it, grabbed its reins, and redirected it—lining up individual flames in parallel lockstep, pointing them all in the direction dictated by the story.

With glue like that, we transformed our little primate giants into world-conquering beasts.

In a geological blink, a million animals scattered throughout the world’s forests became a billion people living in vast civilizations, wresting themselves from the animal world and conquering the food chain in a way no other animal had ever done.

And yet…

What did we really have to show for it?

We were still going through the same shit we were back in the ancient days—still stuck in the same old zero-sum power struggle the bunny and the bear were dealing with at the beginning of Chapter 1. We were still playing the Power Games. Everything had just gotten bigger.

We went from tribes ambushing each other’s settlements to kingdoms invading each other’s coastlines. From brutal warlords enslaving ten people to brutal planters enslaving 1,000. From clans fighting battles over desired patches of land—

—to empires fighting wars over desired continents.

We clawed our way to the top of Emergence Tower—

—only to still act like fighting tribes of primates once we got there. Same shit, bigger giants.

Of course, there were some major positives—we had made an unfathomable amount of progress. Cooperation on a mass scale made human knowledge and technology soar into the stratosphere, and in some ways, quality of life rose with it. The world’s superglue stories, for all their downsides and damage, were also the source of some of the wisest and highest-minded values in our history, and were at times the bedrock of peace and stability.

But we hit the crazily futuristic year 1700 AD and most humans were still living as cells inside some human giant, where the rules, the rights, and the resources were dictated by a few people at the top to everyone else below.

What that meant for almost everyone was that while your destiny in life would be partially shaped by your biology, upbringing, choices, and luck—it mostly depended on the Mr. Question Mark Man who happened to sit at the top of your giant and whatever Question Mark story happened to rule over its culture.

How the Question Mark ruler and Question Mark story felt about things like freedom, fairness, or your particular ethnic group would determine everything about how you were able to live. It was like drawing a card from a deck and hoping it was a high card. If you happened to draw a jack of hearts and be born the child of a noble in one of the giant’s upper castes, you might live a safe and enjoyable life. But more often, you’d find yourself with the 7 of clubs and spend your one life as an in-the-shit peasant, or you’d draw a 4 of diamonds and spend 40 hard years as a slave, or you’d draw a 2 of spades and be thrust at the age of 13 into the front lines of one of Mr. Question Mark Man’s foreign exploits and that would be that for you. Even if you did draw a decent card, you were always one heart attack or assassination of the leader away from a new queen bee taking control of the giant and reshuffling the deck.

For all our advances, we hadn’t advanced where it mattered most—the human world remained, like the rest of the animal world, a stressful place to be.

___________

Which brings us back to this odd creature.

When you consider human history as primarily the output of a software program—and when you consider the fact that that software program, unlike our rapidly evolving civilization, hasn’t really been updated in the last 10,000 years—it suddenly makes perfect sense that the global civilizations of 1700 AD would be acting out the same basic skit, on a larger scale, as the humans of the ancient past.

When you remember that the Primitive Mind cares about genetic immortality—not people—you’re reminded why it also shouldn’t be surprising that a species running on that software could develop an advanced civilization and find that life still sucks for most people. When the Primitive Mind is in charge, life will usually suck.

But how about the Higher Mind? Where the hell is he in all of this?

He’s stuck as a second-class citizen in the human head, that’s where.

The Power Games are the Primitive Mind’s output because the Power Games are the only way the Primitive Mind knows how to live. And given the heavy influence of the Primitive Mind in the human mind—and the Power Games’ knack for trumping any competing games out of existence—our species is pulled toward the Power Games with a continual force like gravity.

And here’s the problem—the Higher Mind is good at a lot of things, but he’s not good at the Power Games. The Power Games are survival of the fiercest, survival of the greediest, and survival of the conformist-est. They favor the tribal, the manipulative and the gullible, the bully and the bulliable—each of them right in the Primitive Mind’s wheelhouse. The Higher Mind just isn’t cut out for those streets. History is scattered with moments of Higher Mind triumph, but it typically was only a matter of time before the high-minded culture was trampled over by the stampeding Power Games.

If everyone simultaneously stopped playing the Power Games, the Higher Minds of the world might be able to take the driver’s seat for good, but in a world where some people are playing the Power Games, playing the Power Games becomes a survival necessity for everyone, which perpetuates the cycle. It’s a suffocating loop the Higher Mind can’t find a way out of.

But through it all—through the Ice Age and the Bronze Age and the Iron Age, through the rise and fall of empires, through the wars and plagues and genocides, underneath miles and miles of thick mental fog, the Higher Mind remained.

And just maybe, after a hundred thousand years in the back seat of the human mind, the tables would turn.

Chapter 4: The Enlightenment Kids

___________

To keep up with this series, sign up for the Wait But Why email list and we’ll send you the new posts right when they come out.

Huge thanks to our Patreon supporters for making this series free for everyone. To support Wait But Why, visit our Patreon page.

___________

Sources and related reading:

Yuval Noah Harari: Sapiens. Chapter 2 especially helped inform and crystallize the ideas in this post. Fascinating read for anyone who found this post interesting.

I don’t remember where I first heard Bret Weinstein talk about metaphorical truth, but here he is explaining it.

In researching the psychology of sacredness, and how it can be a lever for tribalism, I kept coming across the work of French sociologist Émile Durkheim. You can read about some of his major theories here.

Regarding the tension between strength and stability as human groups grow in size, you’ll often hear about Dunbar’s Number, which in pop culture has been simplified to the idea that 150 people is a kind of ceiling human groups run up against before losing the ability for intimate relationships to glue the group together. There has been a lot written about Dunbar’s Number—one piece I found interesting is a series written by Christopher Allen on his blog Life With Alacrity.

You can find the ongoing list of sources, influences, and related reading for this series here.