Note: This article is from 2017.

Suggested listening while reading: Pets by Porno For Pyros (1993) - Spotify - YouTube

One of my favorite science fiction series is the Culture Series by the late Iain M. Banks.

Banks imagines a complex, vast, sprawling universe. He takes this down to a single set of characters, a single set of threads. Each story is just this narrow slice, but it sits in a much grander imagination. Much of what makes his stories so impressive is what you never get to see.

The series follows "The Culture," a quasi-hedonistic, post-scarcity galactic society where humans, drones and AI "Minds" co-exist without any true rule of law, all as equal citizens.

The Minds represent what many of us would think of when we consider the future of AI. Hyper-intelligent, sentient beings. As starships, they ferry people around the universe. As "hubs" controlling habitats they act somewhat like benevolent dictators. Otherwise, they are AI of unknowable intelligence.

The Culture leaves humans and machines to their own devices: art, travel and other pursuits. More or less. As a narrative needs, conflict emerges. Like most fictionalized utopias, it's a little more contradictory than that. The Culture is anarchist yet organized, pacifist yet full of military prowess. Banks writes on the fulcrum between idealized futurism and swashbuckling space opera.

Regardless, Banks paints the broad strokes as an idyll. The Culture is, at its core, a rational, progressive civilization. In many ways, it represents Banks' best vision of a pragmatic and practical utopia.

From motors to motivation

Fan as I was, there was something that always bothered me. Given hyper-intelligent AI with nearly limitless resources, why do they keep humans around at all? What is our function in this pan-galactic society?

One answer is that the people are a literary device for Banks. They are a window into the story, a way to let us take part as readers. We can't hope to follow the story from the perspective of a hyper-intelligent Mind, although Banks does attempt this.

But there is something more than that. For a metaphor, I reached for gut bacteria, of all things.

We spend a lot of time in our own heads. It's easy to think we're in control of our mood, thoughts, and desires, but we're a complicated cocktail. Gut bacteria are a classic example, and a still-growing area of research. This relationship can affect our mood, our health, and how we perceive the world. The name for it is the "Gut-Brain Axis." We're told to trust our gut; it turns out that may be more literal than we thought.

Opera needs drama and space opera puts this on a grand scale. In Banks' vision of The Culture, the Minds need something to do. Banks needed a relatable protagonist. Plus he foresaw the need for Minds to have an energetic motivation. An agitation that leads to action. In the story, humans represent something primitive. But they also represent the drive, the gut of the story.

Banks echoed this sentiment in his Usenet post "A Few Notes on The Culture":

Humans … are unnecessary for the running of the starships, and have a status somewhere between passengers, pets and parasites - Banks.

We evolved to compete and succeed. When you step back, this is less necessary for an AI. You've got to wonder why they do anything at all. Meanwhile, humans are capricious. In Banks' universe, they add a sense of fun, drama, interest, and unrest.

Culture AIs are designed to want to live, to want to experience, to desire to understand, and to find existence and their own thought-processes in some way rewarding, even enjoyable - Banks.

I won't unpack "A Few Notes" too much; if you're familiar with Banks, it's worth reading. For now, there are a few key hypotheses to draw out. One pillar is that the universe is vast, on a near-limitless scale.

Through that lens, an open, curious mindset will flourish. Minds are curious, gregarious creatures, and successful for it. In Banks' conception, this is a required condition rather than a noble one.

A less curious mindset doesn't explore and doesn't grow. And when resources are not scarce, aggression is simply overhead, and perhaps a terminal one. There is little benefit in fighting when you can achieve the same ends without it. Banks' thesis isn't that the rational Culture is morally better; he views it as necessary.

The Culture, in its history and its on-going form, is an expression of the idea that the nature of space itself determines the type of civilisations which will thrive there - Banks.

We'll make great pets?

There is a profound discomfort in being something akin to a pet. We like to feel we are agents of our own destiny. But co-existing with machines does not diminish that at all. If anything, this irrational judgment of our place in the universe is our biggest self-inflicted hang-up.

If you still find the idea of being a pet alarming, then that is one thing that makes you human. There is a beautiful irony in that.

In summary, we make our own meanings, whether we like it or not - Banks.

Also, question the supposedly unique place of hyper-intelligent AI in "dethroning" humanity. In many ways, we've already done it ourselves. We already live within systems. Each of us is already party to financial, social and political systems, many of which are too complex for us to understand. Certainly en masse.

Even in cases where we do understand, we often have limited control. And yet, these are systems we readily accept. We're already part of many different machines, albeit not hyper-intelligent AI-driven ones.

So, when does AI take over these systems? Algorithms already drive everything from stock trades to what we see in our media. It's already happening. The question may be how intelligent we'll let these systems get. Assuming we have any control or insight into that in the first place. What are the limits, safety nets, and tradeoffs we'll make along the way?

There is a neat and topical example in self-driving cars. For us to buy in, do they need to be perfect, or just better than people? The former seems impossible; the latter could already be here. I argue this case in my article on self-flying planes.

But, let's avoid local minima?

Before we get too carried away with utopia, there are more near-term issues. The first concerns who controls AI. Unchecked, that control could give an unassailable advantage. This has been true of many technological advances, and it is true here.

The second concerns rogue AI overthrowing or compromising human civilization, either by design or by accident. A Skynet-esque catastrophe. Elon Musk is one of the most outspoken on the risks.

Musk's fears are not unfounded. Technology and war go hand-in-glove. Or hand-in-gauntlet. Innovations become weapons, eventually. And to be clear, it's already happening. It's a common disaster scenario; AI's regular appearance in post-apocalyptic movies is an easy demonstration of that.

Yet, humanity is the biggest agent in its own potential destruction. We already hold the technology to destroy ourselves many times over. As I write this, the Doomsday Clock is shifting ever closer to midnight; it stands at two and a half minutes right now. The circumstances for a global catastrophe already exist. AI can make this better or worse. It's possible that AI will let someone galvanize these destructive forces, either by intent or, as seems a common narrative, by tragic accident. The Skynet scenario.

In software, we talk about "leaky abstractions." Abstractions are intended to hide complex internals, but by only partially hiding that complexity, they can make a system harder to understand. In building AI, humanity is the leaky abstraction.
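A toy sketch of the idea in Python (the class and its behavior are my own illustration, not from the Culture or any real library): a bounded cache that presents a plain mapping interface, hiding its eviction policy. The abstraction leaks when an old key silently vanishes after the capacity is exceeded.

```python
from collections import OrderedDict

class BoundedCache:
    """Looks like a plain mapping, but the abstraction leaks:
    once capacity is exceeded, the oldest key silently disappears."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def __setitem__(self, key, value):
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the oldest entry

    def __getitem__(self, key):
        return self._data[key]

    def __contains__(self, key):
        return key in self._data

cache = BoundedCache(capacity=2)
cache["a"] = 1
cache["b"] = 2
cache["c"] = 3          # "a" is silently evicted here
print("a" in cache)     # False: the mapping interface hid the eviction
```

Any caller treating this as an ordinary dictionary will eventually be surprised; the hidden internals shape behavior in ways the interface never admits.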

The best defense is strong systems thinking: creating robust, resilient systems that prevent this from occurring. Ironically, the best solution appears to be AI itself. Through this lens, the biggest risk is not what we build, but when and in what order. We need the basic tools to understand AI before we unleash it on ourselves.

Banks alludes to The Culture emerging from "periods of mega-structure building and destruction." Much of the Culture lives aboard huge vessels, called General Systems Vehicles (GSVs):

All that the Culture knows, each GSV knows; anything that can be done anywhere in the Culture can be done within or by any GSV. In terms of both information and technology, they represent a last resort, and act like holographic fragments of the Culture itself, the whole contained within each part - Banks.

Abundance and post-scarcity sit at the heart of Banks' vision. Here he also alludes to redundancy, and on a grand scale. We're missing both.

It's worth noting that Musk's SpaceX names its drone ships after craft from the Culture Series. Perhaps Musk is letting us know that, right now, we've only got one shot at this.

Back to the gut

For all its utopian glory, there is something sinister about The Culture stories. The protagonists are smart, versatile, brave. Yet there is always this underlying thread that they don't understand their own context. There is always more going on than the protagonists, and the reader, get to see. Banks is telling us something profound there, more about us than about the future of AI.

The point is, humanity can find its own salvation - Banks.

For a civilization that is so ordered, his stories can be surprisingly messy.

Which brings us back to gut bacteria. It's possible that the characterization is simply mean. The positive spin is that your gut may well have influenced how you feel about it. Right now it's only a metaphor. We still get to choose. Even better, we also get to choose how we feel about it.

Perhaps humanity is the non-sequitur to the existential angst that awaits AI.