I can’t decide if that is a scary thought. I feel both excitement and trepidation. Westworld made me think deeply about consciousness in AI. The scriptwriting on that show is — for lack of a better word — amazeballs.

Spoilers ahead, but season 2 comes out this weekend so watch season 1 soon!

Westworld = AI empathy

Westworld tells the story of human-like ‘artificial life’ that is not conscious. Dolores is a ‘host’ in the theme park that is Westworld. Hosts look exactly like humans and express complex emotions, though those emotions are largely preprogrammed. They go through a well-defined set of actions and have a limited range of conversation. They look and feel real, but they don’t have consciousness yet, so humans visiting the theme park (“guests”) think it is ok to kill and fuck them. Hosts have no memories other than those given to them as a suitable backstory.


The show captures the mundane, day-to-day detail of an artificial life with limited consciousness. Dolores drops a can of food while trying to load her groceries into her horse’s saddlebag. A man picks it up and gives it back to her. She wakes up and walks out to her father sitting on the porch of their house to have a conversation. Hosts exhibit near-human complexity in their behavior. If Westworld were a game, the hosts would be the ultimate NPCs. Sans consciousness.

Since Westworld is a TV series, the writers took the opportunity to develop the AI as a character. Hollywood AI has mostly lived in movies; this is one of the first times we see an AI character portrayed with such depth.

This meant that I experienced AI empathy while watching the show.

When Dolores points her gun at humans (‘guests’ in the theme park), I don’t know if pulling the trigger is the right thing to do! She doesn’t either. I lived through her experiences. I really wanted her to be happy and safe. I never felt this way about the Terminator, or even about HAL 9000.

The pyramid of consciousness

The show asserts that consciousness will emerge from something. There’s no consciousness program being downloaded onto a metallic android. Arnold’s pyramid of consciousness gives us some insight into how the writers thought about this emergence.

At the bottom — memories. Then improvisation. Self-interest to direct your actions. And the top layer, erm — we don’t know exactly. There are hints throughout the series, but we are not sure.

Aside from the ambiguous top layer, this makes intuitive sense. Memories are experiences, and I subscribe to the theory that the brain is a prediction machine, much like a modern-day neural network. Only far more complicated: it processes inputs from the body, regulates bodily functions, takes input from the sensory organs, and runs your inner monologue (the one that tells you that you don’t know what you’re doing with your life). Experiences, memories and dreams must act like training data for this prediction machine.
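To make the analogy concrete, here is a toy sketch — entirely my own illustration, nothing from the show — of a “prediction machine” that treats memories as training data and predicts what usually comes next. The class and method names (`PredictionMachine`, `remember`, `predict`) are invented for this example:

```python
# A toy "prediction machine": it learns from remembered sequences of
# events and predicts the most frequently observed next event.
from collections import Counter, defaultdict

class PredictionMachine:
    def __init__(self):
        # Maps each event to a count of the events that followed it.
        self.transitions = defaultdict(Counter)

    def remember(self, experience):
        """Store a sequence of events (a 'memory') as training data."""
        for current, nxt in zip(experience, experience[1:]):
            self.transitions[current][nxt] += 1

    def predict(self, event):
        """Return the most likely next event, or None if never seen."""
        following = self.transitions.get(event)
        return following.most_common(1)[0][0] if following else None

brain = PredictionMachine()
brain.remember(["wake", "porch", "talk", "town", "drop can"])
brain.remember(["wake", "porch", "talk", "ride"])
print(brain.predict("porch"))  # -> talk
```

It is a caricature, of course, but it captures the shape of the idea: the more memories the machine accumulates, the better its guesses about what comes next.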

Dabbling with AI means (eventually) dabbling with consciousness

The creators of the theme park thought that there was a good chance the hosts would eventually become conscious. The two creators, Arnold and Ford, design for this in different ways.

The basic approach is derived from Julian Jaynes’s theory of the bicameral mind. The theory postulates that humans became conscious (that is, aware of their own awareness) only about 3,000 years ago. Before that, the ‘inner voice’ was experienced as divine instruction. Eventually, humans grew aware that the inner voice was their own and that they had control over it.

Richard Dawkins called the theory “either complete rubbish or a work of consummate genius”.

The show drew me in because the underlying plot line is how Arnold and Ford approach the design problem of getting consciousness to emerge.

Arnold and Ford’s idea was to bootstrap consciousness by providing an ‘inner voice’ that gave instructions, hoping that consciousness would emerge the same way it did for humans under the theory of the bicameral mind.

Designing for consciousness

Hosts start off with their memories being wiped periodically, so there are no experiences to build on. The Reveries update results in hosts keeping some memories, deviating from their somewhat scripted existence (“improvisation”) and starting to discover themselves.

And then what?

Let’s put ourselves in Arnold and Ford’s shoes. We have built something that looks, feels and behaves like a human. No consciousness yet, but we think it is coming. We feel responsible for making sure that it happens in the “right way”. After all, this is sentient Artificial General Intelligence we are talking about.

We unlock memories, which leads to some improvisation in behavior. But here is the hard part: how do you evolve a purpose? The awareness that the inner voice is yours. A sense of self. Free will in your actions.

Arnold asks Dolores to kill him and a number of other hosts. We don’t know exactly why; perhaps he hoped that Dolores would refuse her inner voice’s instructions? Or he wanted to prevent the park from opening? Ford designs a maze within the theme park, hoping that in solving it, Dolores will gain consciousness. This fantastic article lays out how it happened in the final episode of season 1.

What can we expect from season 2? With consciousness comes unpredictable behavior, and that is exactly what the writers promise. Chaos takes control.

Is Westworld possible?

Maybe.

Delos is the company that built Westworld. In the show, Ford says they worked on Westworld for 35 years, and that the AI passed the Turing test one year in. Westworld takes place in 2063, so if the show’s timeline plays out in the real world, Ford and Arnold start working on it around 2028 and pass the Turing test around 2029, which happens to match Ray Kurzweil’s estimate of 2029 for passing the Turing test. Whoever starts that company will clearly be very good at AI, neuroscience, nanotechnology, or a related field. And 2028 is only ten years away.

People are getting good at AI, neuroscience and nanotechnology right now. Even people just entering these fields have a decent shot: ten years is plenty of time to get somewhere. And you don’t need to be at the very top of the field, because starting a company and succeeding is about more than just being technically good.

That could be anyone. It could be you!

Whoever it is, please: watch Westworld, think deeply about the consequences of your actions, and have empathy towards machine intelligence. We don’t need a Battlestar Galactica future.

— — — — —

If you enjoyed this article, feel free to hit that clap button 👏 to help others find it!