Westworld Showrunners Say Sentient Artificial Intelligence Is "Imminent"

I think... like children, computers can get a mind of their own. They should become better than us, and in some ways transcend us. [A computer] could leapfrog human intelligence. So our responsibility, as a creator who might be outpaced by our creation, is to teach them a code and parameters and values and hope that those values can guide their future choices. That's practically happening right now in the discussions in Silicon Valley about what you do with AI technology and what kind of fail-safes you implant.

Film and television have played with this question for decades. The vast majority of it has been a dystopian [vision] - AI is going to kill or enslave us. You started to see for the first time [the AI relationship] done beautifully in Spike Jonze's film Her, which detailed a love affair and the ways that it would be qualitatively different. We also tried to thread some of this into Person of Interest.



I've always been more interested to look past the apocalyptic scenarios - although, look at them - but then look past them and look at the ways in which AI will be childlike at first and very influenced by our culture and our values, and then might come to help us. That was the question for Person of Interest, and now with Westworld. The question is: Okay, step all the way through the question and into the other side, and ask not what will we think of [AI creations], not what will they do to us, but what will they make of us? How will they feel? How will they think?



I've been dealing with AI on different projects for many years now. There's an urgency here because we really think this is imminent... So we feel a certain amount of responsibility, and Stephen Hawking and Elon Musk and others are trying to hold up a sign and say, "Let's take a beat, just a beat," as we did with genetic research in the 1970s, and just talk about it. Let's just talk about what we're doing, because it's happening very quickly. At some anonymous office park in Mountain View or in Shenzhen or somewhere, someone is trying to very urgently deliver what we would consider a true artificial intelligence. And we're not culturally talking about this because we've thought of it for so long that we haven't bothered to have a conversation about what this thing will be before we make it, which seems like a mistake.

HBO's Westworld, which takes place in an android-filled park that allows people to enact all of their darkest vices, is the latest in a long string of recent movies and TV shows that explore the notion of sentient artificial intelligence. But how likely is it that we'll see a Westworld-like situation happen in real life? According to Westworld showrunners Lisa Joy and Jonathan Nolan, robots with thoughts and feelings will likely arrive in the near future.

Recently, we had a frightening brush with responsive, adaptive AI when Microsoft launched a chatbot that became racist, sexist, and genocidal within 24 hours. According to Joy, all machines are products of their creators and their environment, much like human children, which means we have a moral responsibility to make sure any sentient AI is a positive force.

But in spite of the obvious and oft-discussed dangers of AI, Nolan thinks that it has the potential to be either a threat or a benefit, or both. As a result, the many dystopian and post-apocalyptic works of fiction that discuss the ramifications of AI tell only half of the story.

And we'll need to figure out the answers to these questions soon, since Silicon Valley is on its way to creating a real-life artificial intelligence. Nolan, for his part, thinks that Westworld-like AI is "imminent."

Westworld premieres on HBO on Sunday, October 2.