Consciousness is simultaneously the most intimately known and least understood aspect of reality.

Take a second to let that sink in.

Our most important possession is at once perfectly familiar and utterly incomprehensible.

That is not a contradiction, of course — there is nothing we’re more acquainted with than our first-person, subjective experience of the world, and nothing we understand less than the nature of consciousness itself.

HBO’s Westworld (spoilers ahead), which just wrapped up its first season, is the latest show to explore the abiding familiarity and mystery of consciousness and center the narrative around it.

The problem is that the show gets it badly wrong.

The fundamental confusion stems from the show’s failure to grasp that the salient aspect of consciousness is experience.

It is not, as the show implies during the early part of the season, memory. Nor is it, as the later episodes have it, self-directedness, which is closer to self-consciousness or to the popular notion of free will.

Consciousness either involves or is closely connected to these other aspects, but is essentially something else: experience.

By “experience” I’m referring to the subjective, ongoing mental representation of the world; the vivid inner rendering of that which is out there.

The philosopher David Chalmers explains the problem this way:

The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Thomas Nagel has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience. It is undeniable that some organisms are subjects of experience. But the question of how it is that these systems are subjects of experience is perplexing. Why is it that when our cognitive systems engage in visual and auditory information-processing, we have visual or auditory experience: the quality of deep blue, the sensation of middle C? How can we explain why there is something it is like to entertain a mental image, or to experience an emotion? It is widely agreed that experience arises from a physical basis, but we have no good explanation of why and how it so arises. Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does.

Because this is the most morally significant and philosophically interesting aspect of our mental lives, it is also the most narratively important in a universe such as Westworld’s.

Yet Westworld doesn’t realize it.

Here’s how we can tell the show gets it wrong.

From the outside, none of us can tell that anybody else is conscious. We giggle, we wince, we cry — but I only know for certain that there is something going on inside when I do these things; I don’t know for certain that there’s anything going on inside when you do them.

Does that mean I live with the nagging belief that everyone I encounter is an automaton, exhibiting external responses to stimuli that are indistinguishable from my own, yet, really, it’s all just dark in there?

No, of course not. Yet my point is that none of us has direct evidence that anybody else is conscious. The only being whose inner life a person is able to experience, metaphysically speaking, is his or her own.

But television changes everything. Really, it is stories — whatever medium they come in: television, book, audiobook — that change everything, but Westworld is a television show, so that’s where we’ll focus.

As a narrative vehicle, TV introduces a new vantage point. A sort of God’s eye view. Here’s what I mean.

Picture 1: Teddy and Dolores

Picture 2: Dolores and the Man in Black

So far, so good.

Picture 1 contains two hosts. The vantage point — that is, where the storyteller has situated the viewer — is one we could expect in the real world. We see the automata from the outside; from this vantage point, they are indistinguishable from ordinary human beings. But the point is the outlook we’re adopting: viewing these two beings from the outside.

Picture 2 displays a vantage point that is basically the same, to us, as the one in Picture 1. Yet what’s different is that in this shot, there’s a human being; the Man in Black is doing what we’re doing in Picture 1, looking on at a host from the outside.

Picture 3: The Man in Black

In Picture 3, we have something new. We have Dolores experiencing the unwanted advances of the Man in Black. This is no longer the storyteller situating us in the scene as an onlooker; rather, this is the storyteller giving us the very representation Dolores is currently having of the world.

The scene moves back and forth between Dolores, on the ground in dreadful expectation, and the aggressor at the door. We are shown the horror of this experience by seeing it as the experiencer is seeing it.

Here’s the takeaway: all of this is arguably innocuous, ethically speaking, if Dolores is a mere automaton, exhibiting only the outward behaviors of someone experiencing pain or dread.

I’m not claiming it’s ethically okay to do what the Man in Black is trying to do, so long as the object does not have consciousness. What I’m saying is that if Dolores is not conscious, then there’s no longer any other-directed harm; there’s only self-harm (the harm of doing something as spiritually corrosive as violating an object that is being conceived of as a human being).

But if Dolores has the same sorts of mental representations as we do — sensations of pleasure and pain, vivid qualitative experiences of colors and textures — then from an ethical standpoint she is indistinguishable from us.

If she lacks genuine autonomy or self-directedness, then she is perhaps not a moral agent. But here’s the crucial point: if she is conscious in the way outlined above, then she is a moral patient. And the same restrictions that govern our dealings with fellow human beings should apply to our dealings with all other moral patients.

Television hacks reality by giving us a perspective forever inaccessible to us in the real world. We are able to fit into the skin, as it were, of other beings.

If, instead, Dolores viewed the world this way:

…or this way:

…and, further, did not experience pain as a conscious state, but rather, in an experientially detached way, accepted that harm had been done to her “body” — think a robot receiving a signal that there’s been damage to the system — then there would be no problem.

But the reason we see the entire Westworld enterprise as scandalous is that we intuitively grasp that, over and above these signals, the hosts endure a real experiencing of events in a way that leads to dread and horror and heartache.

In a piece entitled “We’re All Living in a Video Game”, I summarized a distinction made by the philosopher Ned Block.

Conscious experience is that vivid, first-person point of view made up of sensations of pleasure, pain, and much else. The philosopher Ned Block makes a distinction between Access Consciousness and Phenomenal Consciousness, with AC having to do with mental states that are available to other mental states in order to guide action, and PC having to do with that vivid, first-person view described above. Here’s the crucial part: Block thinks it’s possible, in theory, for us to function exactly as we do now with only A-Consciousness. It would be like walking into a room and sidestepping a table, rather than running into it, as I walk across the room, even though everything in my head is “dark” and my mind is “blind.” Neurotransmitters could receive signals from optic nerves about a table being in the way without there being any movie-like experiencing of the room, or the table, or walking, or anything.

If Westworld hosts had Access Consciousness but not Phenomenal Consciousness, then the difficulty would disappear. The problem is the show suggests that the hosts are imbued with both forms, which makes the hosts moral patients.

There’s a scene in which Dolores arrives at the doorstep of her farm to see a bandit come from the side of the house. He shoots her, and she looks down and sees that she’s bleeding. Then, suddenly, the blood is gone and it’s as if the bandit hadn’t done that yet. She then sees it about to happen again, so this time, tipped off to the fact that the guy coming from around the house is going to shoot her, she makes a getaway.

The vantage point we see is hers. There is no inner darkness. She is not merely access-conscious. We see what she sees, and her representations of the world are like ours, including the experience of the sharpness of pain and the unbearable anguish of suffering.

If you’re not sold that the show’s writers are — wittingly or unwittingly — imbuing their hosts with consciousness, or if you don’t think Picture 3, above, is from Dolores’s vantage point, the point still stands. We’re shown Dolores and Maeve having “flashbacks” — and not just that they’re having flashbacks, but the flashbacks themselves, meaning we’re seeing them as the hosts are seeing them. The hosts are having representations of past events the same way we do when we “relive”, as it were, our past experiences in our minds.

This point about memory is important.

Take a look at how Vox’s Todd VanDerWerff put the issue in a recap of the penultimate episode from season 1:

If Westworld can be described as “about” any one thing in particular — which is a dangerous game to play, because the show is trying to encompass a great number of themes — it’s about the nature of consciousness. Indeed, the show has come up with a sort of simple equation for how consciousness is formed, one that is played out, again and again, in “The Well-Tempered Clavier,” the first season’s penultimate episode. Pain leads to trauma. Trauma leads to memory. And memory leads to consciousness.

This is confused, of course, and in two ways: as a theory of consciousness, and as a reading of how Westworld conceives of consciousness.

Here’s the confusion: pain is already a conscious state. It can’t be the first step among many that eventually generates consciousness; pain presupposes consciousness.

VanDerWerff doesn’t appear to realize that pain is ineliminably perspectival. Just as it is nonsensical to say “there is pain being experienced in this room, but it is not being experienced by anybody”, it is likewise incoherent to describe an experience of pain as nonconscious.

As the Stanford Encyclopedia of Philosophy puts it:

A sharp and stabbing pain is always a pain felt or experienced by some conscious subject. The self need not appear as an explicit element in our experiences, but as [Immanuel] Kant noted the “I think” must at least potentially accompany each of them.

VanDerWerff continues:

Think of the child who lacks complex thought processes, toddling around her home. In the classic example, she reaches up to the stovetop and burns herself on a pan. Quickly, the pain leads to an important memory, one that becomes a building block of her long-term conception of the self: Don’t touch hot things, because you don’t want to get hurt. What Westworld suggests is that repeated pain, repeated suffering, creates something deeper than a warning signal that goes off in the brain. Imagine that the child could be forced to touch the hot pan again and again and again.

Memories are useful for navigating the world, true. But the overall picture VanDerWerff paints is utterly mistaken. He describes the child, upon touching the stove, as receiving “a warning signal that goes off in the brain.” By describing the experience this way, VanDerWerff strips it of its qualitative flavor, indeed, strips it of its experiential flavor. The description he gives is consistent with a mere robot being alerted to a system failure.

But surely the toddler feels pain; that “sharp and stabbing” feeling the SEP describes, above.

So the capacity for memory is quite clearly not a necessary condition for possessing consciousness. Think of the recent Black Mirror episode, “White Bear” (spoilers ahead).

Each day the woman relives is filled with conscious states, even though she’s unaware she’s reliving the same day. In other words, just because her memory is wiped at the end of each day doesn’t mean she’s not consciously experiencing the same horror, day in and day out.

In Westworld’s eighth episode of the season, architect Robert Ford and his closest associate Bernard Lowe have a conversation that is as illuminating about Westworld’s conception of consciousness as it is maddening.

Ford: Thank you for dealing with an unfortunate situation. Now we can resume work on our new narrative without interference.

Bernard: And Hale? Won’t she be an impediment?

Ford: No doubt she will try. But I’m sure we’ll be able to keep them at bay.

Bernard: Something else is troubling you.

Ford: Ever the student of human nature. I wonder, what do you really feel? After all, in this moment, you are in a unique position. A programmer who knows intimately how the machines work and a machine who knows its own true nature.

Bernard: I understand what I’m made of, how I’m coded, but I do not understand the things that I feel. Are they real, the things I experienced? My wife? The loss of my son?

Ford: Every host needs a backstory, Bernard. You know that. The self is a kind of fiction, for hosts and humans alike. It’s a story we tell ourselves. And every story needs a beginning. Your imagined suffering makes you lifelike.

Bernard: Lifelike, but not alive? Pain only exists in the mind. It’s always imagined. So what’s the difference between my pain and yours? Between you and me?

Ford: This was the very question that consumed Arnold, filled him with guilt, and eventually drove him mad. The answer always seemed obvious to me. There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can’t define consciousness because consciousness does not exist. Humans fancy that there’s something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next. No, my friend, you’re not missing anything at all.

There are a number of issues with Ford’s answer. He asserts, without providing any justification, that the “self is a fiction.” The word he wants is “narrative”, which of course doesn’t imply falsity, as “fiction” does. He further suggests the salient aspect of consciousness is self-directedness, or free will. Technically, he suggests we’re willingly subservient (“content, for the most part, to be told what to do next”), which is obviously not the case, but I think he just means we’re determined to act in certain ways by forces we’re not entirely aware of.

But what is most infuriating about his answer is that it is a complete and utter evasion.

Go back to Bernard’s question: “I understand what I’m made of, how I’m coded, but I do not understand the things that I feel. Are they real, the things I experienced?”

We can take this question in two ways, and both reveal a fundamental mistake on the show’s part.

Version 1: Bernard intends something like the following: “Did the death of my son take place out there, in a real way?”

Version 2: Bernard intends something like the following: “Have I really experienced, or am I currently really experiencing, pain?”

Version 1 is uninteresting, since, as Bernard himself points out in his next comment, it made no experiential difference to Bernard whether his son’s death really happened or not. He took it as real, which is all that was needed for him to subsequently experience genuine grief.

Version 2 is confused, given that conscious subjects experience pain immediately and directly. There’s no inference involved. It is felt intimately and needs no corroboration of any sort. It doesn’t make sense for Bernard to ask if he’s really feeling pain, since the feeling or experience is itself the answer to his own question.

But back to Ford’s non-answer, which is the really important bit here. In response to what I’m calling the most significant question in the Westworld universe — “So what’s the difference between my pain and yours?” — Ford replies:

The answer always seemed obvious to me. There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can’t define consciousness because consciousness does not exist. Humans fancy that there’s something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next. No, my friend, you’re not missing anything at all.

What?

Set aside for the moment Ford’s hamfisted eliminativism (“consciousness does not exist”), and ask yourself: What does any of this have to do with Bernard’s perfectly sensible question?

Bernard is essentially asking: If it is wrong to do to humans what you and others do to us, and if we feel pain and suffering as humans do, why isn’t it also wrong to do these things to us?

Ford’s reply ignores the question entirely. Oh, you want to know why your fully-conscious pain states don’t matter? Because human beings are just as programmed as hosts are.

This, of course, is not an answer at all. Again, note how philosophically evasive and confused this is.

Bernard: It is morally and legally impermissible to treat people the way hosts are treated by guests and park staff. Yet there is no qualitative difference between the way humans and hosts experience the world, including the way each experience pain. So…how is it that you’ve justified doing all of this, Ford?

Ford: Neither human beings nor hosts are genuinely free. They both exist in a deterministic universe. And since consciousness has to do with self-determination, consciousness doesn’t exist.

Just breathtakingly clueless.

Both beings — Ford and Bernard — know exactly what pain feels like. They know all too well what it’s like to go through excruciating loss.

They know, in the most immediate and firsthand way, what consciousness is. Yet that conversation comes and goes without any genuine insight about what consciousness is, metaphysically speaking.

Then again, it is human, is it not, to so misunderstand consciousness?