Like the reunion tours by long-disbanded rock groups whose members dislike each other and the endless parade of remade hit movies from decades past, the vogue for reimagining old science-fiction narratives seems like evidence of an entertainment industry prone to expecting nostalgia to do the work of creativity. Look what’s become of poor Gene Roddenberry’s legacy, with the Star Trek universe long since reduced to the condition of an oil field pumped dry by ill-conceived movies and spin-offs. There’s nothing left for the studio to do except reimagine the original series itself: “To boldly go … where we’ve been in syndication all this time.”

So, with all that now out of my system, I’m ready to tell you about Westworld and Philosophy (Wiley-Blackwell), a collection of articles edited by James B. South and Kimberly S. Engels, to be published shortly before the second season of the HBO series begins in late April. I knew the premise from seeing the original Westworld in the theaters in the early 1970s: the idea of an Old West-themed vacation resort with robot cowboys that went haywire and started killing guests was memorable enough, plus it had a brief scene involving a partially nude robot saloon girl, which filled my young nerd mind with wonder at the possibilities. But taking the premise out for another spin all these decades later did not sound promising.

Nor did the idea of reviving Battlestar Galactica, a cheesy Star Wars knockoff from the late 1970s, sound promising when the new series began in 2003. As it turned out, I was wrong -- totally and instructively wrong. The rebooted Battlestar Galactica, with its monotheistic androids conducting a holy war against the polytheistic humans who had created them, caught the prevailing tone of American public life after Sept. 11 without simply giving in to the traumatic affect and the Manichaean thinking of that era. The reimagined version was more inspired than the original had ever been.

Likewise with Westworld, albeit in a more anticipatory mode. Here the androids embody (to use a highly pertinent verb) the potential for bioengineering and artificial intelligence to merge in ways that feel uncomfortably plausible, if not quite imminent enough to keep anyone awake nights just yet. Anthologies in the pop-culture-and-philosophy genre are always a mixed bag, and too often the prospect of extracting dollars from a fan base seems to count for more in a book's existence than any reward to be had from thinking about the material.

But in the case of Westworld, it's fairer to say that the series keeps questions about mind, personal identity and ethics in play at all times. The "hosts" (androids) populating the vacation resort are programmed with artificial memories, giving them realistic personalities as they interact with "guests" in the course of various Western "narratives" (romance, raising a posse, staking a mining claim, etc.), also programmed by Westworld's corporate management. Artificial intelligence enables the hosts to respond with appropriate emotions and actions as each narrative unfolds in real time. Indeed, they are so perfectly lifelike as to be effectively indistinguishable from the dude-ranch clientele, who are free to behave toward the hosts in any way they want, without risking the consequences they'd face in the outside world. After a narrative has run its course, the hosts' short-term memory caches are cleared so they can be deployed again with another bunch of guests/customers. (The imaginary engineering in the series is much more sophisticated than in the original movie, in which it sufficed to show Yul Brynner's face being removed to reveal all the transistors.)

Without any spoilers, it's possible to say that certain glitches and hacks affect some hosts as they start to process unerased data and "remember" how past narratives have played out. I use quotation marks there because the hosts, after all, are extremely complex cybernetic units. But then, what are the guests, or the viewers, for that matter? If a machine were able to take in the same stimuli as a human nervous system, process them in ways modeled on the cognitive capabilities of the human brain and respond with language and behavior as complex and variable as those of a fully socialized adult, how meaningful would the distinction be?

Philosophers and screenwriters follow the conundrum in different directions, of course, but Westworld displays a surprising awareness of where the philosophical discussion has already gone. The viewer may feel compelled to humanize the hosts -- to imagine the androids as somehow, at some level of complexity, generating an interiority, a sense of self. But onscreen, we find Robert Ford -- the creator of the technology, played by Anthony Hopkins -- considering that idea from the other end:

"There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can't define consciousness because consciousness does not exist. Humans fancy that there is something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next."

This stance has a perfectly legitimate pedigree, as Michael Versteeg and Adam Barkman point out in their chapter in Westworld and Philosophy, "Does the Piano Play Itself? Consciousness and the Eliminativism of Robert Ford." It derives from the philosophers Paul Churchland and Daniel Dennett, who overcome the old problem of finding a bridge between mind and matter by advancing a thoroughgoing materialism. "Consciousness" is, in effect, a fancy word for what certain really complex nervous systems (ours) do in the course of running themselves.

Which is one way of kicking the old Cartesian can further down the road. Westworld doesn't endorse eliminativism but rather imagines a world in which it is a very consequential idea for the lives of people not involved in the philosophical profession. It also permits and encourages a number of ethical thought experiments: guests are able to commit acts of violence and murder without breaking any law and -- what seems more troubling -- are free to do so without any obligation to think about the suffering of the androids, however realistic it may be. (All the recreational mayhem of today's video games, but in person!) A chapter by one of the book's editors, Engels, draws on Sartre to analyze the effect on one character of choosing to enjoy the simulated death and horror. Another chapter considers the growing awareness and resistance of the hosts in terms of Frantz Fanon's writings on the violent struggle of colonized people against dehumanization.

The editors kindly provided me with many of the chapters for an early look, and it is clear that the concepts and questions they explore were already on the minds of Westworld's writers. The contributors flesh out the background and the logic of an imagined world in which, as with a car's rearview mirror, objects may be closer than they appear.