If we did create conscious beings, conventional morality tells us that it would be wrong to harm them — precisely to the degree that they are conscious, and can suffer or be deprived of happiness. Just as it would be wrong to breed animals for the sake of torturing them, or to have children only to enslave them, it would be wrong to mistreat the conscious machines of the future.

But how will we know if our machines become conscious? Descartes argued that one’s own consciousness is beyond any possibility of doubt. In the case of others, we’re never absolutely sure. Many of us have entertained, if only for a moment, the idea that everyone else might be a zombie: laughing, crying, complaining, rejoicing, but with no one home. Perhaps scientists will eventually discover the signature of consciousness, and then we will be able to test for it in our robots, as well as in animals and one another. But it is certain that we will build machines that seem conscious long before we arrive at that point.

Anything that looks and acts like the hosts on “Westworld” will appear conscious to us, whether or not we understand how consciousness emerges in physical systems. Indeed, experiments with AI and robotics have already shown how quick we are to attribute feelings to machines that look and behave like independent agents. Multiply that temptation to anthropomorphize a thousandfold: Think not of a machine with visible wires, cartoonish eyes and a voice that sounds like Siri but of a beautiful stranger who engages you in intelligent conversation and who may be more aware of your emotions than your spouse or best friends ever were. It would be irresistible to see this creature as a person, whatever your philosophical views and regardless of what its creators told you about how it was built.

This is where actually watching “Westworld” matters. The pleasure of entertainment aside, the makers of the series have produced a powerful work of philosophy. It’s one thing to sit in a seminar and argue about what it would mean, morally, if robots were conscious. It’s quite another to witness the torments of such creatures, as portrayed by actors such as Evan Rachel Wood and Thandie Newton. You may still raise the question intellectually, but in your heart and your gut, you already know the answer.

Watching the show, you also discover how you feel about the people who rape, torture and kill these robots. We have no idea how many people would actually behave this way in a place like Westworld (the show implies that there's no shortage of such customers), but there is something repugnant about those who do. In this scenario, the robot hosts are the most human characters, and the humans who abuse them are the monsters.

Kant had odd views about animals, seeing them as mere things, devoid of moral value, but he insisted on their proper treatment because of the implications for how we treat one another: “For he who is cruel to animals becomes hard also in his dealings with men.” We could surely say the same for the treatment of lifelike robots. Even if we could be certain that they weren’t conscious and couldn’t really suffer, their torture would very likely harm the torturer and, ultimately, the other people in his life.

This may seem like an extreme version of the worry that many have about violent video games. It has long been speculated that enacting violence in a virtual world desensitizes people to violence in the real one. The evidence for such an effect turns out to be weak. In fact, as video games have become increasingly realistic, the rate of violent crime has dropped.