During the 20 months that Fisher-Price spent developing the innards and software of its latest animatronic Elmo, engineers gave the project the code name Elmo Live. And sure enough, they made him more animate than ever: He moves his mouth in time with the stories he tells, shivers when he gets scared, and has a fit when he sneezes.

When they were finally able to test the doll on children, they were struck by how immediately the kids blocked out all other stimuli in the room and began interacting with Elmo. "It was as if Elmo were part of their family," says Gina Sirard, Fisher-Price VP of marketing. "To a child, he really is alive."

So the code name stuck, and over the past few months legions of $60 Elmo Live dolls have joined families everywhere. Some are certainly doomed to join previous Elmos in a new pastime: robotic-toy torture. YouTube is full of videos of idiots dousing Elmo with gas, setting him on fire, and laughing as his red fur turns to charcoal and he writhes in a painful dance.

I've seen videos of the incineration of T.M.X. Elmo (short for Tickle Me Extreme); they made me feel vaguely uncomfortable. Part of me wanted to laugh—Elmo giggled absurdly through the whole ordeal—but I also felt sick about what was going on. Why? I hardly shed a tear when the printer in Office Space got smashed to bits. Slamming my refrigerator door never leaves me feeling guilty. Yet give something a couple of eyes and the hint of lifelike abilities and suddenly some ancient region of my brain starts firing off empathy signals. And I don't even like Elmo. How are kids who grow up with robots as companions going to handle this?

Robot designers and toymakers are starting to debate this question. With advanced robotics becoming cheaper and more commonplace, the challenge isn't how we learn to accept robots—but whether we should care when they're mistreated. And if we start caring about robot ethics, might we then go one insane step further and grant them rights?

First, the science: The brain is hardwired to assign humanlike qualities to anything that even somewhat resembles us. A 2003 study found that 12-month-olds would check to see what a football-shaped object was "looking at," even though it lacked eyes. All the researchers had to do was move the object as if it were an animal, and the infants would follow its "gaze." Adults? Same reaction.

The perennial concern about the rise of robots has been how to keep them from, well, killing us. Isaac Asimov came down from the mountaintop with his Three Laws of Robotics (to summarize: Robots shouldn't disobey or hurt humans or themselves). But what are the rules for the humans in this relationship? As technology develops animal-like sophistication, finding the thin metallic line between what's safe to treat as an object and what's not will be tricky. "It's going to be a tougher and tougher argument to say that technology doesn't deserve the same protection as animals," says Clifford Nass, a Stanford professor who directs a program called the Communication Between Humans and Interactive Media Lab. "One could say life is special—whatever that means. And so, either we get tougher on technology abuse or it undermines laws about abuse of animals."

It's already being considered overseas. In 2007, a South Korean politician declared that his country would be the first to draw up legal guidelines on how to treat robots; the UK has also looked into the area (though nothing substantial has come of it anywhere). "As our products become more aware, there are things you probably shouldn't do to them," says John Sosoka, CTO of Ugobe, which makes the eerily lifelike robot dinosaur Pleo (also tortured on Web video). "The point isn't whether it's an issue for the creature. It's what does it do to us."

We live in an age of anxiety—about the economy, the environment, terrorism. And now even about our toys, which are forcing us to question the boundaries of humanity and compassion. Back on Sesame Street, Elmo Live's creators have an answer: Keep soul-searching to a minimum and recognize that you're buying a product, pure and simple. "This is a toy," Fisher-Price's Sirard says. "There shouldn't be any laws about how you use your toys." Happy grilling, Elmo!

Senior writer Daniel Roth (daniel_roth@wired.com) profiles Comcast CEO Brian Roberts in this issue of Wired.
