Today's military robots are ugly little buggers. The unmanned ground vehicles that assist our troops in Afghanistan typically look like miniature tanks, with nary a human feature to their name. They're certainly not what The Terminator and Short Circuit taught us to expect when we hear the term robo-soldiers.

But rest assured, robots that resemble their flesh-and-blood creators are on their way to the world's battlefields. According to a 2004 Darpa survey, US military officers believe that humanoid robots will begin filling out infantry units as early as 2025. Those android grunts will likely be descendants of Petman, a bipedal Boston Dynamics robot funded by Darpa that walks more gracefully than C-3PO.

And they may take aesthetic cues from Vecna Robotics' BEAR, a military prototype that features a rounded head studded with soulful Bette Davis eyes.

Yet despite our love of science fiction, this coming trend in robo-aesthetics is a bad idea. By anthropomorphizing their products, robot designers may unwittingly be encouraging needless bloodshed. Because, as recent research shows, the more human a robot looks, the more likely the Homo sapiens at its controls is to be tempted to make the droid go Rambo on its foes.

"Robots don't need to look like people to get a job done," says Leila Takayama, a research scientist at the robotics company Willow Garage. "Actually, sometimes it's better if they don't."

For the foreseeable future, at least, even the most advanced robots will require human operators, especially when lives are at stake—it will be a long time before the Pentagon trusts bots to fire at enemies of their own accord. As a result, we want those operators to understand the moral consequences of the instructions they pass along to their charges. They must have a sense that they—as the wizards behind the curtain—are actually engaged in deadly combat, not playing a videogame.

Victoria Groom, a robotics researcher at Stanford University who has studied this issue with Takayama, says that one of the best ways to do this is by promoting "self-extension"—that is, the feeling that a robot is a mere tool rather than an independent entity. And her research shows that the more utilitarian a robot looks, the more likely its operator is to self-extend into the machine.

In a 2008 experiment, for example, Groom and Takayama asked test subjects to complete tasks with a pair of robots built out of Lego Mindstorms pieces: one that looked humanoid and one that resembled a car. Participants who used the humanoid tended to give their robot credit for doing the assigned work; those with the car took all the credit themselves, much like a carpenter with a hammer. "Anthropomorphic form inhibits the tendency to extend the self into a robot," the researchers wrote, "as anthropomorphic robots are perceived to have a more unique identity than functional robots."

This conclusion is supported by another 2008 study, in which a German team performed fMRI scans on people who were playing games with one of four types of partners: a laptop, a functional robot, a humanoid robot, and an actual human. The scans revealed that the latter two partners elicited remarkably similar neural activity in the test subjects, even though those subjects knew that the humanoid was basically just a remote-controlled mannequin.

"The humanoid form is such a powerful social cue," Groom says. "If you see this humanoid shape, you're going to respond to it like it's a person."

That response is precisely what the military must discourage among the humans who will be directing tomorrow's robot army. Those weapons operators will need to understand that they, not their robots, bear the ultimate responsibility for what goes down on the battlefield. Robot designers can help foster that mindset by resisting the urge to anthropomorphize droids destined for service in combat zones. Make them look like killing machines, not friends.

That's not to say that having humanoid bots is always bad. Self-extension among robot operators may be desirable in combat but not necessarily in other grave situations. In search-and-rescue operations, for example, one of the biggest problems is operator stress—people find it incredibly taxing to sift through rubble remotely, with the monotony broken only by the ghoulish discovery of corpses or body parts. Humanoid robots would be ideal for such tasks; by registering as independent entities rather than extensions of the self, they could help operators feel less viscerally attached to the grim work at hand.

When there's the potential for violence, though, let's stick with ugly robots.

Brendan I. Koerner (brendan_koerner@wired.com) wrote about American manufacturing in issue 19.03.