
In a blow to would-be Terminators everywhere, a majority of Americans oppose robots that could autonomously choose to kill a human.

Of the 1,000 people surveyed by University of Massachusetts-Amherst researchers, 55 percent said they oppose autonomous weapons, with most of those answering “strongly opposed.” Almost 20 percent answered “not sure.” Responses were consistent across political affiliations, ages, genders, regions, and education and income levels; the one exception was service status, with 73 percent of active military personnel voicing disapproval. Alternative phrasings such as “stopping killer robots” and “banning fully autonomous weapons” garnered similar responses.

“People are scared by the idea of removing humans from the loop, not simply scared of the label,” Charli Carpenter, who led the survey, said in a release. Carpenter is an associate professor of political science who studies the ethical debate stirred by autonomous weapons.

Survey responses indicated that people worry about malfunctions, a robot’s lack of a moral conscience, human rights abuses, the inability to distinguish between targets and civilians, and losing control of the machines. Supporters most often cited the need to protect troops.

Lethal autonomous robots are not in use yet, but all the ingredients exist. Drones, for example, can already operate autonomously and deliver lethal force when instructed to.

A United Nations expert recently urged a preemptive ban on the weapons, arguing that they could make war more likely and violate accepted standards for how human life should be treated.

“While drones still have a ‘human in the loop’ who takes the decision to use lethal force, LARs have on-board computers that decide who should be targeted,” U.N. expert Christof Heyns told the Human Rights Council in May. “Their deployment may be unacceptable because no adequate system of legal accountability can be devised for the actions of machines.”