Fully autonomous weapons, or "killer robots," present a legal and ethical quagmire and must be banned before they can be further developed, a new human rights report published Thursday urges ahead of next week's United Nations meeting on lethal weapons.

The report, titled Mind the Gap: The Lack of Accountability for Killer Robots, was jointly published by Human Rights Watch and Harvard Law School's International Human Rights Clinic and outlines the "serious moral and legal concerns" presented by the weapons, which would "possess the ability to select and engage their targets without meaningful human control."

Although fully autonomous weapons do not yet exist, their "precursors" are already in use, such as the Iron Dome in Israel and the Phalanx CIWS in the U.S., the report states.

Under current law, the makers and users of killer robots could escape liability for unlawful deaths and injuries if development of the weapons is allowed to proceed. Permitting weapons that operate without human control to make decisions about the use of lethal force could lead to violations of international law and make it difficult to hold anyone accountable for those crimes. Moreover, establishing civil liability would be "virtually impossible, at least in the United States," the report found.

"No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party," lead author and HRW Arms Division researcher Bonnie Docherty said in a press release on Thursday. "The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."

The UN will discuss killer robots alongside more conventional arms at its upcoming meeting on inhumane weapons in Geneva, Switzerland from April 13-17. In the past, the UN has used the gathering to preemptively ban military tools such as blinding lasers.

The report urges the UN to take similar preemptive action on fully autonomous weapons, stating:

In order to preempt the accountability gap that would arise if fully autonomous weapons were manufactured and deployed, Human Rights Watch and Harvard Law School's International Human Rights Clinic (IHRC) recommend that states:

- Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.

- Adopt national laws and policies that prohibit the development, production, and use of fully autonomous weapons.

Docherty concluded, "The lack of accountability adds to the legal, moral, and technological case against fully autonomous weapons and bolsters the call for a preemptive ban."