Pioneers in robotics and artificial intelligence have called on the Australian and Canadian governments to ban killer robots ahead of a United Nations meeting on weapons this month.

Leading researchers from the countries urged prime ministers Malcolm Turnbull and Justin Trudeau respectively to take a stand against autonomous weapons, arguing that their development and use crossed a “clear moral line.”

Artificial intelligence can be used to make weapons that operate without human oversight, giving them the ability to loiter in an area and make life or death decisions without approval from a military controller.

“If developed, they will permit armed conflict to be fought at a scale greater than ever before, and at timescales faster than humans can comprehend,” the letter to Turnbull states. “The deadly consequence of this is that machines, not people, will determine who lives and dies.”

The letters are signed by hundreds of specialists including Toby Walsh, an AI professor at the University of New South Wales in Sydney, Geoffrey Hinton, an AI pioneer who runs Google’s Brain Team in Toronto, and Ian Kerr, professor of ethics, law and technology at the University of Ottawa.

In August, many of the world’s top robotics and AI scientists called on the United Nations to ban killer robots and so halt the arms race now underway to build autonomous weapons. The race threatens to usher in a “third revolution in warfare” after gunpowder and nuclear weapons, the researchers warned in an open letter.

The military is one of the largest funders of AI research, and while the technology could be used to make mine-clearing robots or unmanned vehicles that deliver supplies, fully-automated offensive weapons would effectively become weapons of mass destruction, the scientists state.

“One programmer would be able to control whole armies of weapons,” said Walsh. “They are the perfect weapons to suppress a civilian population. Unlike humans, who have to be persuaded to commit atrocities, these will be cold, calculating weapons that will do whatever they are programmed to do.”

Arms manufacturers have already built highly autonomous weapons for the military, from robotic sentries and autonomous tanks to flying drones that can track and strike targets. The systems are designed to operate under human supervision. Compared with nuclear weapons, AI-powered weapons are likely to be cheap and simple to make, meaning they could easily find their way onto weapons black markets.

The letters to the Australian and Canadian governments coincide with this month’s UN conference on the Convention on Certain Conventional Weapons, which aims to restrict or prohibit weapons that are excessively injurious or indiscriminate.

“Giving machines the right to make life or death decisions takes us down a horrible road,” Walsh added. “There are some technologies we should keep out of the battlefield.”