The US Army is following the UK armed forces in equipping its Special Forces with Black Hornet drones, a new kind of flying robot that has military experts worried about the implications for the future of warfare.

US Special Forces are testing PD-100 Black Hornet drones, one of a new breed of small drones that can fly autonomously in many different environments and, experts warn, present scientific, technical and ethical challenges for military and civilian applications.

The Black Hornet drone weighs 18 grams and carries a regular or thermal camera; it has a maximum flight time of around 25 minutes, a top speed of 10 m/s and a range of more than 1.5 km. Its camera relays video and still images to a handheld control terminal.

The incredible piece of equipment is made by Norwegian firm Prox Dynamics, which calls the PD-100 a 'Personal Reconnaissance System' [PRS] that "provides law enforcement agencies with a game-changing pocket-sized ISR capability that provides instant situational awareness."

The tiny drone has been in operational use for three years; the UK military first began using the PD-100 in Afghanistan in 2012. In February 2013 the UK Ministry of Defence revealed its plan to purchase 160 of the robots, in a contract worth £20 million [$31 million].

"Using this is no different to playing on an Xbox, playing on a PS3 with the control," said Sergeant Carl Boyd of the British Army, when demonstrating the device.

A commentary published by the journal Nature this week examined the challenges and impact of such devices, which according to the authors present "an important ethical decision: whether to support or oppose the development of lethal autonomous weapons systems [LAWS]."

"Technologies have reached a point at which the deployment of such systems is — practically if not legally — feasible within years, not decades. The stakes are high: LAWS have been described as the third revolution in warfare, after gunpowder and nuclear arms," writes Stuart Russell, Professor of computer science at the University of California, Berkeley.

"Autonomous weapons systems select and engage targets without human intervention; they become lethal when those targets include humans."

"The overriding concern should be the probable endpoint of this technological trajectory. The capabilities of autonomous weapons will be limited more by the laws of physics — for example, by constraints on range, speed and payload — than by any deficiencies in the AI [artificial intelligence] systems that control them."