Robotic weapons have become so advanced that top military experts in the US fear the plot of the sci-fi film 'Terminator' could come true.

Huge technological leaps forward in drones, artificial intelligence and autonomous weapon systems must be addressed before humanity is driven to extinction by mechanical overlords like in the 1984 Arnold Schwarzenegger classic, according to Pentagon chiefs.

Air Force General Paul Selva, the Vice Chairman of the Joint Chiefs of Staff at the US Defense Department, said so-called thinking weapons could lead to "robotic systems to do lethal harm... a Terminator without a conscience."

When asked about robotic weapons able to make their own decisions, he said: "Our job is to defeat the enemy" but "it is governed by law and by convention."

He said the military insists on keeping humans in the decision-making process to "inflict violence on the enemy".

"That ethical boundary is the one we’ve draw a pretty fine line on. It’s one we must consider in developing these new weapons," he added.

Selva said the Pentagon must reach out to artificial intelligence tech firms that are not necessarily "military-oriented" to develop new systems of command and leadership models, reports US Naval Institute News.

In June, experts speaking at a Royal Academy of Engineering event in London warned that humans need to be educated about interacting with robots to avoid creating potentially dangerous situations.

The experts want to see artificially intelligent robots making decisions that "fit some form of ethical or moral guidelines".

Professor Alan Winfield, theme leader in swarm robotics at the Bristol Robotics Laboratory, said we need to "innovate conscientiously" because "if robots are not safe, people won't trust them".