Human rights groups have launched a campaign to put a stop to the development of what they call "killer robots"

A GLOBAL rights group has launched a campaign to ban Terminator-style "killer robots" amid fears over the rise of drone warfare.

Human Rights Watch said it was creating an international coalition to call for a global treaty that would impose a "pre-emptive and comprehensive ban" on artificially intelligent weapons before they are developed.

The New York-based group also warned of a possible "robotic arms race" if even one country took the step to allow such machines to enter service.

"Lethal armed robots that could target and kill without any human intervention should never be built," Steve Goose, arms division director at Human Rights Watch, said at the launch in London of the Campaign To Stop Killer Robots.

"A human should always be 'in-the-loop' when decisions are made on the battlefield.

"Killer robots would cross moral and legal boundaries, and should be rejected as repugnant to the public conscience."

The campaign includes several non-governmental organisations involved in previous successful efforts to ban anti-personnel landmines, cluster munitions, and blinding lasers.

Activists wheeled out a home-made robot outside the Houses of Parliament in London for the launch of the campaign.

The United States has led the way in military robots such as the unmanned drone aircraft that carry out attacks and surveillance in countries including Pakistan, Afghanistan and Yemen.

According to Britain's Bureau of Investigative Journalism, CIA drone attacks in Pakistan have killed up to 3,587 people since 2004, up to 884 of them civilians.

But these aircraft are controlled by human operators at ground bases and cannot kill without authorisation.

Recent technical advances will soon allow not only the US but also countries including China, Israel, Russia, and Britain to move towards fully autonomous weapons, Human Rights Watch warned.

Fully autonomous robots that decide for themselves when to fire could be developed within 20 to 30 years, or "even sooner," Human Rights Watch and the Harvard Law School said in a report in November on the same subject.