ELON Musk has claimed the race to develop artificial intelligence could spark World War Three.

The Tesla founder spoke about his fears after Vladimir Putin claimed that the nation which controls artificial intelligence will come to rule the world.

On Twitter, Musk wrote: “China, Russia — soon all countries with strong computer science.”

“Competition for AI superiority at national level most likely cause of WW3.”

The development of killer computers would give any nation a clear edge over its competitors.

Artificial intelligence could be used to command fleets of drones or battalions of killer robots, while responding to threats at speeds much faster than any human could manage, The Sun reports.

But there’s a risk that a super-smart AI could go rogue and launch genocidal attacks without being constrained by human conscience and empathy.

Musk said that it may actually be the AI itself that launches the next World War.

He added: “[WW3] May be initiated not by the country leaders, but one of the AI’s, if it decides that a pre-emptive strike is most probable path to victory.”

Nick Bostrom, head of the University of Oxford’s Future Of Humanity Institute, recently claimed that we may have just 50 years to save ourselves from artificial intelligence.

Competition to build a machine that’s as clever as humans will be fierce in the coming decades, with considerable rewards on offer for the nation that manages to pull off the historic feat of achieving “machine intelligence”.

But the scramble to create this silicon-powered mind could lead to mistakes with disastrous consequences, according to The Sun.

“There is a control problem,” Bostrom said.

“If you have a very tight tech race to get there first, whoever invests in safety could lose the race.

“This could exacerbate the risks from out of control AI.”

Once computers are as intelligent as humans it is likely there will be an “intelligence explosion” which sees the machines reach super-intelligence in a scarily short space of time — this moment is often referred to as the singularity.

This process could begin in the next 50 years and once it’s kicked off there may not be any way to stop it.

To illustrate what might happen, Bostrom gave the chilling example of a machine that’s built to make paperclips.

This machine may decide that humans are standing in the way of its mission and kill us all to enhance its own ability to churn out paperclips.

The academic said it was a “conservative assumption” that super-smart computers would be able to control “actuators”, meaning physical machines such as armies of robots that do its bidding.

This story was republished from The Sun with permission.