Cops are already using computers to stop crimes before they happen, academics have warned.

In a major piece of research called "Artificial Intelligence and Life in 2030", researchers from Stanford University said "predictive policing" techniques would become commonplace in the next 15 years.

Samantha Morton starred in Minority Report, playing a woman who had pre-cognitive abilities and could predict crimes before they happened

The academics discussed the crime-fighting implications of "machine learning", which allows computers to learn for themselves and then solve problems just like a human.

This technique will have a major effect on transport, healthcare and education, potentially bringing massive benefits as well as putting millions of jobs at risk.

But in the hands of cops, AI could reshape society by giving law enforcement an "overbearing or pervasive" presence.

"Cities already have begun to deploy AI technologies for public safety and security," a team of academics wrote.

"By 2030, the typical North American city will rely heavily upon them.

"These include cameras for surveillance that can detect anomalies pointing to a possible crime, drones, and predictive policing applications."

Machine learning and AI are already used to combat white collar crime such as fraud. They are also used to automatically scan social media to highlight people at risk of being radicalised by ISIS.

Yet the range of crimes which could be stopped by AI is likely to grow as the technology becomes more advanced.

AI is also being used to tackle cyber crime and foil terrorists' online recruitment drives Credit: Getty Images

The academics continued: "Law enforcement agencies are increasingly interested in trying to detect plans for disruptive events from social media, and also to monitor activity at large gatherings of people to analyse security.

"There is significant work on crowd simulations to determine how crowds can be controlled.

"At the same time, legitimate concerns have been raised about the potential for law enforcement agencies to overreach and use such tools to violate people’s privacy."

In the film Minority Report, a group of psychics called "precogs" were able to predict crimes by reading people's intentions and stopping them.

But real-life AI will work differently, identifying patterns in past crimes or learning the warning signs that someone is about to commit an offence.

For instance, if cameras spot a person lingering down a dark alley, a computer could conclude a mugging is about to take place and scramble cops to stop the would-be mugger before he strikes.
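The trend-spotting the researchers describe can be illustrated with a toy sketch: count where and when past incidents happened, then flag the location with the most reports. Everything here is invented for illustration; real predictive policing systems use far richer data and models.

```python
from collections import Counter

# Hypothetical past incident reports, each tagged with a location
# and a time of day. All of this data is made up for illustration.
past_incidents = [
    ("alley_3", "night"), ("alley_3", "night"), ("market_sq", "day"),
    ("alley_3", "night"), ("station_rd", "night"), ("market_sq", "day"),
]

def predict_hotspot(incidents, time_of_day):
    """Rank locations by how often crimes were reported there at this time."""
    counts = Counter(place for place, t in incidents if t == time_of_day)
    return counts.most_common(1)[0][0] if counts else None

print(predict_hotspot(past_incidents, "night"))  # alley_3
```

Even this crude frequency count shows the concern the academics raise: the prediction simply reflects whatever is in the historical data, biases included.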

Drones could be fitted with cameras and AI tech which allows them to predict crimes before they occur Credit: AP:Associated Press

"Machine learning significantly enhances the ability to predict where and when crimes are more likely to happen and who may commit them," the Stanford University team wrote.

The experts were keen to emphasise the positive points of artificial intelligence, which could actually help to prevent miscarriages of justice and stop cops abusing their power.

"As dramatised in the movie Minority Report, predictive policing tools raise the spectre of innocent people being unjustifiably targeted," the academics continued.

"But well-deployed AI prediction tools have the potential to actually remove or reduce human bias, rather than reinforcing it, and research and resources should be directed toward ensuring this effect."

They added: "If society approaches these technologies primarily with fear and suspicion, missteps that slow AI’s development or drive it underground will result, impeding important work on ensuring the safety and reliability of AI technologies. On the other hand, if society approaches AI with a more open mind, the technologies emerging from the field could profoundly transform society for the better in the coming decades."