Israeli software is helping British police forces across the UK comb through vast amounts of data on suspects’ mobile phones, it has been revealed.

The technology, which allows officers to interpret images, visualise social networks, match faces and analyse communication trends, comes from Cellebrite, an Israeli firm now owned by a Japanese company.

The firm is working with a dozen police forces across the UK, but for the most part commercial non-disclosure agreements mean the forces trialling the firm’s Artificial Intelligence (AI) software remain unknown. The Met Police and Staffordshire Police, however, did confirm that they were using Cellebrite technology.


The firm has hit the headlines several times in recent years, most recently in reports that it enabled the FBI to bypass mobile phone locks and encryption in high-profile legal cases, and for developing software that allows the police to see whether a driver involved in an accident was using their mobile phone at the time.

Police forces in Britain are hard-pressed to extract and analyse data from suspects’ mobile phones, with backlogs of up to six months reported, so officers are turning to AI algorithms to help filter through the mass of information.

Prosecutors and officers have seen a number of rape trials collapse in recent months after defence lawyers complained that important digital evidence had been withheld. The new automated scanning software may reduce some of the human error.

The problem is that officers are typically presented with thousands of PDF pages of extracted mobile phone data, but the firm’s Analytics Enterprise system allows for intelligent sifting and filtering.

It visualises a suspect’s social network, feeds in data from multiple phones to analyse links and overlaps, and uses geo-tagging information to highlight when two people were in the same place at the same time.

It also uses “neural network-based machine learning algorithms” to “automatically detect previously unknown images and video clips related to key categories, such as child exploitation, weapons, money, drugs, nudity and more”.

The firm advertises its programme as allowing officers to “see the whole picture and find the connections in the case more quickly by eliminating the manual review of media files”. Privacy campaigners, however, say the software should ring alarm bells.

“Powerful tools like this could mean that rape victims are doubly victimised by unnecessary incursions into their privacy, or that bias is built into decisions about what is relevant and what is not,” said Liberty’s Corey Stoughton.

“The home secretary must stop allowing police forces to ‘trial’ potentially harmful technologies without first allowing parliament and the public a say.”