Who needs the freaky precogs of Minority Report to predict if someone's likely to commit murder when you have an algorithm that can do it for you?

New crime-prediction software used in Maryland and Pennsylvania, and soon to be rolled out in the nation's capital too, promises to reduce the homicide rate by predicting which prison parolees are likely to commit murder, so that those parolees can receive more stringent supervision.

Currently in use in Baltimore and Philadelphia, the software aims to replace the judgments parole officers already make based on a parolee's criminal record.

Richard Berk, a criminologist at the University of Pennsylvania who developed the algorithm, claims it will reduce the murder rate and other crime, and could in the future help courts set bail amounts and sentences.

"When a person goes on probation or parole they are supervised by an officer. The question that officer has to answer is 'what level of supervision do you provide?'" Berk told ABC News. The software simply replaces that kind of ad hoc decision-making that officers already do, he says.

To create the software, researchers assembled a dataset of more than 60,000 crimes, including homicides, then wrote an algorithm to identify which of the offenders behind those crimes were most likely to commit murder when paroled or put on probation. Berk claims the software can identify eight future murderers out of 100.

The software parses about two dozen variables, including criminal record and geographic location. The type of crime and the age at which it was committed, however, turned out to be two of the most predictive variables.

"People assume that if someone murdered then they will murder in the future," Berk told the news outlet. "But what really matters is what that person did as a young individual. If they committed armed robbery at age 14 that's a good predictor. If they committed the same crime at age 30, that doesn't predict very much."
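Berk's actual model weighs about two dozen variables; the details are not public. As a hedged illustration only, the interaction he describes between offense type and age at commission might be sketched like this, with entirely made-up weights, thresholds, and function names:

```python
# Toy sketch loosely inspired by the variables the article says were most
# predictive (type of crime and age at which it was committed). All weights,
# cutoffs, and names here are hypothetical, not Berk's actual model.

def risk_score(offense_type: str, age_at_offense: int) -> float:
    """Return a crude 0-1 risk score from two hypothetical features."""
    violent = {"armed robbery", "homicide", "assault"}
    base = 0.5 if offense_type in violent else 0.1
    # Per Berk's armed-robbery example, the same offense counts for much
    # more when committed young: full weight at 14, tapering to zero by 30.
    youth_factor = max(0.0, (30 - age_at_offense) / 16)
    return min(1.0, base + base * youth_factor)

def supervision_level(score: float) -> str:
    """Map a risk score onto the parole officer's supervision decision."""
    if score >= 0.8:
        return "intensive"
    if score >= 0.4:
        return "standard"
    return "minimal"

# Armed robbery at 14 scores far higher than the same crime at 30.
young = risk_score("armed robbery", 14)
older = risk_score("armed robbery", 30)
```

In this sketch, `risk_score("armed robbery", 14)` maxes out and triggers intensive supervision, while the same offense at 30 lands in the standard tier, mirroring Berk's point that age at commission, not the offense alone, drives the prediction.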

Shawn Bushway, a professor of criminal justice at the State University of New York at Albany, told ABC that inmate-rights advocates might view the use of an algorithm to increase supervision of a parolee as a form of harassment, especially given the software's inevitable false positives. He said it could result in "punishing people who, most likely, will not commit a crime in the future."