Every field undergoes its own data revolution. Retailers like Wal-Mart now know to stock Pop-Tarts during hurricanes, because that’s when toaster pastries sell best. Baseball teams know that a player’s value has more to do with his on-base percentage than home runs. Political science tells us that campaign speeches don’t affect electoral outcomes as much as economic conditions.

Police departments have long been in the data game, with such efforts as CompStat. But there’s a new twist: They’re not just using statistics to assess the past. Now they’re trying to predict the future. In November 2009, the National Institute of Justice held a symposium on “predictive policing,” to figure out the best ways to use statistical data to predict micro-trends in crime. The Los Angeles Police Department then won a $3 million grant from the Justice Department to finance a trial run of predictive methodology. (The grant, like the rest of the 2011 federal budget, is pending congressional approval.) Other police departments are giving predictive policing a shot, too, from Santa Cruz, which recruited a Santa Clara University professor to help rejigger its patrol patterns, to Chicago, which has created a new “criminal forecasting unit” to predict crime before it happens.

The phrase “predictive policing” may conjure images of Tom Cruise in Minority Report, the 2002 movie in which he arrests people for “pre-crime.” Civil libertarians can rest easy. “This is not about predicting the behavior of a specific individual,” says Jeffrey Brantingham, an anthropology professor at UCLA who works on the research team that’s partnering with the LAPD. Rather, predictive policing deals with crime in the aggregate. “It’s about predicting the risk of certain types of crimes in time and space,” he says.

Predictive policing is based on the idea that some crime is random—but a lot isn’t. For example, home burglaries are relatively predictable. When a house gets robbed, the likelihood of that house or houses near it getting robbed again spikes in the following days. Most people expect the exact opposite, figuring that if lightning strikes once, it won’t strike again. “This type of lightning does strike more than once,” says Brantingham. Other crimes, like murder or rape, are harder to predict. They’re rarer, for one thing, and the crime scene isn’t always stationary, like a house. But they do tend to follow the same general pattern. If one gang member shoots another, for example, the likelihood of reprisal goes up.

In a paper slated for publication in the Journal of the American Statistical Association, the team of UCLA researchers working with the LAPD compares this kind of repetitive crime to earthquakes. The initial crime is the first tremor. Subsequent crimes follow like aftershocks. We don’t know exactly where or when the after-crimes will occur—or if they’ll occur at all. But we can create a predictive model based on probabilities. Police departments can then feed real-time crime data into these models and organize patrols based on the likelihood of certain crimes occurring in certain places. The LAPD plans to test this method in the coming months, pending federal funding, possibly by examining violent crime in South Los Angeles and burglaries in the San Fernando Valley, according to Capt. Sean Malinowski, who is heading up the project.
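The earthquake analogy corresponds to what statisticians call a self-exciting point process: a baseline rate of events, plus a temporary boost after each event that decays over time. The sketch below is only an illustration of that general idea—the parameter values are invented, not taken from the UCLA model.

```python
import math

def crime_intensity(t, past_events, mu=0.5, alpha=0.3, beta=0.8):
    """Conditional intensity of a self-exciting process: a constant
    background rate `mu`, plus a boost `alpha * exp(-beta * (t - ti))`
    from each past event at time `ti` that fades as time passes.
    Parameter values here are illustrative, not fitted to real data."""
    boost = sum(alpha * math.exp(-beta * (t - ti))
                for ti in past_events if ti < t)
    return mu + boost

# A burglary at day 0 raises near-term risk, which decays back
# toward the background rate over the following days.
events = [0.0]
risk_soon = crime_intensity(1.0, events)    # shortly after: elevated
risk_later = crime_intensity(10.0, events)  # much later: near background
```

In a real deployment, the background rate would also vary over space, and the parameters would be estimated from historical crime records rather than chosen by hand.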

That doesn’t mean police can prevent every kind of crime—only the more predictable kinds, like burglary or auto theft. (Researchers call crimes that presage other crimes “self-exciting.”) And even then, some robberies could be truly random. But predictive policing could reduce crime on the margins, according to the authors of the UCLA paper. They calculate that compared to “hotspot” policing—which bases patrols on past crime trends, rather than current incidents—predictive policing could raise the rate of crime that’s predicted (as opposed to unforeseen, and therefore unprevented) by several percentage points.

That may not sound like much. But these days, with national crime rates at historic lows, reductions are on the margins. “The low-hanging fruit has already been taken,” says Brantingham. Predictive policing done right also allocates resources more efficiently, which can’t hurt at a time when departments are getting lopped in half due to tight budgets.

Skepticism is warranted. For example, how is all this different from CompStat, the computer statistics program pioneered by Bill Bratton when he was police commissioner in New York City? The big difference is that CompStat is more retrospective than prospective. It collects crime numbers from previous weeks or months and uses them to evaluate a police department’s efficiency. Cops can then use those trends to inform future patrolling, but they’re still working from old data rather than current data. Predictive policing collects data in real time and uses it to map probable hotspots in the near future.

But isn’t a lot of this stuff intuitive? If a crime occurs on a particular block of Compton, can’t the LAPD just keep a closer eye on that area in the days after the crime? Sure, says Brantingham, but intuition can take a police officer only so far. In a city as large and complex as Los Angeles, it’s hard to perform predictive policing by gut alone. Statistical models may simply confirm police intuition 85 percent or 90 percent of the time. “It’s in the remaining 10 or 15 percent where police intuition may not be quite as accurate,” says Brantingham. Malinowski calls the data “another tool in the toolbox.”

Data-driven law enforcement shows that the criminal mind is not the dark, complex, and ultimately unknowable thing of Hollywood films. Instead, it’s depressingly typical—driven by supply, demand, cost, and opportunity. “We have this perception that criminals are a breed apart, psychologically and behaviorally,” says Brantingham. “That’s not the case.”

It’s a common saying that when someone gets killed, he becomes a statistic. So does the killer.
