Policing the Future

In the aftermath of Ferguson, St. Louis cops embrace crime-predicting software

By Maurice Chammah, with additional reporting by Mark Hansen
Photography by Whitney Curtis

Just over a year after Michael Brown’s death became a focal point for a national debate about policing and race, Ferguson and nearby St. Louis suburbs have returned to what looks, from the outside, like a kind of normalcy. Near the Canfield Green apartments, where Brown was shot by police officer Darren Wilson, a sign reading "Hands Up Don’t Shoot" and a mountain of teddy bears have been cleared away. The McDonald’s on West Florissant Avenue, where protesters nursed rubber bullet wounds and escaped tear gas, is now just another McDonald’s.

Half a mile down the road in the city of Jennings, between the China King restaurant and a Cricket cell phone outlet, sits an empty room that the St. Louis County Police Department keeps as a substation. During the protests, it was a war room, where law enforcement leaders planned their responses to the chaos outside.

One day last December, a few Jennings police officers flicked on the substation’s fluorescent lights and gathered around a big table to eat sandwiches. The conversation drifted between the afternoon shift’s mundane roll of stops, searches, and arrests, and the day’s main excitement: the officers were trying out a new software program called HunchLab, which crunches vast amounts of data to help predict where crime will happen next.

The conversation also turned to the grand anxieties of post-Ferguson policing. "Nobody wants to be the next Darren Wilson," Officer Trevor Voss told me. They didn’t personally know Wilson. Police jurisdiction in St. Louis is notoriously labyrinthine and includes dozens of small, local municipal agencies like the Ferguson Police Department, where Wilson worked — munis, the officers call them — and the St. Louis County Police Department, which patrols areas not covered by the munis and helps with "resource intense events," like the protests in Ferguson. The munis have been the targets of severe criticism; in the aftermath of 2014’s protests, Ferguson police were accused by the federal Department of Justice of being racially discriminatory and poorly trained, more concerned with handing out tickets to fund municipal coffers than with public safety.

The officers in Jennings work for the St. Louis County Police Department; in 2014, their colleagues appeared on national TV, pointing sniper rifles at protesters from armored trucks. Since then, the agency has also been called out by the Justice Department for, among other things, its lack of engagement with the community.

Still, the county police enjoy a better local reputation than the munis. Over the last five years, Jennings precinct commander Jeff Fuesting has tried to improve relations between officers — nearly all white — and residents — nearly all black — by going door to door for "Walk and Talks." Fuesting had expressed interest in predictive policing years before, so when the department heads brought in HunchLab, they asked his precinct to roll it out first. They believed that data could help their officers police better and more objectively. By identifying and aggressively patrolling "hot spots," as determined by the software, the police wanted to deter crime before it ever happened.

HunchLab, produced by Philadelphia-based startup Azavea, represents the newest iteration of predictive policing, a method of analyzing crime data and identifying patterns that may repeat into the future. HunchLab primarily surveys past crimes, but also digs into dozens of other factors like population density; census data; the locations of bars, churches, schools, and transportation hubs; schedules for home games — even moon phases. Some of the correlations it uncovers are obvious, like less crime on cold days. Others are more mysterious: rates of aggravated assault in Chicago have decreased on windier days, while cars in Philadelphia were stolen more often when parked near schools.

At the same time, a growing chorus of activists and academics worry that the reliance on data is a sign that police departments have not adequately heeded the lessons of Ferguson. Kade Crockford, the director of the Technology for Liberty program at the Massachusetts ACLU, says that predictive policing is based on "data from a society that has not reckoned with its past," adding "a veneer of technological authority" to policing practices that still disproportionately target young black men. In other cities, some police departments are even moving toward predicting which people, rather than which places, are most crime-prone. "At a time when communities are crying out for justice," Crockford told me, "I never heard anyone in one of these communities say, ‘I think police need to use more computers!’"

Predicting crime has always been part of police work; any beat cop can tell you that a particularly dark street corner is vulnerable to carjackers, or a large parking lot offers anonymity for drug dealers. Scholars have been mapping crime since the 1800s, but during New York City’s crime spike in the 1990s, police officers started doing so systematically. Most notable among them was Jack Maple, a quick-talking, up-from-the-bottom transit cop who wore double-breasted suits, homburg hats, and two-tone shoes and has become a near-mythic figure in police circles. At the NYPD’s Manhattan headquarters, Maple would stretch out butcher paper across 55 feet of wall space. "I called them the Charts of the Future," he once told an interviewer. "I mapped every train station in New York City and every train. Then I used crayons to mark every violent crime, robbery, and grand larceny that occurred."

Maple’s boss, Police Commissioner Bill Bratton, sent officers to patrol the areas Maple marked up. The process evolved into an entire system of police management called CompStat, which uses data to hold individual precinct commanders accountable for the crime levels in their areas. In varying forms, "hot-spot policing" has spread throughout the nation’s police departments. Bratton calls it "computerized fishing." "Cops-on-dots," as it’s sometimes known, has often been associated with Bratton’s other major legacy, "Broken Windows," in which police target low-level offenses like graffiti and public drinking, creating a sense of public order that is believed to deter more serious crimes. Such tactics have been credited with helping bring down crime rates, but they have also contributed to the aggressive targeting — and stopping and searching — of black people, fostering resentment of police in many communities.

St. Louis officials had been using data to send police to patrol hot spots since 2009; today the city holds weekly meetings for commanders to discuss why certain crimes keep hitting certain places, and how to address it. When one precinct captain noticed a lot of robberies of appliances from houses under construction, officers were instructed to keep track of building schedules. In agencies across the country, the more commanders looked at the data, the more timely their responses to that data could be, and crime analysis started edging toward real time.

The dream was to go beyond the present. Throughout the criminal justice system, a faith in data’s ability to improve upon human judgment has led judges, prosecutors, and other officials in recent years to embrace tools that address the future; many use "risk assessments" of defendants — which involve questionnaires about demographics, family, and personal history — in sentencing decisions. The White House has asked Silicon Valley companies if they can develop algorithms to predict which people are likely to become "radicalized."

In the summer of 2014, a couple of months before Ferguson erupted, St. Louis County Police Chief Jon Belmar returned from a conference of police leaders in Boston, where he had been impressed by presentations from mathematicians and data analysts. He told his aide, Sgt. Colby Dolly, that he wanted their department to join dozens of cities already using predictive policing software. As Dolly studied the predictive policing market, he found it was crowded with competitors. Since 2009, the National Institute of Justice had been funding research into crime prediction, transforming the field into big business. IBM, Hitachi, and Lexis had all begun to offer ways to predict crime through data.

The leader in the field is PredPol, a company that grew out of a team of researchers and officers working under Bratton during the chief’s mid-2000s stint in Los Angeles. PredPol’s algorithms digest years of data on crime locations, times, and types, spitting out the spots most likely to be hit by crime again. After using PredPol for four months, police in the Foothill Division in the San Fernando Valley claimed that property crime dropped 13 percent, while in the rest of the city, it rose by 0.4 percent. PredPol has received millions in venture capital funding and is now used by more than 50 police agencies in the US and UK.

But Dolly was attracted by Azavea’s ability to analyze the impact of businesses, churches, and weather patterns on criminal activity. It was also cheaper: Azavea quoted around $50,000 for a year of HunchLab, while PredPol was asking for roughly $200,000.

Azavea’s employees have a Silicon Valley ebullience — their website mentions "ping pong tournaments, team runs, hackathons," and "chess matches over lunch" — but they do not share the tech industry’s talk of "disruption." Their rhetoric is civic-minded; the company’s other projects include tools to analyze legislative districts, as well as an app that helps city residents map the locations of trees in order to study their environmental impact.

As predictive policing has spread, researchers and police officers have begun exploring how it might contribute to a version of policing that downplays patrolling — as well as stopping, questioning, and frisking — and focuses more on root causes of particular crimes. Rutgers University researchers specializing in "risk terrain modeling" have been using analysis similar to HunchLab to work with police on "intervention strategies." In one Northeast city, they have enlisted city officials to board up vacant properties linked to higher rates of violent crime, and to advertise after-school programming to kids who tend to gather near bodegas in high-risk areas.

Dolly was not opposed to examining and addressing the causes of crime, but the department was still focused on patrolling. He hoped using HunchLab might improve relations with the community by reducing the frequency with which police had to aggressively sweep an area in the wake of a crime. "You can only go so far in enforcing or arresting your way out of crime issues," he said. "This is a way to combat crime that should have minimal impact on the community." In order to sidestep concerns about racially disproportionate policing, Dolly asked HunchLab to only predict the kinds of serious felonies that result in 911 calls, and not low-level crimes like drug possession. He asked the analysts to produce, for every patrol area — no matter how wealthy or poor, black or white — two boxes showing the spots at the highest risk of crime for every 8-hour shift. But Dolly also recognized the fundamental limitation of the tool — it was "telling you where to go," he said. "It’s not telling you what to do."

A few hours before police officers in Jennings started their afternoon patrol, Dolly sat down at his computer at police headquarters. He logged into the HunchLab website and pulled up a map. The sprawling metropolis was covered in little bright dots. He clicked to zoom in, and the dots grew into transparent boxes, each covering a space roughly half the size of a city block, and each tinted green, orange, red, purple, blue, pink, or yellow. The colors indicated which type of crime was most likely to hit that box: green for larceny, orange for gun crimes, red for aggravated assault. As Dolly zoomed in on Jennings, he saw two boxes tinted green to indicate a high risk of larceny. He knew this area was one of Jennings’ only commercial districts, so of course there would be a lot of shoplifting. As he panned toward the residential neighborhoods nearby, however, he saw red and orange boxes in areas that looked fairly random. "I’ve been doing police work 16 years," he said, "and I don’t think you’d be able to isolate locations like this."

A few hours later, Thomas Keener arrived for his afternoon shift, checked his gun, got into his squad car, and pulled up the same map. Ten hours a day, four days a week, Keener’s primary job is to answer 911 calls and provide backup to other officers. When calls don’t come in, he patrols. Keener, 27, grew up in southern Missouri and graduated from the police academy six years ago. He is unfailingly polite. While his peers wear short sleeves, he chooses a long-sleeve khaki uniform and a dark brown tie, a formal get-up that, coupled with his buzzed hair, accentuates his boyishness. When Keener began his shift, he headed toward the boxes HunchLab deemed to be high-risk. Like Dolly, he immediately registered that a green larceny box was over an area that contained a couple of dollar stores where he has caught people running out with stolen goods. He pointed out common escape routes. "See how it’s easy to disappear over there?" Even without HunchLab, he would have probably gone to the area. In other cities where HunchLab has been used, police officers are often unsurprised by the locations of the boxes — police in Lincoln, Nebraska, started experimenting with the software in 2014 but have found it mostly tells them what they already know. "When I look at the HunchLab maps," said former police chief Tom Casady, "I say, ‘Yep, it got that right!’"

The shift rolled on, and Keener got a series of calls: to help a man who had overdosed, to assist with an arrest at the police station, to help look for some young men in hoodies suspected of a burglary. The day was proving to be a safe one. "I’m going to choose to credit the patrols with that," Keener said. Driving through Jennings, it was clear Keener already had a predictive map stored in his brain. He pointed out nondescript yards and houses where he had been called to the scene of homicides, burglaries, and gang shootings. He continued through particular blocks of residential neighborhoods where HunchLab had placed red and orange boxes to indicate a risk of aggravated assault and gun crime. The streets were lined with crumbling little brick houses. In a few yards, signs reading "We Must Stop Killing Each Other" had been stuck into the dirt.

As Keener drove through the bottom left corner of a HunchLab box — red to indicate a high risk of aggravated assaults — he noticed a white Chevy Impala with a dark window tint, dark enough to merit a traffic ticket. Keener gunned his motor and flashed his lights. The car slowed to a stop, and Keener walked up to the window. Leaning down, he caught a whiff of marijuana. The young man was black and looked to be in his 20s, with a baseball cap, grey sweatpants, and a tattoo that crept out of his shirt. Keener said, politely but firmly, "I smell what smells like weed to me." The man said he smoked earlier, but that there was "nothing in the car." Keener decided the smell gave him probable cause for a search. He told the young man to step out, frisked him, and asked him if he had anything "I should know about." The man said he had a gun. Keener found a black Glock 23 pistol, .40 caliber, under the seat. He took it back to his car, noticing it had no magazine — just a bullet in the pipe. "Pretty big," Keener said, turning the gun in his hands.

Another police car rolled up behind Keener — a standard call for backup had gone out. Standing between his own car and Keener’s car, the young man stared at the ground, clearly annoyed but also trying not to appear annoyed. When Keener asked about the smell again, he said, "You ask me if I smoke. I smoke, man!" While the man paced, Keener looked up his name and the gun’s ID number. It didn’t turn up as stolen. It is legal to have a gun in your car in Missouri if you’re over 18 and not a convicted felon. The man had been arrested, but never convicted, for stealing a gun. Keener let him go with a ticket for the window tint.

As the Impala drove off, Keener looked back at the HunchLab map. The stop itself had gone down just outside the aggravated assault square. "He could have been going to shoot somebody," he said, shrugging. "Or not." That HunchLab had sent him to a location where he may or may not have averted illegal activity was, for the moment, tangential; it would not be clear for months whether crime rates in Jennings might be affected by the program.

Research on the impact of predictive policing programs is still in its infancy. Last year, PredPol researchers published a study finding that sending patrol officers to several areas of Los Angeles predicted by their algorithm led to a reduction, on average, of more than four crimes per week in these neighborhoods — twice as efficient as human crime analysts. The researchers said the savings — resulting from not having to investigate and prosecute crimes that otherwise may have happened — could reach $9 million per year.

Jeremy Heffner, the product manager for HunchLab, is careful about making promises; he argues the results will vary based on how a particular police agency uses their analysis. "By having more accurate locations, we amplify the effect a meaningful tactic may have," Heffner told me, "but you still need a meaningful tactic." Studies of HunchLab’s effectiveness are underway in several cities, and researchers in Philadelphia are comparing patrolling in marked police cars to sending unmarked cars, which could quickly respond to crime, but might not deter it.

Even with data-driven tools, on-the-ground police work is full of ambiguity and discretion, which makes measuring their impact difficult. Would Keener have stopped the car at all had it not been in a HunchLab box? "It’s all relative," Keener said. "Probably." He was careful to point out that being in the box alone was not a good enough reason to stop someone. "Does the data give me grounds to stop just because they’re walking around? No."

"My son don’t have anything positive to say, so he’d rather not say anything at all," said the mother of the man whom Keener had stopped. It was a few weeks later, and we were talking by phone — her son had not wanted to be interviewed, and she was too suspicious of the police to put her name in print. "Believe it or not, if you say anything to the press," she said, the police "will make sure to pull you over and treat you worse." The mother and her son have been pulled over a lot, she said, and it often feels as though they are targeted because they’re black. "They give you a reason" — the tinted windows, the marijuana smell — "but then they get to asking you to get out [of the car]. Well, why do I have to get out? Because you said so? All I can go on is it’s because I’m black." There are widespread fears among civil liberties advocates that predictive policing will actually worsen relations between police departments and black communities. "It’s a vicious cycle," said John Chasnoff, program director of the ACLU chapter for Eastern Missouri. "The police say, ‘We’ve gotta send more guys to North County,’ [where Jennings is located] because there have been more arrests there, and then you end up with even more arrests, compounding the racial problem."