Robocops are reality.

Sure, we're still a long way from Hollywood's vision of RoboCop, but automated law enforcement is upon us. Brazil used iRobot PackBot 510s to secure the Maracanã during the men's World Cup this summer, and Germany used OFRO terrain robots to patrol the Berlin Olympic Stadium during the women's World Cup five years ago. It's now possible to automatically detect a crime, identify the suspects, and even issue a citation without direct human involvement. It could happen next time you run a red light.

In some cases, only one link in the chain is automated—surveillance of underground tunnels below the U.S./Mexico border, for example—but the robotic possibilities are nearly endless. In other cases—such as identifying and defusing bombs—using robots or other automated systems can keep police officers safe. But automation is also framed as a way to make law enforcement more efficient. A red light camera can catch a lot more violations than a human can.

The rub is that extreme efficiency isn't necessarily a good thing. That's what a group of researchers argues in a paper presented earlier this year at a conference on robot law in Miami. They go so far as to argue that inefficiency should be preserved, even increased, as we move to automated law enforcement.

That may sound counter-intuitive, but in the end, it makes good sense. Woodrow Hartzog, an assistant professor at Samford University's Cumberland School of Law and co-author of the paper, tells WIRED that, in some cases, making law enforcement less efficient just means putting humans back in the loop, allowing room for "inefficient" human judgments like mercy and compassion. "A robot can't forgive certain infractions that are generally accepted," he says.

The Letter of the Law

Part of the problem is that it's extremely difficult to automate the enforcement of laws, including those that seem straightforward. For example, in a previous paper, Hartzog and company asked 52 different coders to create a program that would issue speeding tickets based on a sensor placed within a car. The results varied wildly, depending on whether the programmers were asked to follow the letter of the law or the spirit of the law. Programs that followed the letter of the law ended up issuing as many as 1,000 tickets for a single car trip.
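The gap between the two interpretations can be sketched in a few lines of Python. Everything here is hypothetical rather than drawn from the study's actual programs: the sensor readings, the speed limit, the tolerance, and the rule that treats a sustained violation as a single citation are all stand-ins to show how the same data can yield wildly different ticket counts.

```python
SPEED_LIMIT = 65  # mph; hypothetical limit for this sketch

def tickets_letter(readings):
    """Letter of the law: every sensor reading over the limit is a violation."""
    return sum(1 for speed in readings if speed > SPEED_LIMIT)

def tickets_spirit(readings, tolerance=5):
    """Spirit of the law: allow a small tolerance, and treat a continuous
    stretch of speeding as one citation rather than one per reading."""
    tickets = 0
    speeding = False
    for speed in readings:
        if speed > SPEED_LIMIT + tolerance:
            if not speeding:
                tickets += 1  # a new violation begins
            speeding = True
        else:
            speeding = False  # the violation has ended
    return tickets

# One simulated trip, with the sensor sampled once per second.
trip = [63, 64, 68, 71, 72, 70, 66, 64, 73, 74, 69, 64]
print(tickets_letter(trip))  # 8 tickets: one per over-limit sample
print(tickets_spirit(trip))  # 2 tickets: one per sustained violation
```

A real sensor sampling many times per second would make the letter-of-law count balloon into the hundreds or thousands for a single trip, which is exactly the failure mode the study observed.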

"When speed limit laws were written, there was an assumption that they would enforced much of the time, with a certain amount of human discretion," Hartzog says. "Trying to do this automatically can have unforeseen consequences."

How do you solve this problem? You could program the application to issue speeding tickets to only one in every four speeders. But that creates its own problem. What do you then tell the family of someone killed by a speeding driver who was never cited?
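A one-in-four scheme like the one above could be as simple as a random draw for each detected violation. This sketch is purely illustrative, with a seeded generator so the numbers are reproducible:

```python
import random

CITATION_RATE = 0.25  # cite roughly one in four detected speeders

def maybe_cite(rng):
    """Decide at random whether a detected speeder actually gets a ticket."""
    return rng.random() < CITATION_RATE

rng = random.Random(42)  # seeded so this sketch is reproducible
cited = sum(maybe_cite(rng) for _ in range(10_000))
print(cited)  # roughly a quarter of 10,000 detected violations
```

The arithmetic is easy; the hard part, as Hartzog's question makes clear, is that the uncited three-quarters are chosen blindly, with none of the contextual judgment a human officer would apply.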

Another consideration is that, through automation, it could eventually become impossible to break certain laws. And that might not be a good thing either. The authors of the paper point out that in some cases, breaking the law is necessary for social change. After all, not all laws are just, and the way they're perceived can change over time.

What's more, leaving humans out of the enforcement loop can magnify the effects when things go wrong. The Prairie Village Post recently reported a case in which an innocent man was pulled over when a license plate scanner misread his license plate and flagged his car as one that was stolen. The police pulled him over and approached his car with guns drawn.

It may be inefficient for a human cop to double-check a license plate every time it's identified by an automated system, but in this case, it would have saved both time and anguish for innocent people. The authors of the paper describe inefficiency as a means of providing checks and balances against automated systems. They call this the "conservation principle."

Room to Go Wrong

Mary Anne Franks, an associate professor of law at the University of Miami who critiqued the paper, agrees that there's plenty of room for automated law enforcement to go wrong. But she says that there are also plenty of positives. "Taken literally, the conservation idea sounds like preserving the status quo," she says. "And I'm not comfortable with preserving the status quo."

She points out that automation could be a way of reducing discrimination. A red light camera doesn't care what color your skin is, whether you're a police officer's spouse, or whether you're a pillar of the community. It just records infractions. "To have power of discretion is to have the power to discriminate," she says.

Hartzog agrees that automation can be a good thing. "We don't want to preserve the status quo completely," he says. "But we want change at a more deliberate pace, so that we can see how the changes affect people."

Eventually, they both agree, automation could become a way to test laws and bring attention to those that are poorly written or have disproportionately affected marginalized groups. For example, if New York City's stop-and-frisk policy were enforced at complete random, many more people would likely speak out against that law.

Ultimately, Hartzog hopes this is just the beginning of a larger conversation about both automation and law enforcement. "I think automated law enforcement could lead to more transparency than ever in terms of how the law is written and enforced," he says. "But you have to insist on it from the beginning."