When Eric Holder argued that new encryption settings announced for Apple and iPhone will help criminals, he wasn’t wrong, per se. The outgoing Attorney General claimed that the default encryption on the iPhone 6 and the upcoming Android L operating system is tantamount to “thwarting our ability” to “protect [children] and to stop those that abuse children.”

And he’s right, but only if these criminals don’t already know how to encrypt their habits (and they do). In fact, if you don’t want to wait for the new Android release, you can encrypt your cell phone in about two minutes, just by changing a few security settings. Same on any iPhone. All Apple and Google are changing is that now, those settings will be turned on by default.

But if you listen to Holder or FBI Director James Comey, you’d think Apple and Google had just chained up the cops and sold the criminals ski masks. “What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law,” said Comey. “The notion that someone would market a closet that could never be opened—even if it involves a case involving a child kidnapper and a court order—to me does not make any sense.”

The complaints about the new policies from all areas of law enforcement—down to local police forces—reveal a rather lazy attitude toward policing and a fundamental misunderstanding of the importance of privacy. Comey’s comments in particular highlight the entitlement of police and officials, who seem to think tech companies have an obligation beyond the law to make their lives easier.

In fact, U.S. law expressly states tech companies “shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer.”

Apple and Google, like any corporation, have one obligation, and that’s to the customer. Even shareholders, whose primary interest is growth, will recognize the importance of customer satisfaction and trust. Whether they know it or not, Holder, Comey, and others are encouraging consumers to be less safe, not more, by leaving their phones open to hackers, identity thieves, and whoever else might want to access their files.

The decisions by Apple and Google—who represent a combined 93 percent of the mobile phone market—come no doubt on the heels of a renewed interest in privacy, with the New York Times even describing it as a “post-Snowden” move. It’s a setting most people probably never even think about—but one that will make the sort of mass surveillance by the NSA unveiled last year increasingly difficult.

Just as well, given that the recent releases of nude celebrity photos—most of them stolen from iPhones or Apple’s iCloud service—have shown the massive vulnerabilities in the cell phone ecosystem. In fact, much of the “hacking” done to obtain the photos amounted to little more than guessing the answers to security questions.

These twin cultural narratives have finally forced the hand of Apple and Google to raise privacy from a niche concern to a top one, and of course, police are going to complain about it.

The NSA, FBI, DEA, and the Justice Department have relied heavily on the lax attitude towards privacy held by individuals and corporations. The NSA is capable of tracking the online habits of practically every American because so few people actually care to encrypt their data, phone calls, or traffic, and few companies see an incentive in doing it for them.

Now that attitudes have seemingly changed on both fronts, the twin statements from the Justice Department and the FBI show how little either agency has come to care about the privacy of Americans. Holder even dragged out the specter of a “backdoor” for law enforcement installed within cell phones, an idea not seriously floated since the Clinton-era “Clipper Chip” debates.

But such an idea is fundamentally more dangerous to consumers, according to UPenn security expert Matt Blaze. Security backdoor options are “very likely to either introduce or exacerbate a flaw in the software,” making the phone’s data available to onlookers other than law enforcement.

But even the request for such a loophole is a sad display for American law enforcement. Have federal authorities become so reliant on the ritual abuse of our privacy that they now must demand it in order to enforce the law? Is the only way to fight child pornographers or terrorists to expose millions of Americans to hackers, thieves, and worse?

Technology, more often than not, makes policing significantly easier. Facial recognition technology means law enforcement can pick suspects out of a crowd of thousands. Social media consistently aids police in solving crimes big and small. And tech companies are usually more than glad to help, with both Apple and Google devoting considerable resources to the hunt for child pornographers.

But the reactions seen this week to a minor privacy setting change reveal an existential crisis within federal authorities. If one citizen, innocent or not, can “thwart” the goals of the Justice Department with a few swipes on the screen, we should be questioning how hard authorities are actually willing to work to stop the criminals they say these changes will help.

Because if the cost of doing business for the police is both our privacy and our safety, you might start to question whether they care about the citizens they swear to protect more than the job that pays them to do it.

Photo via DonkeyHotey/Flickr (CC BY-SA 2.0)