When Michalski’s phone was seized, agents were limited in what they could access. Even after the phone was unlocked, the FBI agent would still need its passcode to plug it into a computer for forensic analysis and find “hidden, erased, compressed, password-protected, or encrypted files.” Interestingly, the law treats passcodes and PINs differently from fingerprints and face scans, even though they perform the same function: unlocking devices.

Can cops force you to unlock your phone with your face?

According to John Verdi, the vice president of policy at the Future of Privacy Forum, compelling passcodes from suspects can be extraordinarily difficult because of the Fifth Amendment, which enshrines our right not to be forced to incriminate ourselves. High courts have ruled consistently that passcodes are “testimonial”—that is, they “explicitly or implicitly, relate a factual assertion or disclose information”—and therefore that forcing citizens to surrender them is self-incriminating and unconstitutional.

But biometrics are different. Despite being the more advanced technology, they are treated under the law as a traditional search. “In the same way that giving someone a Breathalyzer test or giving someone a blood test to try to determine the presence of alcohol is not testimonial in a Fifth Amendment sense,” Verdi told me, biometrics are “rather a search under the Fourth Amendment.” Police can acquire such warrants more easily.

This creates an interesting inversion. If your passcode is, say, your mother’s birthday or your wedding date, a thief or snooping spouse could correctly guess it and gain full access to your device and everything therein. Biometrics, particularly the emergent wave of continuous biometric authentication, are almost certainly the stronger security option, unless law enforcement is involved.

But, of course, most average users aren’t using their phones with the expectation that someday the FBI will come knocking. Brian Jackson, a scientist at the Rand Corporation studying criminal justice and homeland security, told me that consumer preferences for novelty and convenience are shifting how people compute what he refers to as “risk calculus”—how we weigh the trade-off between privacy and convenience as we acquire more intelligent devices. “As technology changes and as people get used to things, things that we at one point would have thought were very invasive or very different become routine,” Jackson said.

A good example, he said, is the smart TV: a device with an always-on microphone that listens for voice commands and communicates with other devices. It’s literally a microphone in your home. That may have caused alarm five years ago, but now, Jackson said, “I was talking to folks and they’re like, ‘It’s actually getting tougher to find TVs that don’t have that built in.’ That’s a feature that people have liked, and it’s convenient and it’s cool and, therefore, the market proliferates it. It would not surprise me if, as biometrics get more instituted to more and more devices for various convenience-related things, that it’ll seem much less unusual for these sorts of things to crop up.”

As the market adapts to this new convenience, Jackson said, people will, perhaps unwittingly, shift away from passcodes, invisibly changing the stakes of our risk calculus. Continuous authentication, like face recognition before it, represents a massive scaling-up of processing power and a showcase for the potency of artificial intelligence. If only our privacy and legal protections could scale up at a similar pace.

This article is part of our project “The Presence of Justice,” which is supported by a grant from the John D. and Catherine T. MacArthur Foundation’s Safety and Justice Challenge.
