Unlike most Silicon Valley companies, Apple's business model treats data as a liability. Where Google and Facebook use advertising to extract value from users' personal information, Apple focuses on selling products that protect a user's data from all unauthorized access--including by Apple. Basically, Apple views user data as a headache, not a monetization opportunity.

The upcoming iOS 11 makes further changes to protect users' phones, tablets, and iPods from unauthorized access. A phone, however, cannot reliably distinguish between law enforcement, thieves, and hostile intelligence services; it treats anyone but the user as unauthorized. This means the upgrades will have some impact on lawful investigations. That isn't necessarily a problem--the benefits here outweigh the costs. But it's worth noting some of the potential impacts of these changes, and the areas in which fears are overblown.

For example, the new "SOS" mode, designed to quickly make an emergency call but which also locks out the fingerprint reader, has drawn attention for its privacy implications. Because a fingerprint is physical rather than testimonial evidence, law enforcement has been able to use it to unlock phones without triggering messy Fifth Amendment questions, and some view this update as a panic mode to avoid a police search. But there are already a number of ways to rapidly disable the fingerprint reader, such as powering off the phone, failing the fingerprint check five times with the wrong finger, or simply waiting long enough for the feature to disable itself. So this is more hype than substance.

A different and more significant change requires the passcode to "trust" a new computer. Currently, when police wish to search a phone, they unlock it either by using the fingerprint reader, by convincing the suspect to unlock it (e.g. to look up a phone number), or by simply seizing the phone while it is unlocked. None of these avenues directly implicates a suspect's constitutional rights. Once the unlocked phone is obtained, officials connect the device to a computer running forensics software (or even just iTunes), direct the device to "trust" the new computer when prompted, and download a backup containing almost all of the relevant information stored on the phone. Requiring the passcode to sync the device with a new machine means that, even with an unlocked device, a party seeking access is limited to manually searching the phone for visible items, and only while the phone remains unlocked.

This might be particularly consequential for border searches. The "border search" exception, which allows Customs and Border Protection to search anything entering the country, is contentious when applied to electronics. It is somewhat (but not completely) settled law, yet the fact that the U.S. government can, without any cause at all (not even "reasonable articulable suspicion," let alone "probable cause"), copy the full contents of my devices when I reenter the country sows deep discomfort in me and many others. The only legal limitation appears to be a promise not to use this information to connect to remote services. The new iOS feature means that a Customs officer can browse through a device--a time-limited exercise--but not download its full contents. This will be a welcome change for those--like me--uncomfortable with the extent of border searches, but it will surely cause a few law enforcement headaches.

The change will also affect Fifth Amendment issues around searches conducted with a warrant. There are already a few cases concerning forced decryption, in which the government demands that the device owner provide the password. I expect such cases to become much more common because, without the password, it will no longer be possible to conduct a forensic search.