Just As Attorney General Barr Insists iPhone Users Have Too Much Security, We Learn They Don't Have Nearly Enough

from the well,-look-at-that dept

You may recall that a few years back, John Oliver did one of his always excellent Last Week Tonight shows all about encryption. It concluded with an "honest Apple commercial" that highlighted the difficulty of keeping phones secure and noted that it's a constant war against malicious attackers who are always trying to figure out new ways to break into people's phones:

That commercial is a lot more realistic than people might think. Late last week, Google revealed a pretty astounding iOS exploit campaign that broadly targeted anyone who visited a series of compromised websites, using chains of zero-day attacks that allowed the attackers to more or less own the iPhone of anyone who visited those sites. As Wired noted in its piece about the attack, it changes much of what we thought we knew about iPhone attacks. At the very least, it demolishes the idea that iPhone hacking only really targets key individuals.

It also represents a deep shift in how the security community thinks about rare zero-day attacks and the economics of "targeted" hacking. The campaign should dispel the notion, writes Google Project Zero researcher Ian Beer, that every iPhone hacking victim is a "million dollar dissident," a nickname given to now-imprisoned UAE human rights activist Ahmed Mansoor in 2016 after his iPhone was hacked. Since an iPhone hacking technique was estimated at the time to cost $1 million or more -- as much as $2 million today, according to some published prices -- attacks against dissidents like Mansoor were thought to be expensive, stealthy, and highly focused as a rule. The iPhone-hacking campaign Google uncovered upends those assumptions. If a hacking operation is brazen enough to indiscriminately hack thousands of phones, iPhone hacking isn't all that expensive, says Cooper Quintin, a security researcher with the Electronic Frontier Foundation's Threat Lab. "The prevailing wisdom and math has been incorrect," says Quintin, who focuses on state-sponsored hacking that targets activists and journalists. "We've sort of been operating on this framework, that it costs a million dollars to hack the dissident's iPhone. It actually costs far less than that per dissident if you're attacking a group. If your target is an entire class of people and you're willing to do a watering hole attack, the per-dissident price can be very cheap."
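The economics Quintin describes come down to simple division: a fixed exploit-chain cost spread across every victim of a watering-hole attack. A back-of-the-envelope sketch (the $2 million figure is the article's published estimate; the victim count is a hypothetical illustration, since the real number isn't known):

```python
# Back-of-the-envelope math behind the "million dollar dissident" framing.
# exploit_chain_cost: the article's high-end published estimate, in USD.
# victims: hypothetical -- the campaign "indiscriminately" hit thousands.

exploit_chain_cost = 2_000_000
victims = 10_000

cost_per_victim = exploit_chain_cost / victims
assert cost_per_victim == 200.0  # $200 per target, not $1 million+
```

The per-target price falls linearly with the number of visitors to the compromised sites, which is exactly why a watering-hole attack breaks the "expensive, stealthy, highly focused" assumption.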

Now, it's true that device encryption has nothing to do with this attack -- in fact, the attack could be seen as a way around device encryption, since it planted malware on your phone that could slurp up your data once you decrypted it locally -- but it strikes me as yet another condemnation of Attorney General William Barr's recent nonsense about how the average consumer doesn't need that much phone security. If you'll recall, Barr shrugged off concerns about banning real encryption by arguing that since all phones have some security vulnerabilities, what's a few more:

All systems fall short of optimality and have some residual risk of vulnerability — a point which the tech community acknowledges when they propose that law enforcement can satisfy its requirements by exploiting vulnerabilities in their products. The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product. The Department does not believe this can be demonstrated.

The Department of Justice and Barr are wrong. Encryption remains not just one defense against these vulnerabilities, but one of the most important. Creating "lawful access" points is worse than taking away a protection: it literally creates a multitude of new vulnerabilities -- and plays right into the hands of the people looking to exploit them.

Indeed, as the Wired article notes, as surprising and unexpected as the latest vulnerabilities were, it's notable that they appear to have been out there for quite some time, with many, many victims, and no one spotted them -- even though the attackers were remarkably sloppy:

The hackers still made some strangely amateurish mistakes, Williams points out, making it all the more extraordinary that they operated so long without being detected. The spyware the hackers installed with their zero-day tools didn't use HTTPS encryption, allowing anyone on the same network as a victim to read or intercept the data it stole in transit. And that data was siphoned off to a server whose IP addresses were hardcoded into the malware, making it far easier to locate the group's servers, and harder for them to adapt their infrastructure over time. (Google carefully left those IP addresses out of its report.) Given the mismatch between crude spyware and highly sophisticated zero-day chains used to plant it, Williams hypothesizes that the hackers may be a government agency that bought the zero day exploits from a contractor, but whose own inexperienced programmers coded the malware left behind on targeted iPhones. "This is someone with a ton of money and horrible tradecraft, because they’re relatively young at this game," Williams says.
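Those two mistakes -- no HTTPS and hardcoded command-and-control IP addresses -- are worth unpacking, because they explain why the spyware's traffic was so exposed. A minimal sketch of the idea (the traffic, IP address, and file names below are entirely hypothetical, not taken from the actual malware or Google's report):

```python
# Sketch of why plaintext exfiltration is trivially observable: anyone on
# the network path between victim and server can parse the raw bytes of an
# HTTP request and read both the destination and the stolen payload.
# Everything here is illustrative; 203.0.113.7 is a documentation-range IP
# standing in for a hardcoded command-and-control server.

HARDCODED_C2 = "203.0.113.7"

def inspect_plaintext_request(raw: bytes) -> tuple[str, str]:
    """Parse a captured plaintext HTTP request; return (host, body)."""
    head, _, body = raw.partition(b"\r\n\r\n")
    headers = dict(
        line.split(b": ", 1)
        for line in head.split(b"\r\n")[1:]  # skip the request line
        if b": " in line
    )
    return headers.get(b"Host", b"").decode(), body.decode()

# A fake network capture: data POSTed over plain HTTP to a hardcoded IP.
capture = (
    b"POST /upload HTTP/1.1\r\n"
    b"Host: 203.0.113.7\r\n"
    b"Content-Type: application/octet-stream\r\n\r\n"
    b"contacts.db,messages.db,location.log"
)

host, stolen = inspect_plaintext_request(capture)
assert host == HARDCODED_C2     # hardcoded IPs make the infrastructure easy to find
assert "messages.db" in stolen  # and the exfiltrated data is readable in transit
```

With HTTPS, an on-path observer would see only the destination and an opaque encrypted stream; and had the server been addressed by rotating domain names instead of hardcoded IPs, the operators could have moved their infrastructure without shipping new malware. Hence Williams' "horrible tradecraft" assessment.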

And that certainly suggests there are likely already much more sophisticated attacks out there -- and if not, many more are coming soon. They will target any and all possible vulnerabilities, including any "backdoor" the DOJ/FBI demands that device makers install. Contrary to what you may have heard, the debate over backdoors is not a fight between "security and privacy." It's a fight between security for most people and the rare instances where law enforcement doesn't want to do basic detective work and wants everything handed to it.

This latest revelation should now make many people more aware of the security challenges of protecting connected devices. But it should also re-emphasize how utterly ludicrous it would be to purposefully insert new vulnerabilities into phones because the DOJ can't be bothered to do its job properly.


Filed Under: encryption, hacks, iphone security, iphones, security, william barr, zero days

Companies: apple, google