Mr. Comey,

Sir, you may not know me, but I’ve impacted your agency for the better. Since the advent of the iPhone, I have been assisting law enforcement as a private citizen, including the Federal Bureau of Investigation. I designed the original forensics tools and methods used to access content on iPhones, which were eventually validated by NIST/NIJ and adapted by the FBI for internal use into your own version of my tools. Prior to that, the FBI issued a major deviation allowing my tools to be used without validation, due to the critical need to collect evidence on iPhones. They later became the foundation for virtually every commercial forensics tool to make it to market at the time. Over several years, I’ve supported thousands of agencies worldwide, trained state, federal, and military personnel in iOS forensics, assisted hands-on in numerous high-profile cases, and invested thousands of hours of continued research and development into a suite of tools I provided at no cost – for the purpose of helping to solve crimes. I’ve received letters from a former lab director at an FBI RCFL, the DOJ, NASA OIG, and other agencies citing numerous cases that my tools have helped solve. I’ve done what I can to serve my country, and asked for little in return.

First, let me say that I am glad the FBI has found a way to get into Syed Farook’s iPhone 5c. Having assisted with many cases, I understand from firsthand experience what you are up against, and have enormous respect for what your agency does, as well as others like it. Oftentimes it is that one finger in the dike that stops the entire dam from breaking. I would have been glad to assist your agency with this device, and even reached out to my contacts at the FBI with a method I’ve since demonstrated in a proof-of-concept. Unfortunately, in spite of my past assistance, FBI lawyers prevented any meetings from occurring. Nonetheless, I am glad someone has been able to reach you with a viable solution.

In spite of my careful vetting of personnel, over time my own forensics tools began to trickle out (leak), first into the forensics community, and then to private commercial forensics companies who reverse engineered them and ingested them into their own commercial products. Inside law enforcement, I’ve heard of a number of questionable uses of my forensics tools, from accessing subjects’ devices without a warrant to accessing the devices of officers’ girlfriends. Forensics tools like the one the FBI is in possession of are extremely powerful, and in spite of even your best efforts, sir, my experience suggests that yours will no doubt be misused just as mine were. I urge you to take appropriate steps to protect this new tool under the strictest use guidelines, even requiring an audit trail for every single use of it to prevent abuse. I do not recommend releasing it outside of the FBI; rather, I recommend that the FBI maintain control of this tool when providing any assistance to outside law enforcement agencies.

I am glad that you were able to find a private company to provide material assistance, rather than the alternative – Apple being compelled to redesign their operating system. I do understand, however, that this issue is likely to be raised again with Apple. My experience in this field leads me to numerous concerns about both the direction this case was originally headed and where the FBI stands now, with access to an exploitation technique for iOS devices. Here, I would like to share my technical concerns, as I’m sure you’ve already heard all of the philosophical ones.

Across the span of my research of the iOS operating system, I’ve found many very bright and talented communities of individuals who share ideas and collaborate on vulnerability research. Oftentimes, it takes a team of people to come up with solutions like the one you’ve acquired for the 5c. Many different individuals often wind up analyzing the same security boundaries on a device, only to find that their research has overlapped, and then collaborate to find ways to exploit it. Add to this the constant threat of intellectual property theft, a common occurrence in the forensics industry, and other less honest motivations in vulnerability research. The result is that an often-unsavory collection of third parties will race to exploit the same weaknesses. My point is this: one way or another, someone else always finds out about a vulnerability and exploits it.

What has been made painfully apparent to me over nearly a decade in this field is that keeping an exploit secret is not possible, no matter how good an agency or corporation may be at keeping secrets – because an exploit is merely a dotted line on a blueprint. Mere knowledge of the general parameters of a vulnerability – even just the details of the device’s condition in this case – has been enough for security researchers to know exactly which security boundaries to start looking at, and they can do so now with the confidence that there is a known, exploitable vulnerability. One does not need to steal any exploit code to take advantage of a vulnerability; one need only find it, because the way in exists until it is closed. In spite of the picture that Hollywood paints, the list of potential security boundaries is quite narrow, especially given the details of this case.

The same is true of the software the FBI was trying to compel Apple to create. The FBI argued that Apple could contain such a technology using a digital leash; however, it is the mere existence of a vulnerable design, and not the leash, that poses the greatest technological risk. Consider that the security “switches,” and whatever they were connected to, would have been the core target of hackers. At present, these “switches” do not exist to disable the security mechanisms the FBI described, which makes those mechanisms exponentially harder to attack.

To use a less technical analogy, consider a home alarm system. There is no question that many savvy thieves know how to disable one, and all of them know to attack the alarm box: the central security mechanism. A key protects this box, much like Apple’s code signing protects code execution. As you can imagine, picking Apple’s code signing has historically been as easy as picking the lock on this alarm box, and the FBI’s new exploit is likely just one more proof of that. What the tool lacks, however, and why it doesn’t work on newer devices, is a conduit into the alarm box on those devices (the Secure Enclave) to disable the security inside of it. On newer devices, Apple’s alarm box is buried under six feet of concrete, to prevent the common thief from simply shutting the alarm off.

I know the software the FBI asked Apple to create, because I’ve created it for the FBI (and for others) in the past. My forensics tools did virtually the same thing the FBI tried to compel Apple to do, and I’ve often wondered if the FBI got the concept from one of my books on the subject. The success of hackers, in fact, is the primary reason Apple has buried its alarm box in concrete on newer devices. For Apple to comply with the FBI’s order, they would have to drill a conduit and run wires down to this alarm box so that the FBI could deactivate the mechanisms they now protect much better than they used to. It is this conduit, and not the tool itself, that poses the greatest security threat to public safety. If Apple is ever compelled to comply with such an order, that conduit will exist on every device manufactured, whether it is ever used or not.

Even if Apple were able to protect its own copy of “FBiOS” from getting out, and did a better job than Coke did protecting its recipe (which was stolen once, then offered to Pepsi), the design changes implied in building such a tool would expose every iPhone (not just those running FBiOS) to a great security risk. After nine years and billions of R&D dollars, the FBI has demonstrated that third parties are still capable of breaking at least some of the locks on Apple’s latest operating system (and in record time). If one company can find them, then criminal hackers, nation-state hackers, and others can certainly find and tap into that conduit as well. This puts everyone at risk: you, me, the President, and everyone else using an iOS device.

Digital security is crucial not only to digital privacy, but also to physical safety. As one example, consider the iCloud celebrity leaks from two years ago. Not only were women victimized in the most intimate way, but the EXIF data in the photos revealed many high-profile women’s GPS history – their homes, their boyfriends’ homes, frequented locations, etc. – making stalking, sexual assault, or other violent crimes possible by means of a single instance of digital theft.

Given that it’s only a matter of time before a criminal finds the blueprint to this vulnerability, I urge you to consider briefing Apple on the tool and techniques used to access Syed Farook’s device. While the part of the tool that brute-forces a PIN does not seem to work on newer devices, the locks it picks to get past the front door may well be vulnerabilities that carry over to newer devices. Depending on the nature of these components of the solution, criminals or nation states could take advantage of them to install malware, spyware, or ransomware, or to infect a target by other means. Individual components of this tool may be very dangerous to millions of Americans, even if the solution as a whole is not viable.

In our nation’s continuing discussion about the force of the All Writs Act, I urge you to take these technical issues into consideration as well. Compelling Apple to design a specific version of its operating system to defeat security will no doubt have the side effect of compromising the overall design of its latest technology, opening up design flaws and weaknesses for any third party to attack, even if such a tool is never leaked. There is a way for Apple to design its devices to be unhackable – even by Apple itself. Please let them do this, for the security and safety of our country. Otherwise, you may end up solving some crimes, but you will inevitably engender others. What may help provide background on a murder or kidnapping will undoubtedly also create the opportunity for identity theft, cyberstalking, and even murder. The nation’s debate about privacy is a double-edged sword.

Sincerely,

Jonathan Zdziarski