Former antivirus developer and presidential wannabe John McAfee claimed a couple of weeks ago to have the perfect solution to the FBI-Apple stand-off. He offered to crack the iPhone for the FBI for free. This would let the government agency gain access to the phone while freeing Apple from any demands to assist. So confident was McAfee of his ability to help out that he said he'd eat a shoe on TV if he couldn't get into the phone.

It will probably not come as much of a surprise to anyone to learn that the FBI has not been beating down McAfee's door.

Perhaps they were unconvinced by the strategy that the man outlined. He said that he and his team would primarily use "social engineering," which is to say, manipulating people into telling you what you want to know by gaining their trust. It can be a powerful technique, but it certainly isn't a panacea. It's often less effective when the victims know that you're trying to socially engineer them (for example, because you announced your intent to do so on the Internet). It's less effective still when the people holding the information are in fact dead. McAfee may be persuasive, but probably not so persuasive as to be able to coax a corpse into giving up its PIN.

But John McAfee is clearly a public-spirited man, eager to share his wisdom with the government and protect us all from an Apple backdoor. In an interview with Russia Today, in which he is unironically introduced as a "cybersecurity legend" who is currently seeking the Libertarian Party's presidential nomination (an association that frankly does neither party any favors), he outlined the real technique he'd use to crack the phone, and it turns out to be a super convenient "half hour job." McAfee only said it would take three weeks because he didn't want to have to eat his shoe if a cold or flu interrupted him.

McAfee's own words say more than any paraphrase could hope to do, so here in full is how he plans to crack the iPhone:

Now I'll probably lose my admission to the world hackers' community, however, I'm gonna tell you. You need a hardware engineer and a software engineer. The hardware engineer takes the phone apart and it [sic] copies the instruction set, which is the iOS and applications [sic] and your memory, and then you run a piece, a program called a disassembler which takes all the ones and zeroes and gives you readable instructions. Then, the coder sits down and he reads through, and what he's looking for is the first access to the keypad, because that's the first thing you're doing when you input your pad. It'll take half an hour. When you see that, then you reads the instruction for where in memory this secret code is stored. It is that trivial. A half an hour.

Moreover, he says that this technique will work against "any computer," and that if the FBI has any part of the process that they don't understand then they should call him.

Given the simplicity of this approach one might well wonder why the FBI hasn't done this already. The answer turns out to be straightforward: as some of our more astute readers may have noticed, it's a load of drivel. What he's proposing isn't just wrong; it's not even in the same zip code as the truth.

The core claim, the part on which everything else hinges, is that there is a location on the iPhone's flash storage (or perhaps RAM; he uses "memory" pretty interchangeably for both) that contains a plaintext, readable copy of the device's PIN, and that iOS compares the PIN typed in to this stored value. It's true that Apple could have designed the iPhone this way, if Apple were staffed exclusively by idiots. But Apple did not design the iPhone this way, and John McAfee should know that. Apple publishes a rather good document describing major parts of the iPhone's security systems—I wish every operating system vendor had comparable documentation—and in particular, it describes how the PIN or passcode is combined with a unique hardware ID to derive the keys for the phone's encrypted filesystem.

The iPhone PIN is not stored on the flash storage at all, because there's really no need. If the wrong PIN is entered then the encryption key that gets generated by combining the PIN with the hardware ID won't work. It won't unlock the encrypted files. That's how the iPhone can verify that the PIN is correct (or not); a correct PIN will generate the right encryption key. An incorrect PIN will not. The software proves the PIN is correct by trying to use it, not by comparing it to an unencrypted version.
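The principle lends itself to a short sketch. To be clear, this is not Apple's actual implementation: the real thing uses AES in dedicated hardware and a more elaborate key hierarchy, and the XOR "cipher," iteration count, and magic header below are toy stand-ins. But the shape is the same: there is no stored PIN anywhere, only an attempt to use the key that the PIN produces.

```python
import hashlib
import os

HARDWARE_UID = os.urandom(16)  # stand-in for the unique ID fused into the chip

def derive_key(pin: str) -> bytes:
    # Key = KDF(PIN, hardware UID). Without both inputs, there is no key.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), HARDWARE_UID, 100_000)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for AES: XOR against a key-derived keystream.
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

MAGIC = b"FILESYSTEM"  # recognizable header, so a good decrypt is distinguishable from garbage
ciphertext = xor_cipher(derive_key("1234"), MAGIC + b" ...user data...")

def try_unlock(pin: str) -> bool:
    # No comparison against a stored PIN: derive a key and simply try it.
    return xor_cipher(derive_key(pin), ciphertext).startswith(MAGIC)

print(try_unlock("1234"))  # True: the key works, so the PIN was right
print(try_unlock("0000"))  # False: the "decryption" yields garbage
```

The system recognizes a correct key only by whether decryption produces something coherent, which is why there is nothing on the disk for McAfee's "software engineer" to find.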

This aspect isn't unique to the iPhone, either. In spite of McAfee's claim that "any computer" can be unlocked this way, other encrypted storage systems, such as Windows BitLocker and TrueCrypt, have the same feature. They don't store a copy of the passcode on the disk; they verify that the passcode is correct by virtue of the fact that it can successfully unlock the disk.

Now, it's true that other passwords can be stored differently, and if we were feeling generous we might think that McAfee is mistakenly assuming that the iOS PIN is treated in the same way as a regular login password. The conventional login password that's used on a typical standalone Windows or Linux or OS X machine is in a sense stored on the disk.

But even for these passwords, operating systems do not simply store the password verbatim such that anyone can read it, meaning that McAfee would still be out of luck. Rather, they store it in an irreversible, mathematically transformed form from which the original password cannot be recreated. To test whether a password is correct, the operating system performs the same mathematical transformation on the typed-in password and checks whether the result matches the copy stored on disk.
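A minimal sketch of that scheme, using the Python standard library's PBKDF2 (the salt size and iteration count here are illustrative, not the settings of any particular operating system):

```python
import hashlib
import hmac
import os

def store_password(password: str):
    # Store only a random salt plus a salted, one-way hash -- never the password.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Repeat the same transformation on the candidate and compare the results.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = store_password("hunter2")
print(check_password("hunter2", salt, digest))  # True
print(check_password("letmein", salt, digest))  # False
```

Even someone who reads `digest` straight off the disk learns only the transformed value; recovering the original password means brute-forcing guesses, which the deliberately slow transformation is designed to make expensive.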

Setting aside the falsity of that core claim—that there's a PIN sitting on the iPhone's flash storage just waiting for someone to read it—the notion that a "software engineer" could figure out where that location was with nothing more than a disassembler and half an hour of spare time is fanciful nonsense. Even with access to all of iOS' source code, an uninitiated developer would be doing well to find all the pieces of code responsible for handling the PIN and validating it. With nothing more than disassembled code—the (barely) human-readable counterpart to the machine code—it's going to take substantially longer. iOS has hundreds of megabytes of executable code, and a disassembled iOS is going to be millions upon millions of lines of almost unintelligible assembly code.

Thus far, the most plausible method for decrypting the San Bernardino iPhone without Apple's assistance involves physically inspecting the handset's processor using acid and lasers. Done correctly, this would let the FBI learn not the PIN, but the device's unique hardware ID. With that ID in hand, they could combine it with every possible PIN in turn until they hit upon the one that produces a working key. This technique would be extremely costly, complex, and risky: one wrong move and the hardware ID would be destroyed, leaving the phone's data permanently and irrevocably lost.
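Here's why extracting the hardware ID would break the problem open: once the ID is known, the entire four-digit PIN space can be searched offline, free of the phone's retry delays and erase-after-ten-attempts policy. Everything below is hypothetical (the UID, the PIN, the derivation parameters), and the equality check stands in for "does this key actually decrypt the filesystem?":

```python
import hashlib

# Hypothetical values: in the real attack the UID would come from physically
# probing the chip, and success would mean the filesystem decrypts.
HARDWARE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")
SECRET_PIN = "0042"  # known only to the simulation, not to the "attacker" below
TRUE_KEY = hashlib.pbkdf2_hmac("sha256", SECRET_PIN.encode(), HARDWARE_UID, 1_000)

def brute_force():
    # Only 10,000 candidates: trivial once the search runs outside the phone.
    for pin in ("%04d" % n for n in range(10_000)):
        key = hashlib.pbkdf2_hmac("sha256", pin.encode(), HARDWARE_UID, 1_000)
        if key == TRUE_KEY:  # stand-in for "this key unlocks the data"
            return pin
    return None

print(brute_force())  # 0042
```

This is exactly why the hardware ID is fused into the silicon and never leaves it: keeping the ID secret is what forces each PIN guess through the phone itself, with all its rate limits.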

If John McAfee really does believe that the FBI, and everyone else working in computer security, is some kind of an idiot who hasn't realized a very basic and simple flaw in the iPhone's security, he could trivially prove his theory and show the world where the iPhone keeps its PIN. He could film the whole thing and put it on YouTube. Given that the whole exercise should only take half an hour, it's hard to see any reason why McAfee wouldn't do this—unless he's not a fan of the taste of shoe leather.