Apple has filed a motion to effectively bin an earlier court order forcing it to help the FBI break into a San Bernardino killer's iPhone.

The Cupertino giant said the Feds' demands were a "wild overreach" and would grant the US government a "dangerous" power.

The extensive motion to vacate [PDF] tears apart the FBI's claims that it needs Apple to create a version of iOS that bypasses the operating system's security mechanisms. Apple said that such a demand violates its First and Fifth Amendment rights, would be very difficult to obey, and opens the door to all kinds of problems. The motion – which will be considered at a hearing in March – reads:

Under the same legal theories advocated by the government here, the government could argue that it should be permitted to force citizens to do all manner of things "necessary" to assist it in enforcing the laws, like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant, or requiring a journalist to plant a false story in order to help lure out a fugitive, or forcing a software company to insert malicious code in its autoupdate process that makes it easier for the government to conduct court-ordered surveillance.

Today's filing points out that the legal action wouldn't have been necessary if the FBI hadn't stupidly changed the shooter's iCloud password – a move that prevented the murderer's locked device from automatically backing up its contents to Apple's servers, where its files could have been accessed by investigators. Apple has helped out with hundreds of cases, it states, but it draws the line at this one.

Building the custom code required by the FBI – which would allow agents to guess the iPhone's passcode by brute force – would take up to a month and require six to ten Apple employees to work on the case full time in a secure environment. The code would then need to be tested across multiple devices to make sure it didn't trigger any bugs that could delete important information on the killer's iPhone 5C.

If Apple did create a knackered iOS for the Feds, it would then have to give the same code to other law enforcement agencies that are dying to crack into other Apple devices. The iGiant insisted this customized iOS would never be used on just one iPhone. The motion, submitted in a Los Angeles court, states:

Responding to these demands would effectively require Apple to create full-time positions in a new "hacking" department to service government requests and to develop new versions of the back door software every time iOS changes, and it would require Apple engineers to testify about this back door as government witnesses at trial.

The filing also criticizes the FBI's use of the powerful All Writs Act to enforce Apple's compliance. The creaky old law – which gives courts a judicial tool to, simply put, get things done – was never intended for this purpose, Apple argues, and constitutes "wild overreach" by law enforcement.

New court filing: Apple hammers FBI on “dangerous” powers, but really it comes down to misuse of the All Writs Act pic.twitter.com/55koTCYNNr — The Register (@TheRegister) February 25, 2016

Apple also questions the methods the FBI used to secure the original order compelling the firm's assistance. The judge in that case – magistrate Sheri Pym – simply took the FBI's word for it that the request was "reasonable," and Apple wasn't given the opportunity to make its case.

Apple further claims that complying with the FBI would breach the company's constitutional rights by forcing it to use its cryptographic keys to sign the borked operating system, which amounts to "compelled speech and viewpoint discrimination in violation of the First Amendment." The company's Fifth Amendment rights would also be breached, it argues, because the order violates due process:

While the government's desire to maximize security is laudable, the decision of how to do so while also protecting other vital interests, such as personal safety and privacy, is for American citizens to make through the democratic process. Indeed, examples abound of society opting not to pay the price for increased and more efficient enforcement of criminal laws. For example, society does not tolerate violations of the Fifth Amendment privilege against self-incrimination, even though more criminals would be convicted if the government could compel their confessions. Nor does society tolerate violations of the Fourth Amendment, even though the government could more easily obtain critical evidence if given free rein to conduct warrantless searches and seizures. At every level of our legal system – from the Constitution, to our statutes, common law, rules, and even the Department of Justice's own policies – society has acted to preserve certain rights at the expense of burdening law enforcement's interest in investigating crimes and bringing criminals to justice.

The motion also includes testimony from Erik Neuenschwander, Apple's manager of user privacy, which points out the problems in building GovtOS, as he calls the code. Basically, the code would eliminate the feature that erases the device owner's data if too many incorrect passcodes are entered, would allow multiple and near-instantaneous passcode attempts, and would have to be loaded into the phone's RAM:

No such operating system currently exists with this combination of features. Moreover, Apple cannot simply remove a few lines of code from existing operating systems. Rather, Apple will need to design and implement untested functionality in order to allow the capability to enter passcodes into the device electronically in the manner that the government describes.
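Neuenschwander's point about near-instantaneous attempts can be made concrete with a rough calculation. This sketch is not from Apple's filing; it assumes a per-attempt cost of about 80 milliseconds, the approximate hardware-bound key-derivation time described in Apple's iOS security documentation, and shows why stripping the auto-erase limit and artificial delays would make brute-forcing a short numeric passcode practical:

```python
# Back-of-the-envelope estimate of worst-case brute-force time once
# GovtOS removes the auto-erase limit and escalating retry delays.
# The 80 ms figure is an assumption drawn from Apple's published
# description of its passcode key-derivation cost, not from the filing.

SECONDS_PER_ATTEMPT = 0.08  # assumed hardware-bound key-derivation cost

def worst_case_hours(digits: int) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    attempts = 10 ** digits
    return attempts * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")
```

Under those assumptions, a four-digit passcode falls in well under an hour, while a six-digit one takes roughly a day – which is why the delay and wipe features, not the passcode itself, do most of the protective work.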

The GovtOS would have to be used in a secure facility on Apple's campus, he explained, and if the FBI took a copy, the agency would have to make sure it was kept out of the hands of hackers and other miscreants. Even if GovtOS were destroyed after this investigation – which he considers unrealistic – it would still be possible to reproduce it, he argues:

Even if Apple were able to truly destroy the actual operating system and the underlying code (which I believe to be an unrealistic proposition), it would presumably need to maintain the records and logs of the processes it used to create, validate, and deploy GovtOS in case Apple's methods ever need to be defended, for example in court. The government, or anyone else, could use such records and logs as a roadmap to recreate Apple's methodology, even if the operating system and underlying code no longer exist.

Coincidentally, the director of the FBI, James Comey, was in the US Congress today giving evidence before the House Intelligence Committee on the threats facing America, and of course the iPhone case came up. He said that cases like this one shouldn't be decided by the courts but by the legislature.

"I love encryption, I love privacy, and when I hear corporations saying 'We're going to take you to a world where no one can look at your stuff,' part of me thinks that's great," he said.

"But then I step back and say law enforcement, which I'm part of, really does save people's lives, rescue kids, rescue neighborhoods from terrorists, and we do that a whole lot through court orders and search warrants of mobile devices. So if we're going to move to a world where that is not possible any more then the world will not end, but it'll be a different world." ®