The FBI wants Apple to do something no private company has ever been forced to do: break its own technology. Specifically, the FBI wants Apple to build a new version of its mobile operating system, iOS (call it GovOS), so that the contents of the iPhone used by Syed Farook, one of the gunmen in the San Bernardino shooting, can be extracted.

A magistrate judge recently ordered Apple to comply with this request; Apple in turn filed a Motion to Vacate (MTV) the magistrate’s order. The key point made in the MTV — and the key issue on which this entire case hangs — is that complying with the FBI’s request would weaken a valuable encryption platform at a time when the United States desperately needs stronger, more effective encryption.

There is an arms race to create more-sophisticated, harder-to-crack encryption tools, and if the FBI gets its way, we will be running that race with a self-imposed handicap.

This week Apple is appearing before Congress to address the issues raised above. For those unable to attend the hearings, I want to explore how Apple is thinking about the FBI’s legal authority to compel the company to create new software to crack Apple’s security measures.

After exploring that legal issue, we’ll consider the broader constitutional stakes involved in this case. After all, it’s not every day that the U.S. government asks a private company to undermine a technology platform without providing any concrete evidence that doing so will make Americans safer.

What does the law say?

To understand what the law says, we must first properly frame what the FBI is trying to compel Apple to do. Without a precise understanding of what the FBI is demanding, it is hard to say clearly that the FBI is overstepping its bounds.

What is the FBI seeking here? First, the FBI is demanding that Apple make a new software product. Second, that software product would have to be designed in accordance with specifications provided to Apple by the FBI. Third, Apple would have to test the product to ensure it met the company’s own quality standards. Fourth and finally, Apple would have to validate and document the software so that criminal defendants could exercise their constitutional rights to challenge the government’s legal claims as provided by the Federal Rules of Evidence (FRE).


Simply put, the FBI is demanding that Apple create a new software product that meets specifications provided by the FBI. As Apple clearly articulates in its MTV, the FBI is demanding “the compelled creation of intellectual property.” Two statutes frame the legality of that demand: the Communications Assistance for Law Enforcement Act (CALEA) and the All Writs Act (AWA).

With this understanding in mind, what does the law say? Is there any law that allows a government agency such as the FBI to compel private companies to create new software products?

Let us begin with the key law regulating the interception of electronic communications, CALEA. This law was enacted to carefully control the government’s right and ability to intercept communications in order to enforce the laws of the United States. Specifically, CALEA outlines the circumstances in which a private company must provide law enforcement with assistance in order to effectively carry out electronic surveillance.

Under CALEA, there is a strong argument that Apple cannot be legally required to create new software of any kind for any department of the federal government. When Congress passed CALEA, it had the opportunity to include device manufacturers like Apple within the scope of the law. It declined to do so, instead requiring telecommunications companies to ensure that their equipment and facilities are built in a way that allows the government to conduct surveillance on the basis of a lawful surveillance warrant.

In other words, telecommunications companies have to build in a back door. However, under CALEA, Apple is not a telecommunications company; instead, Apple is considered an “information service” to which CALEA does not apply. In short, Congress made it clear they did not intend for CALEA to even apply to companies like Apple.

Even if CALEA applied to Apple, the FBI would not be entitled under CALEA to force the company to break its encryption protocol. The statute in section 1002(b)(3) states that telecommunications companies are not responsible for decrypting communications “unless the encryption (1) was provided by the carrier and (2) the carrier possesses the information necessary to decrypt the communication.”

Because Apple does not currently possess that information, even an improperly broad interpretation of CALEA would not compel Apple to create GovOS in this case. The FBI can ask, but under CALEA it cannot compel.

The All Writs Act (AWA) also does not allow the FBI to compel Apple to create new software. Enacted in 1789, the AWA is a stop-gap that lets courts issue orders needed to carry out authority the government already holds under existing law; the FBI is now giving it an impermissibly broad interpretation.

According to that interpretation, this stop-gap gives courts the power to grant any relief that is not specifically prohibited by existing law. So, if there’s no law expressly prohibiting Apple from being compelled to write code for the FBI, then the AWA gives courts the authority to force the company to do just that.


Let’s take a completely make-believe example. Imagine that a federal law gives a particular agency the right to do X, but doing X is hard and costly. The AWA might be invoked to help get X done more efficiently. But the key is this: The AWA is only appropriate when there’s already a federal law or a constitutional principle that gives the particular agency the right to do X in the first place. That is precisely why the AWA cannot be lawfully used by the FBI in this case: The FBI has no underlying right to compel Apple to create new software products.

If this seems like a legal technicality, zoom out for a minute. Imagine if the Department of Homeland Security used the AWA to argue that citizens with certain last names should be subject to arbitrary detention to make it easier to catch terrorists. Would that violate American values and our system of laws? Absolutely.

Alternatively, consider a scenario in which the Department of Energy tried to use the AWA to force federally funded universities to “donate” resources to the DOE in order to enhance its Energy Materials Network. Would this be inappropriate? It would be completely inappropriate, because the DOE does not have the underlying legal right to force universities to do this.

In a nation of laws, the FBI’s attempt to expand the AWA is dangerous. The FBI’s interpretation of the AWA transforms the law into something it was never meant to be: a tool granting government agencies boundless powers not authorized under the Constitution or in existing federal law.

Lawyers have a fancy way of describing this problem. They say that expanding the AWA violates the separation of powers between the federal courts and Congress. After all, what is the purpose of Congress if our courts are allowed to expand federal law without any meaningful limitations? One might go further still and say that forcing a company to break its own technology appears to be something a dictatorship might do, not a democracy like the United States.

Fortunately, a Brooklyn judge recently ruled, in a separate but similar case involving a demand from the Department of Justice to unlock an iPhone, that the AWA only empowers courts with “residual authority to issue orders that are consistent with the usages and principles of law.” Judge Orenstein explicitly condemned the government’s overreach in that case, echoing the exact concerns explored above: “The implications of the government’s position are so far-reaching — both in terms of what it would allow today and what it implies about Congressional intent in 1789 — as to produce impermissibly absurd results.”

What should Apple do?

Apple should do what is necessary to preserve our enduring values: life, liberty and the pursuit of happiness, in the words of the Declaration of Independence, along with the privacy and speech rights protected by the Constitution. The First Amendment famously protects an individual’s right to say what he or she thinks or feels, and the Fourth Amendment guarantees that Americans shall be free of unreasonable searches and seizures.

These values and constitutional ideals are not mere commodities to be traded away, but are instead regulative ideals that capture and define who we are. Such ideals must remain unmolested by the temporary whims of each and every government agency. That’s what it means to be a nation of laws that is guided by a constitution.

In this particular case, Apple has a responsibility to resist the FBI’s efforts to force the company to undermine the security measures in its mobile operating system. To understand what is at stake here, one has to think deeply about what the world would be like if Apple were to comply with the FBI’s demands.

Imagine that Apple complied with the FBI. To do so, Apple would need to build a new version of iOS (GovOS) that does three things.


First, GovOS would bypass the auto-erase function for an individual iPhone. This feature wipes the device’s contents after too many consecutive incorrect passcode attempts, and it is designed to prevent third parties from gaining unauthorized access to an iPhone’s data.

Second, Apple’s newly minted GovOS would need to provide the FBI a new way of electronically submitting passwords to a particular iOS device. At present, these passwords must be manually submitted, and each incorrect password submission results in a delay before another attempt can be made.

Third, and finally, GovOS would disable the delay between incorrect password submissions. In a nutshell, GovOS would be a special version of iOS that allowed an iPhone to be cracked automatically without knowing the owner’s password.
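A bit of back-of-the-envelope arithmetic shows why these protections matter, and why removing them breaks the security model. The delay and attempt-rate figures below are illustrative assumptions for the sake of the example, not Apple’s actual parameters:

```python
# Illustrative arithmetic only: the 5-second delay and the
# 100-attempts-per-second rate are assumptions, not Apple's numbers.

def brute_force_hours(num_codes: int, seconds_per_attempt: float) -> float:
    """Worst-case time to try every possible passcode, in hours."""
    return num_codes * seconds_per_attempt / 3600

FOUR_DIGIT = 10 ** 4  # 10,000 possible four-digit passcodes

# With a hypothetical 5-second enforced delay between manual attempts,
# exhausting the keyspace takes the better part of a day:
with_delay = brute_force_hours(FOUR_DIGIT, 5.0)      # ~13.9 hours

# With the delay disabled and passcodes submitted electronically
# (say, 100 attempts per second), the same keyspace falls in minutes:
without_delay = brute_force_hours(FOUR_DIGIT, 0.01)  # ~100 seconds total
```

And that arithmetic ignores auto-erase entirely: with auto-erase on, an attacker never gets more than a handful of guesses before the data is gone, which is why GovOS has to disable all three protections at once to be useful to the FBI.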

The FBI, then, is asking Apple to build a technology that destroys the value of the key security mechanisms built into its mobile operating system: The FBI wants to force a private company to build a tool that completely breaks the security technology of what is arguably the gold standard among mobile operating systems, iOS.

On this narrow issue, the FBI must concede a critical point. It cannot say both that (1) it needs Apple’s assistance to crack an iPhone and (2) Apple’s assistance would not break a world-class encryption product. Once the FBI says that it needs Apple’s help, it cannot honestly deny that the help it seeks would utterly break a security suite that Apple has spent years developing.

A recent conversation with information security expert John Sebes (formerly of Securify, acquired by McAfee) put this issue into proper context. Imagine you are building a security mechanism for your mobile ecosystem. You have spent years developing this system because you want to provide your customers, private citizens as well as the government, a software product that is secure. Your intention, in other words, is to create a product that protects the security and integrity of information your customers place on any device that has that security mechanism.

Now, someone comes along and asks you to make a special technology that defeats your security mechanism. If you go ahead and create this special technology, what happens to the retained value of the security mechanism that you took so many years to create? It simply vanishes because your customers now know that you have designed a way to undermine your supposedly world-class security mechanism. No one has to wonder whether this security mechanism could be broken; you’ve already demolished it.

This is a key insight lost on many who argue that the FBI’s request can be honored without eliminating the value of Apple’s security features on iOS. Many have said that Apple or the FBI could protect the integrity of GovOS by maintaining a special lab designed to ensure no bad guys ever get access to it.

Do you see the irony here? The agents in charge of this case are effectively admitting that they are deeply confused, because here’s what they’re saying: Once Apple makes technology B (GovOS) to compromise technology A (standard iOS), we’ll make sure that technology B can never, ever be compromised because we’ll make a special super-secure room that no one will ever be able to break into.

The idea that a super-secure place, virtual or real, can be designed to make sure none of the bad guys ever get ahold of GovOS is really, really silly. “If human beings can in any way touch the code, to create it in the first place or modify it later, then human beings can copy and steal it, period,” noted Mr. Sebes. Further condensed: If you can touch it, it can be stolen.

This is precisely the point: There is no perfect “super-secure room” to hide GovOS so only the good guys can use it to catch the bad guys. So, in a world where Apple created GovOS for the FBI, a couple of things would happen immediately.

First, the retained value of iOS’s security protocol would vanish. All of Apple’s customers, including government agencies, would know that iOS has been cracked. Reflect for a second on what this means: In this imagined future, we would all know with certainty that iOS could be breached.

Second, the bad guys would have an additional incentive to rely on non-Apple encryption technologies from third-party vendors. After all, once the bad guys know with certainty that Apple’s products are not secure, they will want to use tools that are not susceptible to countermeasures by the U.S. government.

Third, while the bad guys transition to third-party encryption vendors, Apple would have to rethink its strategy. Because its individual and government customers would know that iOS’s core security features had been defeated, the company would have to decide whether it should continue investing in best-of-breed encryption technology.

It’s really hard to say how Apple would internally approach this question. Yet one thing would loom large in the minds of Apple executives: Technological advances in encryption would no longer matter to consumers because they would no longer have any reason to believe that encryption is, well, what it’s meant to be — a technology that prevents any and all third parties from accessing one’s data.

Final thoughts

Perhaps this is the right time to share Franklin’s adage about the moral quality, or lack thereof, of folks who want to trade privacy for a little temporary security. Franklin wrote the following to the Pennsylvania Assembly in 1755: “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”

I truly wish Franklin’s quote sufficed to describe the constitutional predicament we are in today. But matters are far worse than that. Franklin’s quote imagines a scenario where “temporary safety” can indeed be purchased by giving up liberty. That’s not the bargain the FBI is offering.

Instead, the FBI is offering what some people have called a grand bargain — where A gives something of concrete value to B, but B is not required to clearly specify what A will receive in return. In this case, Apple is being asked to give something of concrete value to the FBI (e.g. the time of its engineers) but it is totally unclear what the FBI is giving back to Apple or to the American people.

In a nutshell, here’s where we are: A government agency is trying to force the world’s most valuable technology company to break its encryption technology despite (1) having no legal authority to do so and (2) being unable to articulate what it hopes to achieve on behalf of the American people. Sounds like a grand bargain to me.