The government wants to access information from digital devices in emergencies and legal investigations. It can’t do that right now because some services (like WhatsApp) and some devices (like iPhones) are encrypted.

While terrorists may still be using burner phones, law enforcement officials are in possession of an overwhelming number of devices involved in criminal investigations that they can’t access due to encryption. Naturally, investigators turned to device and software manufacturers, and they’ve been petitioning for help breaking into these devices for quite a while.

But after the 2013 NSA surveillance leaks by Edward Snowden, people’s trust in the government shattered. Companies like Apple and Google began to include device encryption by default in their smartphones. App makers like the folks behind WhatsApp also added encryption to their products. Although the government wasn’t thrilled with the companies’ decision, the encryption debate largely remained hidden from the public eye. That is, until one particular case broke the debate wide open.

In December of 2015, a shooter by the name of Syed Farook, along with his wife, opened fire on San Bernardino county employees at a holiday event, killing 14 people. The pair, who died in a shootout with the police, are believed to have ties to the Islamic State, but all they left behind was an encrypted iPhone 5C that belonged to San Bernardino County.

Apple initially assisted the FBI in the investigation, but then the bureau slapped a court order on the Cupertino company, ordering it to create specialized code that would offer a backdoor into the iPhone. Apple, afraid the code would get into the wrong hands, refused to comply. The company argued that giving law enforcement an encryption key would threaten the security of its customers’ data — and their privacy. After a month-long legal battle, the FBI dropped the case once it managed to access the iPhone with the help of paid, professional “gray hat” hackers.

If Apple had complied, the FBI would potentially have been able to use the “GovtOS” code to unlock any iPhone in its possession. If Apple were forced to hand over its source code, the government could push an update to all iPhones and still claim it was coming from Apple. The government could then access every aspect of your phone, from the camera to the microphone.

“There are all kinds of things we want to hide from other people.”

In the wrong hands, an encryption key would make it dead simple for cybercriminals to take over anyone’s iPhone, accessing pictures, texts, emails, and whatever else you have stored on your phone. People may say they have nothing to hide, because they are not doing anything wrong — but someone can always find data sitting in your smartphone to exploit.

But to some, it’s equally worrisome for the government to have the capability to snoop on everything you do. As Glenn Greenwald said, “there are all kinds of things we want to hide from other people — that we tell our psychiatrist, our lawyer, our doctor, our spouse or a stranger on the Internet — that have nothing to do with criminality.”

There have been numerous calls for Congress to take a stand and begin a conversation to resolve the encryption problem, and finally the draft of the anti-encryption bill, dubbed the “Compliance with Court Orders Act of 2016,” has officially been released. It’s almost the same as the one that was leaked a week ago.

The bill, introduced by Sen. Richard Burr, R-N.C., and Sen. Dianne Feinstein, D-Calif., would force companies to comply with court orders requesting access to their devices or services.

Another bill, backed by House Homeland Security Committee Chairman Michael McCaul, R-Texas, and Sen. Mark Warner, D-Va., would create a “national commission” to delve into the role of encryption in criminal investigations.

But the Feinstein-Burr bill aims to take action. Here’s everything you need to know about it.

What is the Compliance with Court Orders Act of 2016?

The bill, officially titled the Compliance with Court Orders Act of 2016, was introduced on April 13 as a discussion draft — meaning that its language is subject to change. The first step is to open discussions to get feedback, Tom Mentzer, Feinstein’s press secretary, told Digital Trends.

Its language is simple — no one is above the law. While the bill says companies that offer communications services and products should protect their customers’ privacy, it also says companies have to comply with court orders, as the law dictates. In other words, you can offer encryption, but you must also have a key readily available to break that encryption when law enforcement needs access to a suspect’s device.

“To uphold both the rule of law and protect the interests and security of the United States, all persons receiving an authorized judicial order for information or data must provide, in a timely manner, responsive, intelligible information or data, or appropriate technical assistance to obtain such information or data,” the discussion draft states.

When you break down the legal language, you’re left with this: Companies that have been ordered to provide information or data must do as the courts and law enforcement command. If your service or product is encrypted, decrypt it and provide the data legibly. But the government also wants to make sure future products will allow easy access to user data.

“A provider of remote computing service or electronic communication service to the public that distributes licenses for products, services, applications, or software of or by a covered entity shall ensure that any such products, services, applications, or software distributed by such person be capable of complying with subsection (a),” the draft continues.

The government wants software and hardware license distributors to make sure new services and products can easily offer access to encrypted data in an “intelligible” format.

But if the data is encrypted, how can companies make it easily accessible?

A lot of the time, the data is encrypted and can’t be accessed at all — which is why Apple called it a “burden” to have a team of engineers dedicate time to create specialized software that would offer a backdoor for the FBI in the San Bernardino case. All iPhone data is encrypted, and Apple can’t access it without the special tool the FBI wants it to create. The problem is, that tool doesn’t yet exist.
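To see why, it helps to look at the basic property of encryption: ciphertext is unintelligible without the key, and with device or end-to-end encryption that key lives only with the user, not the company. The sketch below (a toy hash-based XOR stream in Python, used purely for illustration — it is not a real cipher and names nothing from any actual product) shows that anyone holding only the ciphertext, including the service provider, gets gibberish without the right key.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive an endless pseudo-random byte stream from the key (toy illustration only).
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; the same function encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

message = b"evidence relevant to the investigation"
ciphertext = xor_cipher(b"user-held key", message)

# With the user's key, the data comes back intelligible:
assert xor_cipher(b"user-held key", ciphertext) == message
# With any other key -- say, one the provider might try -- it stays gibberish:
assert xor_cipher(b"provider guess", ciphertext) != message
```

Real systems use vetted ciphers such as AES rather than this toy construction, but the asymmetry is the same: if the company never holds the key, “technical assistance” can only mean writing new software to weaken or bypass the scheme, not simply handing over data.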

Creating code to break your own encryption defeats the purpose of providing an encrypted service.

If a company can’t access the requested information due to encryption, the government will still expect “technical assistance,” meaning a company like Apple would be expected to build special code to get around its own encryption.

The end goal is for the company to “achieve the purpose of the court order,” regardless of the manpower and resources it takes. That can be quite difficult for small businesses that need to divert those resources elsewhere. But perhaps more important, creating new code to break your own encryption defeats the purpose of providing an encrypted service in the first place, because it gives the government easy access to your data.

The bill says companies don’t have to change their products

According to the bill, the government may ask a company to do whatever it takes to break its encryption and provide the requested data, but it isn’t asking the company to change its product or service.

“Nothing in this Act may be construed to authorize any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity,” the draft reads.

But that’s quite a confounding clause, because the entire point of this bill is to force tech companies to change their encrypted services by providing a key or backdoor — that in itself is forcing a company to change the design of its product or service.

If a company’s sole product is an encrypted service, like encrypted email, this bill renders its entire business model moot, because the company could no longer guarantee that no one is snooping on its customers’ emails.

The government will pay companies to decrypt services and devices

Apple argued that creating special software for the FBI to unlock the iPhone that was locked in the San Bernardino case would place a “burden” on its engineers. Smaller companies have also stated that they don’t have the money or resources to comply with government requests. The Feinstein-Burr bill wants to cover the costs incurred in aiding investigators by providing compensation for a company’s assistance.

“A covered entity that receives a court order … and furnishes technical assistance … shall be compensated for such costs as are reasonably necessary and which have been directly incurred in providing such technical assistance or such data in an intelligible format,” according to the draft.

The bill doesn’t specify an amount or set up guidelines as to how much compensation companies would receive. While monetary compensation can be enticing for some smaller businesses, companies risk losing the trust of their customers by handing over user data to the government.

What if a company refuses to comply?

The discussion draft also makes no mention of penalties should a company, like Apple in the San Bernardino case, refuse to assist law enforcement in accessing encrypted data. Here its language differs from the California anti-encryption bill, which was just struck down, and the New York bill, which would have placed $2,500 fines on companies that refused to cooperate.

“Penalties would be up to the courts. Judges have wide discretion on contempt penalties.”

Mentzer says that courts will determine the penalties.

“Penalties would be up to the courts,” he said. “Judges have wide discretion on contempt penalties.”

As such, the penalty a company would face for refusing to help the government would be decided on a case-by-case basis.

Strong backlash

During the Apple vs. FBI case, tech companies, private citizens, and legal experts alike banded together to fight the government on its demands that Apple create a backdoor and undermine its own encryption. It’s likely we’ll see that kind of backlash again, if the Feinstein-Burr encryption bill advances further. Ever since the leak last week, legal and tech experts have come out strongly against it.

“A good parallel to this would be holding a vehicle manufacturer responsible for a customer that drives into a crowd,” forensics expert Jonathan Zdziarski said last week in response to the leaked draft. “Only it’s much worse: The proposed legislation would allow the tire manufacturer, as well as the scientists who invented the tires, to be held liable as well.”

Ladar Levison, founder of secure-email service Lavabit, thinks Sen. Feinstein and the bill’s authors should know better than to try to ban encryption. Levison shut down his email service after he complied with the FBI’s court order and provided the private SSL keys to his email network. The FBI wanted access to spy on NSA whistleblower Edward Snowden, who was using the encrypted email service at the time.

“It’s been a while since Google turned their homepage black [in protest],” Levison told Digital Trends. “Perhaps Feinstein missed the color scheme and the bill she introduced is really just her secret plan designed to solicit its return.”

His comment refers to when Google, along with several major tech companies, staged a blackout and turned its logo black in protest of the Stop Online Piracy Act (SOPA) and the Protect IP Act (PIPA). Many believed SOPA and PIPA would have detrimental effects on freedom of speech and internet-related entrepreneurship.

The Feinstein-Burr anti-encryption bill would change the definition of an encrypted device to something that can be broken into when the government so desires. Companies would have no say in the matter, and they’d be faced with an impossible choice: Either face court-imposed penalties for refusing to comply, or lose consumer trust by breaking their promise of secure encryption.

More recently, an open letter penned by Reform Government Surveillance, the Computer & Communications Industry Association, the Internet Infrastructure Coalition (I2C), and the Entertainment Software Association was published online. Apple Insider reports that these groups represent companies like Apple, Amazon, Microsoft, and Google. An excerpt from the letter reads, “We write to express our deep concerns about well-intentioned but ultimately unworkable policies around encryption that would weaken the very defenses we need to protect us from people who want to cause economic and physical harm.”

The groups state that they design encryption to protect users from the government as well as from criminal elements. If companies are forced to adhere to this law and allow government access, they may also create vulnerabilities that can be exploited by those seeking to do harm. The groups acknowledge that vigilance must be maintained against crime and terrorism, but there must be a balance, and they are “ready and willing to engage in dialogue about how to strike that balance.”

Of course, we expect that tech companies will fight this bill aggressively in the coming months, and it’s important to note that the Obama administration has said it will likely not support any anti-encryption legislation. We’ll be following its progress closely.
