STANFORD, CALIF. -- It has been a year since Stanford University launched an effort to better understand the legal tug of war that exists around electronics, the data they contain and the government’s right to access it. The undertaking is called the Crypto Policy Project.

In recent years, the larger encryption debate has played out in court and in the popular media. When Apple fought the FBI over the agency’s demand for a backdoor into an iPhone following the 2015 mass shooting in San Bernardino, a conversation that had long simmered on the back burner boiled over to the forefront of the privacy vs. security discourse.

How can companies protect privacy without impeding law enforcement’s ability to investigate crimes?

The idea that the debate is a new one is wholly false. Today’s push and pull is just another iteration of the surveillance and encryption debate we saw in the 1970s, and again in the 1990s.

For law enforcement, encryption in everyday devices poses a very real investigative barrier, one that must be balanced against the fear of unfettered government access to any and every device.

In the case of the San Bernardino shooter’s phone, authorities argued that encryption was a thorn in the side of the investigation. After publicly sparring with technology companies, investigators ultimately worked around the encrypted device by paying a hacker more than $1 million to break in.

Jennifer Granick, director of civil liberties at the Stanford Center for Internet and Society, explained that balancing the needs of law enforcement against the public’s rights under the Constitution is not as simple as it might seem. She calls this policy battle the third “crypto war.”

“Particularly after the Snowden revelations, there was a really concerted effort to encrypt more information to keep it secure …,” she said. “But this increased push for encryption has created a policy conflict, and the policy conflict is between the knowledge that encryption can secure information with the knowledge that encryption can interfere with legitimate law enforcement investigations.”

The discussion in the 1970s centered on the key length of encryption methods, and in the 1990s, it was about whether personal devices could, or should, have encryption built into them. After much back and forth, experts settled in favor of stronger encryption methods in both cases.

The general reasoning was that a lack of strong cryptographic protections would, in effect, create a laundry list of other security problems. Though protecting information seems straightforward to many living in the digital age, Granick said it is not without far-reaching policy implications.

“Here in 2016, we are facing the third crypto war, which is this policy battle over what are we going to do about encryption,” she said. “The policy battle has initially focused on legislative backdoors, where it would be unlawful to deploy crypto systems where there is no wiretap ability, the provider can’t provide plain text or the government doesn’t have the ability to decrypt.”

As we heard argued in the Apple vs. FBI dispute, backdoors bring concerns of their own: What happens when a foreign government uses the same mechanism to track the movements of journalists or civil rights leaders? And what damage would be done to public trust when people learned of the deception?

“What the experts said, pretty uniformly, is that backdoors pose security risks and that these security risks can be pretty serious,” Granick said. “This is something that might not be obvious to a magistrate judge, sitting on the bench, deciding these cases.”

From the legal perspective, Riana Pfefferkorn, a cryptography fellow at Stanford, explained that efforts to gauge court transparency around encrypted devices have faced barriers of their own. Though judges must review and approve law enforcement’s requests to search digital devices, active cases are not part of the public record, and sealed cases may never see the light of day.

Pfefferkorn said several Freedom of Information Act requests have been filed not only to learn more about “what the government is up to,” but also to learn the context of the cases and how law enforcement agencies go about retrieving data. Despite significant work to obtain this information, she said, the process has been slow.

The center is also litigating in San Francisco federal court to open sealed records. One area researchers are particularly interested in is whether companies are being compelled to work around their own cryptography, provide encryption keys or some other technical assistance to investigators. If so, how often is this the case?

Because most of the incidents of interest to the Stanford team happen behind closed doors, are kept secret to protect investigations and are ultimately sealed, little is known about the scope of these requests and how third-party companies are cooperating, or not.

“We’re trying to get any records where there isn’t still some need for secrecy. We’re trying to exclude ongoing investigations,” she said.

Both Granick and Pfefferkorn see the Internet of Things as an area of growing interest to law enforcement. As it grows to encompass new devices, agencies will no doubt try to leverage every tool at their disposal, leaving courts to chart ever more unfamiliar territory.

Pfefferkorn pointed out that a device like a digital assistant or a connected thermostat could theoretically be used to gather information about the target of an investigation.

“So with everything from refrigerators to toothbrushes and feminine hygiene products getting the Internet of Things treatment, you can imagine the possibilities,” she cautioned, “but we shouldn’t have to just imagine what is happening. We don’t really know; we don’t know a lot of the facts about what the courts have been doing…”