The terrorist attacks on Paris have reignited the debate over the use of encryption in everyday communications, pitting companies like Apple, Google and Facebook against efforts by law enforcement agencies to detect terrorist threats and track down possible attackers.

Reports from Paris say the terrorists used an encrypted messaging system to coordinate and plan Friday’s bloody assault. That’s likely to bolster calls from U.S. officials who want to require companies like Apple to create backdoor keys to their systems in order to help law enforcement agencies prevent such attacks.

Even so, just last month, the White House backed down from that idea after facing stark opposition from the tech industry as well as advice from cryptology experts. The public is still smarting from Edward Snowden’s leaked documents, which revealed the breadth of the government’s surveillance programs and prompted privacy advocates to push for curbs on state spying.

Companies like Apple and Google have sided with their consumers, which isn’t surprising given their commercial interests, but their stance is also a response to the public outcry in the wake of Snowden’s revelations.

Three weeks ago, Apple fought a government subpoena to extract data from an older iPhone 5 seized in a drug case, as it had done in the past. In the filing, Apple noted that it would be unable to respond to these sorts of subpoenas on 90 percent of the phones running more current versions of the operating system — which “prevents anyone without the device’s passcode from accessing the device’s encrypted data. This includes Apple,” the company said.

But beyond just the technical aspects, Apple CEO Tim Cook believes that allowing government access to people’s devices would have “dire consequences.”

“Any backdoor is a backdoor for everyone. Everybody wants to crack down on terrorists. Everybody wants to be secure,” Cook told the Daily Telegraph. “The question is how.”

Apple’s mobile rival, Google, has staked out a similar position. Eric Schmidt, now chairman of Google’s parent company, Alphabet, echoed Cook’s comments on backdoor access in a private conversation on policy last month. Back in February, Rachel Whetstone, then Google’s SVP of policy and communications, offered a lengthy treatment of the company’s position on encryption and data requests in the wake of the initial barbaric ISIS hostage situations.

Google, she said, supports a “principled yet practical approach” that balances both legal enforcement and consumer privacy. She said Google had scrubbed 14 million YouTube videos that broke policy, including those affiliated with terrorist groups. Accounts from those groups are “automatically” terminated, with their information handed to authorities, she added.

But the Google exec stressed that those requests go through the front door, via direct government requests, not through other access. “[W]e never let governments just help themselves to our users’ data,” Whetstone said. “No government — including the U.S. government — has backdoor access to Google or surveillance equipment on our networks.”

Google and Apple declined to comment on whether their policies will change in the wake of the Paris attacks. Facebook also did not respond to a request for comment. Last year, the company’s messaging service WhatsApp added end-to-end encryption — a scheme in which only the sender and the intended recipient can read the messages.

Apple changed its encryption policy in 2014 with the introduction of its iOS 8 mobile operating system for iPhones, iPads and iPod touch devices. The company began encrypting communications between its devices, so whenever a person uses iMessage or FaceTime, those messages are encrypted on the device in such a way that they can’t be accessed without a passcode — and Apple has no way to decrypt those messages. The decision immediately drew fire from FBI director James Comey, who said Apple’s decision would impair law enforcement’s efforts to combat major crimes, including terrorism.
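The idea of locking data behind a passcode can be sketched in a few lines of Python. This is a toy model only, not Apple’s actual design (iOS uses AES with keys entangled in dedicated hardware): here the passcode is stretched into a key with PBKDF2, and that key drives a simple SHA-256 keystream. Without the passcode, nobody — including the party that wrote the software — can reconstruct the key.

```python
import hashlib
import os

# Toy sketch of passcode-derived encryption. NOT Apple's scheme and
# NOT secure for real use; it only illustrates why the vendor cannot
# decrypt data when the key comes solely from the user's passcode.

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Stretch the passcode so brute-forcing it is expensive
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def keystream(key: bytes, length: int) -> bytes:
    # Expand the key into a pseudorandom byte stream
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passcode: str, plaintext: bytes):
    salt = os.urandom(16)
    ks = keystream(derive_key(passcode, salt), len(plaintext))
    return salt, bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(passcode: str, salt: bytes, ciphertext: bytes) -> bytes:
    ks = keystream(derive_key(passcode, salt), len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))
```

The salient design point: because the key is derived from the passcode on the device and never stored anywhere else, a subpoena to the vendor yields nothing useful.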

Dell CEO Michael Dell likewise opposes giving the government access.

“Our position on creating a backdoor inside our products so that the government can get in is that it’s a horrible idea,” Dell told the Telegraph in an interview just two days after the Paris attacks.

Dell, currently engaged in a complex acquisition of EMC, was responding to a piece of draft legislation in the U.K. that would require tech companies to aid in decrypting messages under a warrant. He added, “If you have a backdoor, it’s not just the people you want to get in that are going to get in, it’s also the people you don’t want to get in. All of the technical experts pretty much agree on this.”

But the latest attack could swing the debate the other way. Officials have warned that the inability to track and monitor terrorists’ communications has made intelligence gathering far more difficult. An NBC News investigation recently revealed how jihadists are using a 24-hour help desk to teach followers to encrypt messages in order to evade authorities.

“They’ve developed a series of different platforms in which they can train one another on digital security to avoid intelligence and law enforcement agencies for the explicit purpose of recruitment, propaganda and operational planning,” Aaron Brantly, a counterterrorism analyst at the Combating Terrorism Center, told NBC News.

U.S. legislators have already been sounding the alarm. “The dark space of the Internet is becoming a breeding ground for terrorist communications, recruitment and plotting,” Homeland Security Committee Chairman Michael McCaul told Politico. “Our inability to monitor encrypted messages on social media apps, and the terrorists’ awareness of that, compounds the danger America and the West face … terror plots and warning signs go under the radar when we can’t see their communications. You can’t stop what you cannot see.”

Yesterday, CIA director John Brennan criticized the government’s hesitation in seeking policies that would make it easier for his agency to access encrypted information that could help uncover terrorists. “I do think this is a time for particularly Europe, as well as here in the United States, for us to take a look and see whether or not there have been some inadvertent or intentional gaps that have been created in the ability of intelligence and security services to protect the people that they are asked to serve,” the CIA director said at a Washington conference.

“I do hope that this is going to be a wake-up call,” Brennan said.

There is a massive technical knot at the heart of this debate, one that in some ways offers the tech companies a way to avoid making a social or moral determination. Encryption uses math to scramble a message into indecipherable text that can be unlocked only with a “key,” a string of characters. In the public-key systems that underpin modern messaging, each user holds a matched pair of keys: a message encrypted with one can be decrypted only with the other. Tech companies like Apple have stayed away from owning or storing the private halves of those keys.
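The two-key idea can be illustrated with textbook RSA. The numbers below are deliberately tiny and the scheme is insecure as written; it is only a sketch of how a matched key pair works, not how any company’s products actually implement encryption.

```python
# Textbook RSA with deliberately tiny numbers -- a sketch of the
# two-key idea only, NOT a secure implementation.

p, q = 61, 53            # two secret primes
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # Euler's totient (3120)
e = 17                   # public exponent, chosen coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

def encrypt(m: int) -> int:
    # Anyone who knows the public key (e, n) can encrypt
    return pow(m, e, n)

def decrypt(c: int) -> int:
    # Only the holder of the private key d can decrypt
    return pow(c, d, n)

message = 42
cipher = encrypt(message)
assert cipher != message
assert decrypt(cipher) == message
```

Real systems layer symmetric session keys on top of a pair like this, but the point is simply that anyone without the private half of the pair, including the service provider, cannot read the message.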

In broad strokes, tech companies see their first responsibility as protecting the information of their users, whether that is text messages on a phone or video calls around the world. With consumers everywhere under attack from hackers who want to steal their information, encrypting the lines of communication keeps the bad guys at bay.

“Do we want our nation to be secure? Of course. No one should have to decide between privacy or security,” Cook said last month at the WSJD Live conference in Laguna Beach, Calif. “We should be smart enough to do both. Both of these things are essentially part of the Constitution. They didn’t say prioritize this one above all of these.”

Additional reporting by Dawn Chmielewski.