Welcome to Mossberg, a weekly commentary and reviews column on The Verge and Recode by veteran tech journalist Walt Mossberg, now an Executive Editor at The Verge and Editor at Large of Recode.

Protecting the security of the United States and of Americans abroad is no easy task, especially against terrorists. I got a lesson in this before I became a tech columnist, when I served stretches as the chief Pentagon correspondent and the National Security correspondent for The Wall Street Journal, including coverage of the intelligence agencies.

So, I’m somewhat sympathetic with the frustrations expressed over the past year or so by national security officials — especially FBI director James Comey — over fears that encryption of digital devices and services is making it harder for their agencies to spot and stop terrorists in the digital age.

A backdoor would be a huge mistake

I understand their exasperation, but not their proposed solution: forcing American companies, notably Apple and Google, to build "backdoors" into their encrypted smartphones that would allow the government access. This would be a huge change, because both companies have introduced whole-device encryption that even they can’t decrypt. It would also be a huge mistake.

Over the past year or so, Mr. Comey and his colleagues have complained that this encryption of smartphones by Apple and Google is causing a problem they call "going dark" — making it harder for them to conduct surveillance of smartphones, messaging services, and more.

The problem is that, even if the FBI served the companies with a legal, court-approved search warrant for particular encrypted phones, the companies couldn’t comply. The lawmen would have to serve the warrant on the phones’ owners, and try to force them to unlock the devices with a password, fingerprint, or some other authentication method.

Mr. Comey does pay some lip service to the values of strong encryption. In Senate testimony this past July, he acknowledged that "it is important for our global economy and our national security to have strong encryption standards. The development and robust adoption of strong encryption is a key tool to secure commerce and trade, safeguard private information, promote free expression and association, and strengthen cyber security."

But he also praised a 1994 law called CALEA that required "telecommunications carriers" to build into their systems methods to assist court-approved government surveillance. He lamented, however, that the law was out of date, because it "does not cover popular internet-based communications services such as email, internet messaging, social networking sites, or peer-to-peer services."

Wisely, after a vigorous national discussion, the White House declined a few months ago to push for a bill that would mandate a government "backdoor" that would poke a hole in a security system that gives power and protection to smartphone users.

But now, following the horrific terror attack in Paris, the issue is showing signs of coming back to life.

There is no such thing as a backdoor that only the US government can access

It’s a bad idea for a variety of reasons. But the biggest one is this: the tech companies and independent experts broadly agree that there’s no such thing as a "backdoor" in an encrypted device whose use could be limited to duly authorized US agencies. Once an encryption system is breached, a cascade of other actors, from malevolent hackers to foreign dictatorships like China and Russia, will waltz through that backdoor, either by hacking or by enacting laws requiring that US companies provide them the same access provided to American agencies.

Apple and Google took pains to assure users they have no backdoors for the government, even before the current debate heated up.

Apple CEO Tim Cook posted a statement on a special privacy section of Apple’s web site, saying in part: "I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will."

In October, he told a tech conference that "I don’t know a way to protect people without encrypting" and "you can’t have a backdoor that’s only for the good guys."

Earlier in the year, speaking at a security conference in Germany, former Google executive Rachel Whetstone made a similar point. She said: "No government — including the US government — has backdoor access to Google or surveillance equipment on our networks. Let me repeat that. The United States Government does not have backdoor access to Google."

It’s fair to note that, in addition to protecting their customers, Apple and Google get business benefits from strong and secure encryption. They gain the ability to remove themselves from delicate law enforcement actions. And they gain protection against charges overseas that buying their products will give the US government access to foreign users’ data.

"Security risks, engineering costs, and collateral damage."

They also have plenty of support for their views from people with no such business interests.

In July, in a lengthy technical post, 15 MIT experts concluded that "current law enforcement demands for exceptional access would likely entail very substantial security risks, engineering costs, and collateral damage."

Also that month, in The Washington Post, former NSA director Mike McConnell joined former Homeland Security secretary Michael Chertoff and former deputy defense secretary William Lynn to side against breaching encryption.

"We recognize the importance our officials attach to being able to decrypt a coded communication under a warrant or similar legal authority," they wrote. "But the issue that has not been addressed is the competing priorities that support the companies’ resistance to building in a backdoor or duplicated key for decryption. We believe that the greater public good is a secure communications infrastructure protected by ubiquitous encryption at the device, server and enterprise level without building in means for government monitoring."

The trio noted that a similar debate about encryption arose in the 1990s when law enforcement agencies pushed for a sort of escrowed decryption key called the "Clipper Chip." The effort failed, and the former security leaders noted, "the sky did not fall, and we did not go dark and deaf. Law enforcement and intelligence officials simply had to face a new future. As witnesses to that new future, we can attest that our security agencies were able to protect national security interests to an even greater extent in the ’90s and into the new century."

There are other reasons to oppose a backdoor. For one thing, the legal authority often employed by the security agencies is a special court, called FISC, which is so secret its deliberations are sealed. So the legal process for breaching encryption would hardly resemble what you see on TV police shows when the cops have to get a judge to issue a search warrant.

For another, Mr. Comey’s complaints are overblown. Even without a backdoor, there are still many avenues that authorities can use to track terrorists.

Most phones are still unencrypted. Full-device encryption that is beyond the reach of Apple and Google is used on only some smartphones, not all, and it works only on the last two versions of each company’s operating system.

The government has plenty of spying tools without an encryption backdoor

Even then, its use is limited. In the case of the iPhone, a device only gets encrypted if the user sets up a password and/or fingerprint ID. In the case of phones using Google’s Android platform, it’s only on by default in the costliest, most powerful handsets, because encryption causes the performance of wimpier Android phones to degrade too much. In fact, Google could only confirm for sure that encryption is on by default in its low-volume Nexus line of phones from last year and this year.

But if we finally arrive in a future where that's not the case, the government still has traditional wiretaps, and whatever secret technologies the NSA may be brewing up.

Furthermore, while it’s still early, there’s no evidence yet that the Paris murderers used Apple and Google encryption, or any encryption, in carrying out their attacks. Early reports say that the French authorities were able to pinpoint the terrorists’ base in Saint-Denis by searching an unencrypted cell phone discarded in a trash can and GPS units on cars the killers rented.

And, even if terrorists do use encryption, there are many encrypted communications services that can run on almost any phone, and which have nothing to do with Apple or Google.

In fact, according to The Wall Street Journal, an ISIS technical advice bulletin issued in January listed the messaging services of Apple, Google, and Facebook as only "moderately safe," and ranked them behind nine much less well-known services. One of these, called Telegram, has been reported to have become the ISIS messaging service of choice.

I sincerely hope that the US government, working with tech companies, can come up with some solution that helps catch terrorists and criminals who use smartphones and messaging services to disguise their plans and identities. I wish I could say what that might be. But I do know that it shouldn’t be one that weakens or destroys user-controlled smartphone encryption.