In the spring of 2016, I was told that the Democratic National Committee had been hacked, probably by Russians. Immediately, I was concerned that the campaign I managed, Hillary for America, had been hacked too. We wouldn’t know for months whether it had (to the best of our current knowledge, it never was, although private accounts of campaign staff and advisers were). In the days afterwards, we needed a way to have conversations that would be guaranteed not to leak – including ones relating to the hack itself. When the stolen information was exploited to generate news coverage or concoct “fake news” – such as that Democratic operatives were running a sex ring out of a pizza parlour – we learned some hard lessons in why privacy really matters. I worry the current rhetoric around encryption is ignoring those lessons.

The deputy attorney general, Rod Rosenstein, has called for “responsible encryption” that would allow officials to unlock encrypted data with a warrant. Christopher Wray, the director of the FBI, recently said that lack of access to encrypted smartphones was a “major public safety issue”. In the UK, the home secretary, Amber Rudd, has repeatedly said that encryption is a “problem”. And on the face of it, having more tools for law enforcement makes sense.

But the devil is in the detail. There’s no question that terrorists are using encrypted communications to plan violence, precisely because those messages can’t be “tapped” the way their phone conversations can. There’s also no question that the companies producing the hardware and code that enable us to create, encrypt and store data must work with law enforcement to stop malicious actors. But for those companies to simply create a backdoor – a special set of keys that would allow law enforcement to “unlock” the encrypted communications of suspected criminals – would be wrong.

Privacy advocates might be tempted to co-opt an old slogan from the National Rifle Association: encryption doesn’t kill people, people do, therefore we shouldn’t target tech. But this is a poor argument against sensible gun laws, and it also fails to address the real danger of a backdoor policy. The average American doesn’t need a gun to go about their daily life, but they do need encryption – and a backdoor will make it weaker for all of us, because almost every aspect of our lives – our finances, our identities, our conversations with loved ones – is stored online. The stakes of theft are high for everyone, every day. Just as we need locks on our cars and our doors, vaults in our banks and security personnel in our buildings, we need encryption to protect our data.

Real encryption protocols work because literally no one can “open” the data except the person sending or receiving it, each of whom has special keys. Advocates of a backdoor say that’s all well and good, but law enforcement needs a key too.

But consider this: last summer, a yet-to-be-identified group calling itself the Shadow Brokers gained access to some of the NSA’s most highly classified and potent hacking tools. Unlike some privacy advocates, I believe the NSA was entitled to these tools, but they were “hacks” into particular types of hardware and software, not a set of keys into every encryption protocol out there. Imagine if they had been. And then ask if it’s realistic to assume that anyone could keep such keys secure, particularly given the track record of nation states compromising the highest levels of corporate and government security.

By creating a vulnerability, you’re creating an opportunity for adversaries to break in. As China becomes a world leader in machine learning and artificial intelligence, why would it not train machines to find the loophole in the protocol? If it knows that loophole exists, it can search for it – as could corporations or state-run companies interested in industrial espionage.

These backdoors would also have far-reaching consequences for human rights. If the UK or US governments are entitled to open this backdoor to stop dangerous criminals, why not the People’s Republic of China, or the Russian Federation, which could also conveniently monitor dissidents? Moreover, if the US and UK no longer have reliable encryption protocols, global customers will simply go elsewhere to get the solutions they need. So will terrorists: the black market could easily develop protocols of its own.

Two things need to happen. First, politicians have to get smart on this issue, fast. They don’t all need to become cryptologists, but they do need to listen to people who are – and they are warning us that what some officials are calling for is unworkable. The digital revolution has made so much of our lives so easily accessible that the old paradigms for surveillance don’t work any more. When law enforcement had the ability to listen to phone conversations with a warrant, there wasn’t a legitimate concern that criminals or the People’s Liberation Army could listen in too – or that our health records could be nabbed over the line as well. Lawmakers need to seek new options.

Second, we need to focus on ideas that are practical. With a warrant, it should certainly be possible to gain information about the size or frequency of data exchanged between accounts, the location of active parties, and devices used. There’s a lot that companies can share right now that won’t put the rest of us at risk.

What law enforcement needs is information. A backdoor to encryption is simply one means to that end, but one with potentially dangerous outcomes. Let’s demand better, so the people who keep us safe can do their jobs, and ordinary citizens can keep their data secure.

• Robby Mook was Hillary Clinton’s presidential campaign manager