The two US lawmakers behind legislation requiring the tech sector to build backdoors in encrypted products are playing the terrorism card. In an editorial Thursday in the Wall Street Journal, Sen. Richard Burr (R-N.C.) and Sen. Dianne Feinstein (D-Calif.) stoke fears that our personal safety is tied to their proposed legislation.

The pair cite what they call an "Islamic State-inspired attack last year in Garland, Texas" and the non-terror-related murder of Brittney Mills, a pregnant Louisiana woman.

"These are two of the many cases where law enforcement is unable to fully investigate terrorism or criminal activities. In fact, today the FBI is unable to gain access to data on many of the mobile devices they obtain that are password protected," the lawmakers write.

What would their proposed law require? The lawmakers don't use the term "backdoor," but they want the tech sector to be able to hand over to the authorities, under a court order, the decrypted contents of any of their products:

The draft proposal requires a person or a company—when served with a court order—to provide law enforcement with information (in readable form) or appropriate technical assistance that is responsive to the judicial request. This will enable law enforcement to conduct investigations using the communications involved in criminal and terrorist activities. Our draft bill wouldn’t impose a one-size-fits-all solution on all covered entities, which include device manufacturers, software developers and electronic-communications services. The proposal doesn’t define the technological solutions or tell businesses how to solve the problem. It provides compensation for reasonable costs that businesses may incur when complying with a court order. We want to provide businesses with full discretion to decide how best to design and build systems that maintain data security while at the same time complying with court orders.

The upshot is that nobody would be allowed to build a product that locks away data beyond the reach of the product's own maker. Put another way, nobody would have the right to absolute data privacy.

Burr and Feinstein concede this but say it's in the interest of fighting criminality and terrorism:

We are not asking companies to provide law enforcement with unfettered access to encrypted data. We aren’t even asking companies to tell the government how they gain access to this encrypted data. All we are doing is asking companies to find a way to keep their data secure while also cooperating with law enforcement in terrorism and criminal investigations.

To a degree, all of this echoes the Clinton-era Clipper Chip fiasco. Today's push is for the Clipper Chip's modern-day equivalent: a "golden key" of sorts, as FBI Director James Comey has described it.

The two lawmakers say it's a no-brainer to buy into this program because the only way a company could be forced to cough up its customers' data is with a court order. But let's assume, for argument's sake, that the court order comes from the secret court established under the Foreign Intelligence Surveillance Act (FISA). National Security Agency whistleblower Edward Snowden revealed that the government was using a FISA Court order to acquire the metadata of every call made to and from the United States—all in the name of the war on terror. That 2013 revelation in the Guardian is the reason the tech sector is now beefing up its encryption and is seemingly at odds with government moves for backdoors.

We're not sure Congress has the votes to approve this proposal and send it to the president, and President Obama's support for the measure appears lukewarm at best. But you never know what will happen when the terror card is in play, especially as election season heats up.