The Compliance with Court Orders Act of 2016, authored by Sens. Richard Burr, R-N.C., and Dianne Feinstein, D-Calif., requires companies to shoulder the technical burden of decrypting emails or files when investigators present court orders. It doesn't specify penalties for noncompliance.

When CCOA hit the internet this week, lots of techies, privacy advocates, reporters and security researchers saw red over what they described as legislation that makes encryption illegal or requires backdoors.

Not so fast. The senators might be clueless about security, but they saw those arguments coming from a mile away.

In reality, the Senate committee's Court Orders Act won't outlaw encryption. Nor does it mandate golden keys or backdoors in products -- it's very careful to avoid requiring or prohibiting any specific design or operating system.

No, this slippery little act says that when a company or person gets a court order asking for encrypted emails or files to be handed over and decrypted, compliance is the law.

How compliance actually happens isn't specified. The senators don't care how user security gets broken (or whether it was nonexistent to begin with), and they're making it clear that from now on, this isn't their problem.

The thing is, that pesky encryption the Senate sees as impeding court orders is the same technology that only unlocks iPhones for their owners, that keeps email truly private, and that could have protected the 80 million sensitive customer and employee records stolen when health insurer Anthem's database was breached.

We're talking about encryption in computer security. You either have it completely, or you don't. On some things, the room for passive-aggressive political maneuvering is effectively zero.
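To see why "decrypt it when ordered" is incompatible with encryption done right, consider a toy one-time pad in Python. This is an illustrative sketch only (the `xor_otp` helper is hypothetical, not any company's actual scheme), but the principle holds for real ciphers: when only the user holds the key, the provider has nothing meaningful to hand over.

```python
import secrets

def xor_otp(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte (one-time pad)."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # random key, held only by the user

ciphertext = xor_otp(message, key)
recovered = xor_otp(ciphertext, key)      # with the key, decryption is trivial

assert recovered == message
# Without the key, every plaintext of the same length is equally consistent
# with the ciphertext -- a provider ordered to decrypt it simply can't.
```

The only way a provider could comply with a decryption order here is to keep a copy of the key (or never encrypt properly in the first place), which is exactly the weakening the bill pretends not to mandate.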

The Court Orders Act of 2016 doesn't do anything as obvious as tell us what kinds of communications or stored files will need to be decoded. But it doesn't stop there. CCOA can also force entities to produce decrypted identifying information, device information, and any data stored remotely or on a device.

Who are these entities? According to the document, that would be device and software manufacturers, electronic communications companies, remote computing services... you get the idea. It will also require companies to provide decrypted communications if the encryption is provided by a third party. Let that sink in for a minute.

This act's angle is a clever way to leapfrog arguments about making encryption illegal and demanding backdoors. It's also what makes this legislation even more destructive to security. As it stands now, getting companies and startups to encrypt and protect user security has been an uphill battle. If this bill passes, all the hard work done to raise awareness and establish practices around encrypted communications will be lost.

As everyone knows, security only works when everyone's doing it.

This bill would ensure that the links in security chains would grow weaker or nonexistent because no one's going to want to deal with the fallout of a court order.

It appears targeted at Apple, Google and services like WhatsApp. And guess who would happily be in compliance without doing a damn thing? All the companies that don't take security seriously enough to encrypt records and communications (of which there are far too many). That's right, the companies you shouldn't trust your data with will officially have a reason not to protect you with encryption.

Some pundits are saying it's too crazy to be useful. Wired believes that it's so bad for privacy that it's actually good and that it's unlikely the bill will become law. Sen. Ron Wyden, D-Ore., has threatened a filibuster if it reaches the Senate floor. All of this might make you think it can't happen here.

I disagree. The White House has declined to support or oppose it — though it did review the text and provide feedback — and President Obama recently admonished those who oppose court-ordered access. Given the act's nuanced approach, plus the fact that most people just don't see how this affects them, I'm pretty sure this thing isn't as preposterous as everyone thinks.

Encryption legislation has become a priority after years of argument and impasse; at this point, some kind of lawmaking is inevitable. If this doesn't pass, then a mutated clone of it surely will. So the shock isn't so much the act itself as the abysmal way it's been handled.

Early this year, Sen. Mark Warner, D-Va., and House Homeland Security Committee Chairman Michael McCaul proposed the creation of a national encryption commission. It was to study the issue with tech industry leaders, privacy advocates, academics, law enforcement officials and members of the intelligence community -- to prepare for crafting encryption legislation. Sens. Burr and Feinstein decided in late January to skip that altogether, saying it was too slow; they told The Hill that "Congress has to move fast." Bizarrely, Feinstein added, "If the internet goes totally dark, and there are apps that people can use to communicate to plot, to plan, to threaten, to do all of that, you've got a real problem."

Warner and McCaul's encryption commission bill was introduced to Congress last month. And McCaul is pushing for an Energy and Commerce encryption hearing next Tuesday, one that will even feature the widely respected crypto and computer security researcher Matt Blaze.

But we can be sure that the people who wrote and back CCOA, people who consider baseline security measures like encryption to be someone else's problem, will regard crypto hearings with hackers as the kids' table at Thanksgiving.

I mean, it's great that the senators can pretend they're our parents who know better, and waste time and money in all these variously unproductive ways. But the rest of us, who are getting our data stolen every other week, desperately need the security that encryption provides.