An Ethiopian journalist living in the U.S. was spied on by his own government. A pro-democracy activist in Dubai was beaten repeatedly by thugs after his computer was infected with surveillance software. An American who criticized the Turkish government was monitored by officials there.

They are among thousands of people whose computers and mobile devices were infected with “surveillance software” made and sold to governments and law enforcement agencies by the recently hacked Hacking Team. Such targeted surveillance can lead to beatings, imprisonment, torture, and even death. It can also bring harm to the victim’s family and friends, and to anyone who communicated with the victim online or over the phone.

WIRED Opinion: Katie Moussouris is the Chief Policy Officer for HackerOne, a platform provider for coordinated vulnerability response and structured bounty programs. She is a visiting scholar with the MIT Sloan School and a New America Foundation fellow. Katie is an ex-hacker, ex-Linux developer, and persistent disruptor. Follow her on Twitter.

Many people point to Hacking Team’s customer list, which includes Sudan and other oppressive regimes, to argue that more regulation is needed to stem the proliferation and accumulation of digital weapons. Meanwhile, security experts warn that overzealous laws will stifle the vital security research that aids defense. Many also fear these regulations will put legitimate tech companies out of business through excessive license-application burdens and delays that hinder their ability to sell security products and compete globally.

Can this type of specialized intrusion technology be reasonably controlled in terms of who has access to it? Can international agreements on export controls that were created to limit landmines and nuclear bombs be applied successfully to digital warfare? Would these regulations really be able to curb human rights abuses?

Overly Broad Regulations Are Worse than No Regulations, So Let’s Fix Them

Enter the proposed U.S. Bureau of Industry and Security (BIS) enforcement of an international arms agreement called the Wassenaar Arrangement. The voluntary agreement among the 41 participating countries calls for regulating the knowledge of how to create “intrusion software,” which is defined as “software that is capable of extracting or modifying data or modifying the standard execution path of software in order to allow the execution of externally provided instructions.”
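To see how sweeping that definition is, consider a harmless illustration (the names and code here are mine, invented for the example, not anything from the rule or from Hacking Team): runtime function patching of the kind that debuggers, test-mocking libraries, and live-update tools perform every day arguably “modifies the standard execution path of software in order to allow the execution of externally provided instructions,” and so fits the definition read literally.

```python
# Hypothetical sketch: benign runtime patching that nonetheless matches the
# proposed definition of "intrusion software" when read literally.

def hot_patch(cls, name, replacement):
    """Swap one method for another at runtime -- the same mechanism
    test-mocking libraries and live-update tools rely on."""
    original = getattr(cls, name)
    setattr(cls, name, replacement)
    return original  # so the caller can restore normal behavior later

class App:
    def status(self):
        return "ok"

app = App()
# The replacement is an "externally provided instruction" as far as App is
# concerned: its standard execution path has just been modified.
original = hot_patch(App, "status", lambda self: "patched")
print(app.status())  # -> patched

# Restoring the original shows nothing malicious happened.
setattr(App, "status", original)
print(app.status())  # -> ok
```

Nothing here extracts data or harms anyone, yet a license examiner applying the definition verbatim could plausibly capture it, which is exactly the over-breadth researchers are objecting to.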


While regulating such defined technology has some lawmakers and human rights advocates cheering, the definition has struck a panic chord in security researchers, security software and testing companies, and even large software vendors.

And that is where you come in. BIS is currently accepting comments from the public about its proposed regulation. But as I will detail below (and as WIRED has explained), the proposal’s phrasing is problematic: the regulations will hinder the defense of the Internet, and therefore affect everyone. So it is up to us to provide comments to BIS, as requested, before July 20, 2015.

Getting high quality written comments on the public record will help regulators understand what they overlooked and enable them to make changes.

Here are some tips for writing constructive comments:

- Give examples of what technology is caught by these rules and what the impact will be.
- Explain in detail the burden to organizations and individuals who will have to apply for export licenses under the new rule.
- Show how the new rule won’t achieve the stated goal of protecting human rights, but instead will hinder defense of the Internet.

In addition to any specific concerns listed in written comments, the best bet for preserving security research and defense of the Internet is to request that BIS create at least one more draft of this regulation and hold another comment period.

Before I go into the details that will help you write constructive comments, let me answer the basic question: Why is this so important?

In the Digital Age, Warfare is Asymmetrical

The digital age is the end of the era of conventional military superpowers, when whoever had the most or the biggest guns won. The concepts that made sense in physical warfare no longer apply in this new online age.

Now, anyone can build a weapon. Anyone can raise an army. Anyone can be a general, a spy, a criminal, a victim. The present and future of war are straddling the realities of cyberspace and meatspace, and the rules are vastly different now. In fact, it is clear that we are making up the rules as we go.

We must adapt to our future, and support the innovations that built the Internet, not stifle them by passing laws of noble intention but profoundly flawed implementation. One possible way forward, suggested by Professor Sergey Bratus of Dartmouth, is to focus regulations on the act of exfiltrating data, rather than on the current wording, which is overly broad and sweeps many defense-related technologies into its dragnet definition of "intrusion software".

One thing is constant: Those who wish to create tools and use or distribute them to cause harm will continue to do so with the impunity that was revealed in the internal communications of the hacked Hacking Team. No regulation will stop them. It is our job to collectively ensure that no regulation stops defenders.

Exploits Can Be Used for Good or Evil

The devil’s in the details.

There’s a conundrum when it comes to exploits and other potentially malicious software. For human rights advocates, software like Hacking Team’s DaVinci, which bypasses security protections, hides from anti-virus and other malware detection tools, and spies on the victim, represents a threat to human life when used by repressive regimes. But for security researchers, the same offensive techniques developed to bypass existing computer security measures are used in research to highlight weaknesses so that the vulnerable software can be fixed.

These techniques simply can’t be logically separated from the exploit techniques used by criminals and nation-states in spyware tools. In other words, these technologies are dual-use: they aid defenders who are testing their own security, and they serve attackers who are up to no good.
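A minimal sketch of why that separation is impossible, with a toy parser and names that are entirely hypothetical (not drawn from any real tool): the byte-flipping fuzzer a defender runs to find a parsing bug before release is, line for line, the code an attacker would run to find the same bug to exploit.

```python
# Hypothetical dual-use sketch: the identical fuzzer serves both defense
# (find the bug, fix it) and offense (find the bug, exploit it).
import random

def parse_record(data: bytes) -> int:
    """Toy parser: the first byte is a length field for the payload that follows."""
    if not data:
        raise ValueError("empty input")
    length = data[0]
    if length > len(data) - 1:
        raise ValueError("length field exceeds payload")  # the check a buggy parser might omit
    return length

def flip_fuzz(seed: bytes, trials: int = 200) -> int:
    """Mutate a valid input one bit at a time; count how often the parser rejects it."""
    random.seed(1)  # seeded so the sketch is reproducible
    rejections = 0
    for _ in range(trials):
        mutated = bytearray(seed)
        i = random.randrange(len(mutated))
        mutated[i] ^= 1 << random.randrange(8)  # flip a single random bit
        try:
            parse_record(bytes(mutated))
        except ValueError:
            rejections += 1
    return rejections

print(flip_fuzz(b"\x03abc"))
```

Whether the rejections it uncovers become a patch or a weapon depends entirely on who runs it, which is the property no definition of "intrusion software" can capture.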

Hindering defense is not the goal of the regulations, yet as written they stand to weaken Internet defense overall, in ways that will ultimately lead to more victims.

What Else Could Possibly Go Wrong?

The Electronic Frontier Foundation pointed out some important ways that the proposed US regulation goes even further than the original Wassenaar Arrangement, and even past what the EU and UK implementations have included. “The controls BIS is proposing aren't required by Wassenaar, nor are they included in other Wassenaar implementations.”

Noted security researcher Halvar Flake has articulated how security research across international borders would be stifled. This "balkanisation" of security researchers by country (those covered by Wassenaar and those where it is not applied) will slow fundamental advances in computer security, hindering breakthroughs in defense. In addition, some researchers fear their work will be commandeered by their own government for use in surveillance if they must turn it over to request an export license before broader distribution.

The End of Vulnerability Disclosure

An important overlooked and misunderstood part of the Wassenaar debate is that vulnerability disclosure itself—with or without cash bug bounty payments—is threatened by the new rules, despite the stated intent of the authors of the regulations to leave vulnerability research and disclosure untouched.


As BIS states in its FAQ:

“4. Will the rule control vulnerability research as well as research on exploits?

...the proposed rule would control the following, among other things:

Information "required for" developing, testing, refining, and evaluating "intrusion software", for example, technical data to create a controllable exploit that can reliably and predictably defeat protective countermeasures and extract information.”

The FAQ further states that public disclosure provides an exemption: “…export controls do not apply to any technology or software that is "published" or otherwise made publicly available.”

But not all parts of vulnerability information sent to a vendor are necessarily publicly disclosed.

Vital pieces of technology, such as a brand-new exploitation technique used in the proof-of-concept code delivered to a vendor as part of a vulnerability disclosure, would be subject to export control if the technique were to remain private.
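To make that distinction concrete with an entirely hypothetical sketch (the parser, the flaw, and every name below are invented for illustration): a public advisory might say only that a parser mishandles oversized length fields, while the proof-of-concept that shows *how* to trigger the flaw reliably stays private between researcher and vendor, and is precisely the material the proposed rule would capture.

```python
# Hypothetical sketch of a disclosure's private half. In a memory-unsafe
# language this length confusion would be a buffer over-read; the Python
# stand-in just demonstrates the inconsistent acceptance.

def parse_header(data: bytes) -> bytes:
    """Toy vulnerable parser: trusts the length byte without validating it."""
    length = data[0]
    return data[1:1 + length]  # silently truncates instead of rejecting

def poc() -> bool:
    """Private proof-of-concept: a 255-byte length claim on a 3-byte payload."""
    crafted = b"\xffabc"
    result = parse_header(crafted)
    # The parser accepted an impossible length claim instead of erroring out.
    return len(result) != crafted[0]

print(poc())  # -> True: the flaw reproduces
```

Publishing "the length field is not validated" is disclosure; the crafted input and the technique for turning it into reliable exploitation are the unpublished pieces that would need a license.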

One such exploitation technique was discovered and reported to Microsoft, winning the very first $100,000 mitigation bypass bounty and helping improve the defense of future versions of Microsoft software. Two years ago, I wrote in a blog post that “learning about new mitigation bypass techniques helps us develop defenses against entire classes of attack. This knowledge helps us make individual vulnerabilities less useful when attackers try to use them.”

For Whom Does the Burden Toll?

This confusion and concern around what does and doesn’t fall under export control won’t just affect security researchers. Vendors receiving vulnerability reports will have to apply for “deemed export licenses” themselves wherever they employ foreign nationals who may come into contact with export-controlled technology.

This burdensome license application process would have to happen even if the researcher was in the U.S. disclosing to an American vendor who employs foreign nationals. Also taken from the BIS FAQ: “There is no license exception for intra-company transfers or internal use by a company headquartered in the United States under the proposed rule.”

Granting an exception for “intra-company transfers” wouldn’t solve the problem in cases where one vendor needs to work with other companies to address a security issue. Essentially, the proposed rule would break vendors’ fundamental ability to defend themselves, leaving everyone more vulnerable to the kinds of attacks the regulation was designed to prevent.

'He Who Controls the Spice Controls the Universe'

In the Frank Herbert novel Dune, trade control of the vital substance known as “Spice” is the backdrop of political and power dynamics throughout the universe. While digital weapons may not have the same allure as the mind-altering Spice, they are no doubt as valuable, or more so, to people who have sensitive data and corporate and government secrets to protect.

When the trade of knowledge on how to build digital weapons, now defined explicitly via the Wassenaar Arrangement and its implementations, is regulated too broadly, defense loses out.


Hacking Team and companies like it will continue to do business, negotiating with their local governments for export licenses or, as the now-public company email archives show, selling via resellers. Victims will continue to be electronically surveilled as a result.

Individual security researchers, small security companies, and even large vendors who employ foreign nationals—all on the defense side—are the parties who will suffer under these regulatory burdens. With them, the entire Internet ecosystem and everyone who uses technology will suffer the chilling effect on research and advances in defense.

I personally believe that BIS and other regulators are sincere in their willingness to listen. It’s up to us to highlight points they may have overlooked or misunderstood. The way forward is with greater and early collaboration between technologists and regulators. To achieve that, we must look for ways to help regulators achieve their desired outcome, especially when it is to protect human rights, without the unintended consequence of impeding defense.