The challenge for the robotics code of conduct is much the same as that facing other industries' attempts at self-regulation, from banking to the private military industry. It's a laudable start, but it doesn't change the underlying issues and concerns. Like such other would-be "codes of conduct," it lacks one key ingredient: consequences.

As it stands now, a golfer who violates his country club's code of conduct risks stiffer punishment than a drone maker or user who violates the terms of this new code. Golfers might lose a point, or even be kicked out of the club, if they violate their agreement. The new robotics code doesn't include a single potential sanction, such as expelling violators from the trade group. Indeed, much of what it lays out merely restates responsibilities the firms and users already must abide by, regardless of any code. For example, the code says that the firms "will comply with all federal, state and local laws." So, before the code, they could violate the law at will? Of course not. Pledging to follow the law is one of those things that sounds meaningful but is ultimately meaningless; it illustrates the importance of the law, not the code.

Similarly, the code is quite vague on a variety of legitimate concerns. It says that "we will ensure that UAS [unmanned aircraft systems] are piloted by individuals who are properly trained and competent to operate the vehicle or its systems." Who will determine this, and what does "trained and competent" mean in a world where some believe drones should only be operated by rated pilots, even though new versions can be flown by teens using iPhone apps? Likewise, the code pledges to "respect the privacy of individuals," a bold statement that says nothing about what it actually means. "Respect" could mean anything from not monitoring individuals without their express permission to showing them "respect" only in the public-relations sense.

Of course, these are thorny issues. Indeed, it's their very thorniness that explains why an industry self-regulatory code -- especially one that emerged in the context of bad press and was built around lowest-common-denominator agreement within a trade group -- would sensibly want to avoid them for now. But the irony is that resolving these problems is what actually matters to the industry's overall goal of "gaining public trust and acceptance." The same need for resolution applies to the pressing concerns that the code completely ignored. For example, despite purporting to cover "those who design, test and operate UAS," it avoids stating any specific intent or concern about those we'd rather not see involved in the field. What can we do not just to promote a powerful technology for good, but also to stop the illicit use by or unintended transfer of the technology to dangerous actors? Much like the technology, such worries are not science fiction. Everyone from terrorists to jewelry thieves to vigilante groups has already used UAS technology.