Once upon a time, users who were careless about security posed a risk only to themselves. With the advent of pervasive networking and botnets, that's no longer true. Lax security has become the digital equivalent of secondhand smoke: it poses a risk to everyone, and calls for the security equivalent of a public health campaign, complete with quarantines. That's the message of a new report by Scott Charney, who heads Microsoft's Trustworthy Computing Group. And Charney has a simple solution: a digital health certificate that every piece of network hardware would be required to present before getting access to the full suite of Internet services.

Charney's report is entitled "Collective Defense: Applying Public Health Models to the Internet," and is available for download. In a blog post in which he announced its release, Charney presents this as part of a larger attempt to redefine how we look at cyberthreats, and references an earlier report he prepared. Don't believe him; the two reports are largely unrelated, and the earlier one did little more than present a list of reasons why cybersecurity is so challenging for governments, businesses, and private citizens.

What the new report does is suggest that private citizens need to adopt some of the best security practices used by governments and business. Most large institutions now have a policy whereby machines are not allowed onto a network unless they have the most up-to-date security patches. The network is constantly monitored for signs of aberrant behavior, and the offending hardware is disconnected until an IT staff member can figure out what's going on.

Of course, private citizens don't have an IT staff dedicated to supporting them, and ISPs are unlikely to be especially interested in expending the resources needed to implement something like this. That's why Charney argues that we need a public health metaphor. "To address cyber threats generally, and botnets in particular," he argues, "governments, industry and consumers should support cyber security efforts modeled on efforts to address human illnesses." This would involve "promoting preventative measures, detecting infected devices, notifying affected users, enabling those users to treat devices that are infected with malware, and taking additional action to ensure that infected computers do not put other systems at risk."

In the public health arena, the equivalent activities require both government standards and the active participation of citizens. And that's precisely what Charney has in mind regarding the digital health certificates. These would be generated by the device itself, limiting the need for user involvement. Access providers would read these certificates and enable different levels of access based on the state of the hardware's health. In one example he gives, a VoIP system that's infected with malware might be limited to dialing an emergency services number until the infection is cleared up.
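The tiered-access flow Charney describes, in which a device self-reports its health and the access provider maps that state to a level of service, can be sketched in a few lines. Everything below is a hypothetical illustration, not something specified in the report: the claim fields, the shared-key HMAC signature, and the access tiers are invented for the example, and a real deployment would presumably use hardware-backed public-key attestation rather than a shared secret.

```python
import hashlib
import hmac
import json

# Illustrative only; Charney's report proposes no concrete protocol.
DEVICE_KEY = b"shared-secret-for-illustration"  # stand-in for real attestation keys

def issue_health_certificate(device_id: str, patched: bool, malware_free: bool) -> dict:
    """The device self-reports its state and signs the claims."""
    claims = {"device": device_id, "patched": patched, "malware_free": malware_free}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def grant_access(cert: dict) -> str:
    """The access provider verifies the certificate and picks a service tier."""
    payload = json.dumps(cert["claims"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cert["sig"]):
        return "blocked"            # tampered certificate: no access at all
    claims = cert["claims"]
    if claims["malware_free"] and claims["patched"]:
        return "full"               # clean bill of health
    if claims["malware_free"]:
        return "limited"            # unpatched but not infected
    return "emergency-only"         # infected: e.g., VoIP restricted to emergency calls
```

The signature check is what addresses the spoofing concern: a certificate whose claims have been altered fails verification and is blocked outright, while an infected but honestly reporting device merely drops to an emergency-only tier, as in the VoIP example.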

Meeting the requirements

Charney lays out several significant requirements. For starters, any additional service will inevitably provide a new vector for malware attacks, so the health certificate system itself must have its security nailed down. The system would also have to disclose a minimum of private information, although some disclosure might be valuable: Charney points out that a certificate that identifies the type of hardware that's been compromised would help home users know what they need to fix. Finally (and this isn't a surprise, given the author's job title), the system would have to be trusted: hardware mustn't be able to spoof a clean bill of health, and ISPs mustn't be able to disconnect users for arbitrary or commercial reasons.

Who can make sure all of this happens? Ideally, in Charney's view, market forces and the relevant stakeholders. But he's not especially optimistic that they'll be anywhere close to sufficient, and he falls back on expecting heavy government involvement in several places. These include making the system affordable to device makers and ISPs—"if market forces prove insufficient, then the government should use the tools in its tool kit to ensure the model is economically viable"—and making sure the system isn't abused—"Governments may still want to regulate how health certificates can be used so that any program is limited to ensuring device health and that information gathered is used for no other purpose (for example, the enforcement of intellectual property rights, the creation of marketing profiles)."

All of that seems like a lot to ask for, which is presumably why Charney turns to the secondhand smoke model. "Consumers have been told for years to keep their systems up to date, run antivirus programs, and backup their data," he argues. "Like smokers, they were told that the failure to follow the advice given would put them at risk, but ultimately they could choose to accept that risk. With botnets and similar types of malware, however, one is not simply risking one’s own device—one is putting others at risk too."

Overall, the health certificates are an intriguing idea, and could potentially make economic sense for IT departments, if not private citizens. But the number of moving parts and potential abuses will be very, very large, and the system would require input from a large number of competing interests if it's ever going to get anywhere. Charney's going to have to be a very compelling salesman if he hopes his ideas will catch on.
