President Obama on Monday outlined a proposal that would require companies to inform their customers of a data breach within 30 days of discovering that their information has been hacked. But depending on what is put in and left out of any implementing legislation, the effort could well lead to more voluminous but less useful disclosure. Here are a few thoughts about how a federal breach law could produce fewer but more meaningful notices that may actually help prevent future breaches.

The plan is intended to unify nearly four dozen disparate state data breach disclosure laws into a single, federal standard. But as experts quoted in this story from The New York Times rightly note, much rides on whether any federal breach disclosure law would act as a baseline that allows states to pass stronger standards.

For example, at least seven states already have so-called “shot-clock” disclosure laws, some of them more stringent than the proposed 30-day standard: Connecticut requires insurance firms to notify customers no more than five days after discovering a breach, and California has similar requirements for health providers. Also, at least 14 states and the District of Columbia have laws that permit affected consumers to sue a company for damages in the wake of a breach. What’s more, many states define “personal information” differently and hence have different triggers for what requires a company to disclose. For an excellent breakdown of the various data breach disclosure laws, see this analysis by BakerHostetler (PDF).

Leaving aside the weighty question of federal preemption, I’d like to see a discussion here and elsewhere about a requirement that companies disclose how they got breached. Naturally, we wouldn’t expect companies to reveal the specific technologies they’re using in a public breach document, and the forensics firms called in to investigate aren’t always able to pinpoint the cause or source of a breach precisely.

But this information could be publicly shared in a timely way when it’s available, and appropriately anonymized. It’s unfortunate that while we’ve heard time and again about credit card breaches at retail establishments, we know very little about how those organizations were breached in the first place. A requirement to share the “how” of the hack, once it is known and suitably anonymized by industry, would be helpful.

I also want to address the issue of encryption. Many security experts insist that there ought to be a carve-out that would allow companies to avoid disclosure requirements in a breach that exposes properly encrypted sensitive data (i.e., the intruders did not also manage to steal the private key needed to decrypt the data). While a broader adoption of encryption could help lessen the impact of breaches, this exception is in some form already included in nearly all four dozen state data breach disclosure laws, and it doesn’t seem to have lessened the frequency of breach alerts.

I suspect there are several reasons for this. The most obvious is that few organizations that suffer a breach are encrypting their sensitive data at all, and those that do often do so sloppily (e.g., storing the encryption key where intruders can find it). Also, most states have provisions in their breach disclosure laws requiring a “risk of harm” analysis, which forces the victim organization to determine whether the breach is reasonably likely to result in harm (such as identity theft) to affected consumers.

This is important because many of these breaches are the result of thieves breaking into a Web site database and stealing passwords, and in far too many cases the stolen passwords are not encrypted at all but merely “hashed” using a fast, easy-to-crack algorithm such as MD5 or SHA-1. For a good basic breakdown of the difference between encrypting data and hashing it, check out this post. Also, for a primer on far more secure alternatives to these fast hashes, see my 2012 interview with Thomas Ptacek, How Companies Can Beef Up Password Security.
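To make the distinction concrete, here’s a minimal sketch in Python of the difference between the weak approach described above and a slow, salted alternative. The password string and iteration count are illustrative choices, not anything prescribed by the laws or interviews mentioned here:

```python
import hashlib
import os

password = b"correct horse battery staple"

# Weak: a single fast, unsalted MD5 digest. Attackers with commodity
# GPUs can test billions of candidate passwords per second against
# hashes like this, and identical passwords produce identical hashes.
weak = hashlib.md5(password).hexdigest()

# Stronger: a per-user random salt plus a deliberately slow
# key-derivation function (PBKDF2 here). The iteration count makes
# each guess expensive, and the salt defeats precomputed tables.
salt = os.urandom(16)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
```

The point is not the specific algorithm but the cost asymmetry: a legitimate login pays the slow derivation once, while an attacker must pay it for every guess.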

As long as we’re dealing with laws to help companies shore up their security, I would very much like to see some kind of legislative approach that includes ways to incentivize more companies to deploy two-factor and two-step authentication — not just for their customers, but just as crucially (if not more so) for their employees.

PRIVACY PROMISES

President Obama also said he would propose the Student Data Privacy Act, which, according to The Times, would prohibit technology firms from profiting from information collected in schools as teachers adopt tablets, online services and Internet-connected software. The story also noted that the president was touting voluntary agreements by companies to safeguard energy data and to provide easy access to consumer credit scores. While Americans can by law get a free copy of their credit report from each of the three major credit bureaus once per year — at annualcreditreport.com — most consumers still have to pay to see their credit scores.

These changes would be welcome, but they fall far short of the sorts of revisions we need to the privacy laws in this country, some of which were written in the 1980s and predate even the advent of Web browsing technology. As I’ve discussed at length on this blog, Congress sorely needs to update the Electronic Communications Privacy Act (ECPA), the 1986 statute that was originally designed to protect Americans from Big Brother and from government overreach. Unfortunately, the law is now so outdated that it actually provides legal cover for the very sort of overreach it was designed to prevent. For more on the effort to change the status quo, see digitaldueprocess.org.

Also, I’d like to see a broader discussion of privacy proposals covering what companies may and may not do with all the biometric data they’re collecting from consumers. Companies are tripping over themselves to collect oodles of this potentially very sensitive data, and yet we still have no basic principles governing what they can do with that information, how much they can collect, how they can collect or share it, or how they must protect it.

There are a handful of exceptions at the state level (read more here). But overall we’re lacking any sort of basic protections for this data, and consumers are giving it away every day without realizing there are essentially zero federal standards for what can or should be done with it.

Coming back to the subject of encryption: Considering how few companies actually make customer data encryption the default approach, it’s discouraging to see elements of this administration criticizing companies for moving in that direction. There is likely a big showdown coming between the major mobile players and federal investigators over encryption. Apple and Google’s recent decision to introduce default, irrevocable data encryption on all devices powered by their latest operating systems has prompted calls from the U.S. law enforcement community for legislation that would require mobile providers to allow law enforcement officials to bypass that security in criminal investigations.

In October, FBI Director James Comey called on the mobile giants to dump their new encryption policies. Last week, I spoke at a conference in New York where the session prior to my talk was an address from New York’s top prosecutor, who said he was working with unnamed lawmakers to craft new legal requirements. Also last week, Sen. Ron Wyden (D-Ore.) reintroduced a bill that would bar the government from requiring tech companies to build so-called “backdoor” access to their data for law enforcement.

This tension is being felt across the pond as well: British Prime Minister David Cameron also has pledged new anti-terror laws that give U.K. security services the ability to read encrypted communications on mobile devices.

Tags: BakerHostetler, cryptographic hash, data breach notification laws, David Cameron, ECPA, Electronic Communications Privacy Act, New York Times, password hash, Student Data Privacy Act, Thomas Ptacek