A data privacy bill in Washington State has gained momentum. The bill, 2SSB 6281 (also known as the Washington Privacy Act, or WPA), has received widespread support from big tech companies. It’s no wonder they like it: as currently written, the WPA would be a weak, token effort at reining in corporations’ rampant misuse of personal data.

The WPA didn’t come from nowhere, and it didn’t come alone. A number of industry-friendly groups have proposed or endorsed milquetoast privacy bills that would give the impression of regulation without changing the surveillance business model. Moreover, tech industry lobbyists are promoting the Washington bill as a template, and encouraging legislators in other states to adopt similar laws. Through a variety of channels, the tech industry is pushing state lawmakers to affirm and encode the privacy-harmful status quo.

State legislators should enact laws that change the paradigm and set out clear, robust privacy rights and obligations. Washington State in particular has an opportunity to set an example by resisting industry pressure in its own backyard and passing robust reforms. But on both substance and enforcement, the current version of the WPA – which passed out of the state senate last week – comes up short.

In this post, we’ll break down the problems with the Washington Privacy Act, and suggest ways to make it better. This should act as a guide for state lawmakers who want to resist industry pressure and pass real user protections.

Why now?

In the wake of Facebook’s Cambridge Analytica scandal and Europe’s adoption of the GDPR, legislators across the U.S. are newly interested in making laws to protect consumer privacy. In 2018, California led the way with the California Consumer Privacy Act (CCPA), which we have called a good first step, though far from a comprehensive solution. Federal lawmakers have been unable to move a national privacy bill, so state governments around the country have taken matters into their own hands.

Privacy advocates and legislators aren’t the only ones interested in what new privacy laws will look like. The tech industry, especially big players like Facebook, Google, Amazon, and Microsoft, has realized that new consumer privacy laws are coming whether it likes them or not. As a result, these companies have shifted from fighting against any regulation at all to pitching their own, weak consumer privacy bills in order to head off stronger laws. The WPA is a prime example of this kind of bill; tech companies have been aggressively promoting it by comparing the WPA to the (much stronger) GDPR.

How to improve the WPA

Contrary to what organizations like the Future of Privacy Forum would have you believe, the Washington Privacy Act and bills like it are not stronger than the California Consumer Privacy Act (CCPA) or the General Data Protection Regulation (GDPR) in Europe. While 2SSB 6281 addresses some issues, such as data use, that the CCPA does not, the Washington Privacy Act falls far short of the protections provided by the CCPA and GDPR. Washington can—and must—do better.

After passing out of the Senate, the WPA started making its way through the state House. It needs to pass both houses and get the governor’s signature to become law. The House has begun adopting some amendments, largely over the objections of tech lobbyists, that would make the WPA a better, if not perfect, privacy law. At this time, we have no way of knowing whether any of these amendments will stick, so this post will focus on the problems with the last version passed by the Senate.

Significant changes are needed to 2SSB 6281 to provide clear, fair, and enforceable rules of the road for the treatment of personal data.

Add a Private Right of Action

The right of ordinary people to bring civil actions against companies that harm their privacy is one of EFF’s highest priorities in any data privacy legislation. A private right of action provides a valuable enforcement tool for everyday people, and also ensures that companies face real consequences for privacy harms. But the Senate version of the WPA now working its way through the legislature expressly prohibits this critical right.

The bill gives exclusive enforcement authority to the Attorney General’s Office, meaning the office is likely to enforce only large patterns of violations. People rightly can sue over product defects, car accidents, breach of contract, or injuries to reputation—they do not have to wait for the state attorney general to bring actions on their behalf in any of these instances. Privacy harms should be no exception.

Make Risk Assessments Transparent

The WPA requires companies to complete periodic “risk assessments”: audits of their own privacy practices. Risk assessment is a concept adopted from the GDPR. Unfortunately, the WPA allows companies to keep these assessments confidential, out of reach of the public. This provision hides the very information consumers need to determine whether their privacy is being threatened. Absent other important protections from the GDPR, the WPA gives companies a hollow way to proclaim they are strong on privacy without ever having to show their work.

Privacy legislation that protects consumers should provide true transparency and empower them with the information they need to hold companies accountable. EFF fears that the assessment process will become a performative exercise: something companies loudly spend money on so they can talk about how committed they are to privacy.

Expand the Definition of Sale

The WPA’s current definition of a “sale” of user data is weak enough to exclude most commercial data sharing. In the economy of personal data, most companies don’t exchange information for envelopes of cash. Instead, they give access to personal data in exchange for tools, services, or other in-kind considerations. Unless the definition is expanded to cover all data shared for a commercial purpose, the WPA will not restrict many of the tech industry’s most common data-sharing practices.

Expand the Scope of “Personal Data”

Much of the personal data used for tracking can't be tied to a named consumer, but nevertheless represents an invasion of individual privacy. For example, a company may know the entire location history for someone identified only as “ABCD1234” without knowing their name. That doesn’t make the data any less sensitive, or the potential for abuse any less severe. Location data and other rich behavioral records are notoriously easy to reidentify.

Unfortunately, the WPA’s definition of “personal data” is also narrow. It contains a carve-out for “deidentified” data, which is too broadly defined. There is also a specific exemption for “pseudonymous” data: data which could be reidentified, but is considered non-identifying as long as businesses store other user data in a separate place. Together, these exceptions mean huge portions of sensitive data aren’t properly regulated.

In contrast, the CCPA has an expansive definition of personal information that includes data associated with households and devices. This reduces the ways that companies can use sensitive data outside the law, and makes it as easy as possible for Californians to exercise their rights. The WPA should do the same.

Stop Companies From Discriminating Against Consumers Who Exercise Privacy Rights

Privacy laws must not punish consumers who exercise their statutory rights. Protection against this kind of discrimination is particularly critical for low-income consumers, who might not be able to afford to “pay for privacy” or otherwise protect it under such unfair rules. The result would be classes of privacy haves and have-nots, rather than consistently protected privacy rights. Wealth inequality would translate directly into privacy inequality.

An earlier version of the WPA had a strong non-discrimination provision, but the most recent version allows companies to disclose information collected from a loyalty program in certain cases.

Legislators should not allow loyalty programs to charge more money or offer lower-quality service when consumers opt out of having their data shared. Users shouldn’t be penalized for exercising their rights, in the grocery store or anywhere else.

Give Users a Global Opt-Out

The WPA’s right to opt out is too narrowly defined. It only gives consumers the right to opt out of data processing for the purposes of targeted advertising, sale of personal data, or profiling “in furtherance of decisions” that have legal or similar effects. This set of uses is far from complete; it doesn’t cover many of the ways that companies profile individuals for their own ends. And thanks to the narrow definition of “sale,” it doesn’t let users opt out of more subtle ways companies can profit by sharing their personal data.

A bill that relies upon consumers taking advantage of opt-out rights needs some sort of mechanism to let consumers opt out of entire categories of data sharing all at once. Otherwise, the opt-out rights are not feasible or scalable. Consumers should have the right to avoid processing for any secondary purposes, and should always be able to opt out and exert control over their information.

Close Business-Friendly Loopholes

The WPA purportedly gives people the right to limit the disclosure of their information. But unless that information is sold for targeted advertising, the company selling—and profiting from—that data can unilaterally decline the consumer's request on arbitrary grounds. The WPA does not specify why a company may or may not reject consumer requests, leaving companies to decide for themselves what rights their users have.

This is an unacceptable carve-out. Companies should be required to have clear reasons for the collection and use of data, as part of providing the service requested by the consumer. And a user’s request to opt out of sale should mean that their data won’t be sold. Period.

Second, the WPA’s exemption for the Fair Credit Reporting Act (FCRA) should be tightly limited to activities covered by the federal law, and only to the extent that they are covered. This is important to avoid a data broker carveout: many consumer reporting agencies like Experian maintain databases that may not necessarily be covered by the FCRA. Similarly, the Gramm-Leach-Bliley Act (GLBA) exemption is too broad. Under the GLBA, consumers do not have the right to access, correct, delete, or port the data financial institutions collect about them, and have only a limited ability to opt out of data-sharing with third parties. As none of these federal laws preempt states from enacting stronger protections, there is no reason for Washington to exempt these data.

Preserve and Allow Stronger Local Law

Cities should be able to adopt their own privacy laws, and state laws should act as a floor, not a ceiling, for local laws such as those governing the use of face recognition technology.

Still, Section 14 of 2SSB 6281 would preempt existing local privacy protections. In addition to undermining local sovereignty in the future, this rule would do concrete harm now by overriding laws like the rule protecting broadband users’ privacy that the City of Seattle put into place in 2017. Local entities should be able to enact stronger privacy protections than those provided by state law.

Remove Face Recognition

The WPA currently includes face recognition provisions. But, given the controversy surrounding its use, face recognition should be handled in a different bill. The issues around face recognition are complex, and communities across the nation continue to grapple with its consequences in both commercial and law enforcement settings. The way forward in Washington will be considerably clearer if it is addressed in separate legislation—for example, Rep. Entenman’s HB 2856, which would place a moratorium on government use of face surveillance.

Ignore the Tech Industry’s Siren Song

On Friday, Feb. 28, a House committee passed amendments to 2SSB 6281 that would eliminate the ban on a private right of action, allow local governments to take stronger measures on facial recognition, and add the possibility of using the Washington Consumer Protection Act as an enforcement mechanism. It would also remove the preemption on stronger local law and expand the definition of “personal data” to eliminate some industry-friendly loopholes.

These improvements are welcome, but at this time we have no way of knowing whether these amendments will appear in the final version. Problematic exemptions and loopholes remain in the bill, and problems with the facial recognition provisions aren’t adequately addressed. Meanwhile, businesses are putting immense pressure on the House to roll back the consumer-friendly amendments that have already been adopted.

The style of bill favored by Microsoft, IAPP, and FPF falls well short of protecting users. Lawmakers shouldn’t be taken in by legislation pitched as “GDPR-lite”: bills that grant lots of “rights” without thorough definitions or strong enforcement.

Other bad ideas we’ve seen from industry include forcing users to verify themselves before opting out, a “right to cure” that would give law-breaking companies a chance to “fix” their systems before any enforcement can happen, and full-on pay-for-privacy clauses that let companies price consumers out of their rights.