On June 28, California enacted the California Consumer Privacy Act (CCPA, A.B. 375), a well-intentioned but flawed new law that seeks to protect the data privacy of technology users and others by imposing new rules on companies that gather, use, and share personal data. There's a lot to like about the Act, but there is substantial room for improvement. Most significantly:

The Act allows businesses to charge a higher price to users who exercise their privacy rights.

The Act does not give users the power to bring violators to court, except against a narrow set of businesses, and only in the event of a data breach.

For data collection, the Act does not require user consent.

For data sale, while the Act does require user consent, adults have only opt-out rights, and not more-protective opt-in rights.

The Act’s right-to-know should be more granular, extending not just to general categories of sources and recipients of personal data, but also to the specific sources and recipients. Also, the right-to-know should be tailored to avoid news gathering.

The law goes into effect in January 2020, which means privacy advocates have 18 months to strengthen it—and to stave off regulated companies' attempts to weaken it.

Background to the Act

For many years, a growing number of technology users have objected to the myriad ways that companies harvest and monetize their personal data, and users have called on companies and legislators to do a better job at protecting their data privacy. EFF has long supported data privacy protections as well.

In March 2018, the Cambridge Analytica scandal broke. The public learned that private data had been harvested from more than 50 million Facebook users without their knowledge or consent, and that the Trump presidential campaign used this data to target political advertisements. Demand for better data privacy rules increased significantly.

In May 2018, supporters of a California ballot initiative on data privacy filed more than 600,000 signatures in support of presenting the initiative to voters, nearly twice the number of signatures required to do so. But ballot initiatives are an imperfect way to make public policy on a complex subject like data privacy. Before enactment, it can be difficult for stakeholders to help improve an initiative’s content. And after enactment, an initiative can be difficult to amend.

California legislators hoped to do better, but now they faced a deadline. June 28 was the last day the initiative’s sponsor could remove it from the ballot, and the sponsor told the legislature that he would do so only if they passed data privacy legislation first. Legislators rushed to meet this deadline, but that rush meant privacy advocates didn’t have much chance to weigh in before it was passed.

The Basics of the CCPA

The CCPA creates four basic rights for California consumers:

A right to know what personal information a business has about them, and where (by category) that personal information came from or was sent. See Sections 100, 110, 115. See also Section 140(c) (defining “business”), and Section 140(o) (defining “personal information”).

A right to delete personal information that a business collected from them. See Section 105. While the right-to-know extends to all information a business collected about a consumer, the right-to-delete extends to just the information a business collected from them.

A right to opt-out of sale of personal information about them. See Section 120. See also Section 140(t) (defining “sale”).

A right to receive equal service and pricing from a business, even if they exercise their privacy rights under the Act, but with significant exceptions. See Section 125.

The Act also creates a limited right for consumers to sue businesses for data security breaches, based on California’s existing data breach notification law. See Section 150. Most of the Act’s enforcement punch, however, rests with the California Attorney General (AG), who can file civil actions against violations of the Act. See Section 155. The AG is also responsible for promulgating regulations to flesh out or update the CCPA framework. See Section 185.

As we explained above, the CCPA was put together quickly, with many important terms undefined or unclearly defined. As a result, some of these rights look better on paper than they really are. Fortunately, the new CCPA is generally understood to be a work in progress. Legislators, privacy advocates, and regulated companies will all be seeking substantive revisions before the law goes into effect. The rest of this post focuses on EFF's suggestions.

Opt-in Consent to Collection

Many online services gather personal data from technology users, without their knowledge or consent, both when users visit their websites, and, by means of tracking tools, when users visit other websites. Many online services monetize this personal data by using it to sell targeted advertising. New legislation could require these online services to obtain the users’ opt-in consent to collect personal data, particularly where that collection is not necessary to provide the service.

The CCPA does not require online services to obtain opt-in consent before collecting personal data from users. Nor does it provide users an opportunity to opt-out of collection. The law does require notice, at or before the point of collection, of the categories of collected data, and the purposes of collection. See Section 100(b). But when it comes to users’ autonomy to make their own decisions about the privacy of their data, while notice is a start, consent is much better. The legislature should amend the Act to require it.

Some limits are in order. For example, opt-in consent might not be required for a service to perform actions the user has requested (though clear notice should be required). Also, any new regulations should explore ways to avoid the “consent fatigue” that can result from a high volume of opt-in consent requests.

“Right to Know” About Data Gathering and Sharing

Technology users should have an affirmative “right to know” what personal data companies have gathered about them, where the companies got it, and with whom the companies shared it, subject to some limits to ensure that the right to know does not impinge on other rights.

The CCPA creates a right to know, empowering “consumers” to obtain the following information from “businesses”:

The categories of personal information collected. See Sections 100(a), 110(a)(1), 110(c)(1), 115(a)(1).

The categories of sources of the personal information. See Sections 110(a)(2), 110(c)(2).

The purposes for collecting the personal information. See Sections 110(a)(3), 110(c)(3).

The categories of third parties with whom businesses share personal information. See Section 110(a)(4).

The categories of personal information sold. See Sections 115(a)(2), 115(c)(1).

The Act defines a “consumer” as any natural person who resides in California. See Section 140(g). The Act defines a “business” as a for-profit legal entity with: (i) annual gross revenue over $25 million; (ii) annual receipt or disclosure of the personal information of 50,000 or more consumers, households, or devices; or (iii) 50% or more of its annual revenue derived from selling personal information. See Section 140(c).
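Because a business is covered if it meets any one of the three prongs, the test can be sketched as a simple predicate. This is purely illustrative (the statutory text in Section 140(c) controls, and terms like “revenue” and “consumer” carry statutory definitions the sketch ignores):

```python
def is_covered_business(annual_gross_revenue: float,
                        consumers_whose_data_is_handled: int,
                        share_of_revenue_from_selling_data: float) -> bool:
    """Rough sketch of the CCPA's three-prong "business" test (Section 140(c)).

    A for-profit entity is covered if it meets ANY one of the prongs.
    Parameter names and units are illustrative, not statutory definitions.
    """
    return (
        annual_gross_revenue > 25_000_000               # prong (i)
        or consumers_whose_data_is_handled >= 50_000    # prong (ii)
        or share_of_revenue_from_selling_data >= 0.50   # prong (iii)
    )

# A small ad network: $2M revenue, data on 80,000 devices.
# It fails prong (i) but is still covered under prong (ii).
print(is_covered_business(2_000_000, 80_000, 0.10))  # True
```

The disjunctive structure matters in practice: even a low-revenue startup is covered once it handles data on 50,000 consumers, households, or devices.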

The Act’s right-to-know would be more effective if it were more granular. It allows people to learn just the “categories” of sources and recipients of their personal data. People should be able to learn the specific sources and recipients.

Moreover, the Act’s right-to-know should be tailored to avoid impacting news gathering, which is protected by the First Amendment, whether undertaken by professional reporters or lay members of the public. For example, if a newspaper tracked visitors to its online edition, the visitors’ right-to-know could cover that tracked information, but should not also extend to a reporter’s investigative file.

Data Portability

Users generally should have a legal right to “data portability,” that is, the right to obtain a copy of the data they provided to an online service. People might use this data in myriad ways, including self-publishing their own content, better understanding their service provider, or taking their data to a rival service.

The CCPA advances data portability. Consumers may obtain from businesses the “specific pieces” of personal information collected about them. See Sections 100(a), 110(c)(5). Moreover, the Act provides that if “provided electronically, the information shall be in a portable and, to the extent technically feasible, in a readily useable format that allows the consumer to transmit their information to another entity.” See Section 100(d).

It will be important to ensure that “technical infeasibility” does not become an exception that swallows the rule.

Also, it may be appropriate to address scenarios where multiple users’ data is entangled. For example, suppose Alice posts a photo of herself on social media, under a privacy setting that allows only certain people to see the photo, and Bob (one of those people) posts a comment on the photo. If Bob seeks to obtain a copy of the data he provided to that social media service, he should get his comment, but not automatically Alice’s photo.

Consent to Data Sharing

As discussed above, EFF supports properly tailored legislation that requires companies to get opt-in consent before collecting a user’s personal data. Opt-in consent should also be required before a company shares that data with a third party. The more broadly that personal data is disseminated, the greater the risk of theft by malicious hackers, misuse by company employees, and expanded uses by company managers. Technology users should have the power to control their personal data by deciding when it may be transferred from one entity to another.

The CCPA addresses sale of personal data. It defines “sale” to include any data transfer “for monetary or other valuable consideration.” See Section 140(t). Adults have a right to opt-out of sales. See Sections 120(a), 120(c). To facilitate such opt-outs, businesses must provide a “do not sell my personal information” link on their homepages. See Section 135(a)(1). Minors have a right to be free from sales absent their opt-in consent. See Sections 120(c), 120(d). Also, if a third party buys a user’s personal data from a company that acquired it from the user, the third party cannot re-sell that personal data, unless they notify the user and give them an opportunity to opt-out. See Section 115(d).

However, the Act’s provisions on consent to data sharing are incomplete. First, all users—adults as well as minors—should be free from data sales and re-sales without their opt-in consent. While opt-out consent is good, opt-in consent is a better way to promote user autonomy to make their own decisions about their data privacy.

Second, the opt-in consent rules should apply to data transfers that do not yield (in the Act’s words) “valuable consideration.” For example, a company may find it to be in its business interests to give user data away for free. The user should be able to say “no” to such a transfer. Under the current Act, they cannot do so. By contrast, the original ballot initiative defined “sale” to include sharing data with other businesses for free.

Notably, the Act empowers the California Attorney General to issue regulations to ensure that the Act’s various notices and information are provided “in a manner that may be easily understood by the average consumer.” See Section 185(a)(6). We hope these regulations will address the risk of “consent fatigue” that can result from opt-in requests.

Deletion

The CCPA provides that a consumer may compel a business to “delete” personal information that the business collected from the consumer. See Section 105(a).

The Act provides several exceptions. Two bear emphasis. First, a business need not delete a consumer’s personal information if the business needs it to “exercise free speech, ensure the right of another consumer to exercise his or her right of free speech, or exercise another right provided for by law.” See Section 105(d)(4). Second, a business may keep personal information “to enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.” See Section 105(d)(7). Confusingly, another exception uses similar language, and it’s unclear how these interact. See Section 105(d)(9) (“Otherwise use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information”).

Deletion is a particularly tricky aspect of data privacy, given the potential countervailing First Amendment rights at issue. For example, suppose that Alice and Bob use the same social media service, that Alice posts a photo of herself, that Bob re-posts it with a caption criticizing what Alice is doing in the photo, and that Alice becomes embarrassed by the photo. A statute empowering Alice to compel the service to delete all copies of the photo might intrude on Bob’s First Amendment interest in continuing to re-post the photo. EFF is working with privacy and speech advocates to find ways to make sure the CCPA ultimately strikes the right balance.

But EFF will strongly oppose any provision empowering users to compel third-party services (including search engines) to de-list public information about them. Laws outside the United States that do this are often called the “right to be forgotten.” EFF opposes such laws, because they violate the rights to free speech and to gather information. Many of us may be embarrassed by accurate published reports about us. But it does not follow that we should be able to force other people to forget these reports. Technology users should be free to seek out and locate information they find relevant.

Non-discrimination

The CCPA provides that if a user exercises one of the foregoing statutory data privacy rights (i.e., denial of consent to sell, right to know, data portability, or deletion), then a business may not discriminate against the user by denying service, charging a higher price, or providing lower quality. See Section 125(a)(1). This is a critical provision. Without it, businesses could effectively gut the law by discriminating against users who exercise their rights.

Unfortunately, the Act contains a broad exemption that threatens to swallow the non-discrimination rule. Specifically, a business may offer “incentives,” including “payments,” to a user in exchange for collecting and selling their data. See Section 125(b)(1). For example, if a service costs money, and a user of this service refuses to consent to collection and sale of their data, then the service may charge them more than it charges users who do consent. This will discourage users from exercising their privacy rights. It will also create unequal classes of privacy “haves” and “have-nots,” depending on the income of the user. EFF urges the California legislature to repeal this exemption from the non-discrimination rule.

This problem is not solved by the Act’s prohibition on financial incentives that are “unjust, unreasonable, coercive, or usurious.” See Section 125(b)(4). That standard will not stop companies from charging users more when they exercise their privacy rights.

The Act also allows price and quality differences that are “reasonably related” or “directly related” to “the value provided to the consumer by the consumer’s data.” See Sections 125(a)(2), 125(b)(1). These exemptions from the non-discrimination rule are unclear and potentially far-reaching, and need clarification and limitation.

Empowering Users to Enforce the Law

One of the most powerful ways to ensure enforcement of a privacy law is to empower users to take violators to court. This is often called a “private cause of action.” Government agencies may fail to enforce privacy laws, for any number of reasons, including lack of resources, competing priorities, or regulatory capture. When a business violates the statutory privacy rights of a user, the user should have the power to decide for themselves whether to enforce the law. Many privacy statutes allow this, including federal laws on wiretaps, stored electronic communications, video rentals, driver’s licenses, and cable subscriptions.

Unfortunately, the private right of action in the CCPA is woefully inadequate. It may be used only to remedy certain data breaches. See Section 150(a)(1). The Act does not empower users to sue businesses that sell their data without consent, refuse to comply with right-to-know requests, or refuse to comply with data portability requests. EFF urges the California legislature to expand the Act’s private cause of action to cover violations of these privacy rights, too.

The Act empowers the California Attorney General to bring suit against a business that violates any provision of the Act. See Section 155(a). But for the reasons explained above, government enforcement alone is not enough.

Waivers

Too often, users effectively lose their new rights when they “agree” to fine print in unilateral form contracts with large businesses that have far greater bargaining power. Users may unwittingly waive their privacy rights, or find themselves stuck with mandatory arbitration of their privacy rights (as opposed to their day in an independent court).

So we are very pleased that the CCPA expressly provides that contract provisions are void if they purport to waive or limit a user’s privacy rights and enforcement remedies under the Act. See Section 192. This is an important provision that could be a model for other states as well.

Rule Making

The CCPA empowers the California Attorney General to adopt regulations, after it solicits broad public participation. See Section 185. These regulations will address, among other things, new categories of “personal information,” new categories of “unique identifiers” of users, new exceptions to comply with state and federal law, and the clarity of notices.

EFF will participate in this regulatory process, to help ensure that new regulations strengthen data privacy without undue burden, particularly for nonprofits and open-source projects.

Next Steps

The CCPA is just a start. Between now and the Act’s effective date in January 2020, much work remains to be done. The Act itself makes important findings about the high stakes:

The proliferation of personal information has limited Californians’ ability to properly protect and safeguard their privacy. It is almost impossible to apply for a job, raise a child, drive a car, or make an appointment without sharing personal information. . . . Many businesses collect personal information from California consumers. They may know where a consumer lives and how many children a consumer has, how fast a consumer drives, a consumer’s personality, sleep habits, biometric and health information, financial information, precise geolocation information, and social networks, to name a few categories. . . . People desire privacy and more control over their information.

EFF looks forward to advocating for improvements to the Act in the months and years to come.