On 1 January 2020, the new California Consumer Privacy Act (CCPA) enters into force in California, USA. The law obliges large businesses to offer their users an opt-out from the sale of personal information to third parties. But what does 'selling' mean? To whom does the law apply, and should businesses in Europe care?
Chris Hoofnagle, adjunct Professor of Law and Faculty Director of the Berkeley Center for Law & Technology at the University of California, helped draft the referendum to create a new privacy law in California, together with his former student Ashkan Soltani. Close to the ballot date, the referendum's funder, Alastair Mactaggart, decided to withdraw the initiative in return for the legislature's promise to adopt a law with similar effects. That law is the CCPA. For more information on the differences between the CCPA and the GDPR, take a look at our factsheet.
The United States lacks a generally applicable privacy law like the GDPR in the EU. Instead, the U.S. takes a sector-by-sector and state-by-state approach, which on the one hand creates a kind of privacy thicket and on the other leaves some kinds of companies, such as data brokers, free to sell data promiscuously.
The CCPA is a state law, so its application is technically limited to California. But this limit is really just a technicality, because of two dynamics. First, the California market is so large that companies nationwide will attempt to comply with it. Second, the CCPA's breadth makes it more GDPR-like than the sector-by-sector approaches that dominate in the U.S. Mactaggart carefully studied industry practices and wrote the CCPA with such breadth and detail that many privacy sleights of hand were anticipated and foreclosed.
The CCPA strengthens data breach legislation and introduces new privacy rights for consumers: the right to opt out of the sale of their information to other companies, the right to be informed what personal data will be processed and for what purposes, the right to access the personal information a company holds on them and all the third parties those data are shared with, and the right to delete personal information. The CCPA also enables consumers to sue, but only for data breaches. The privacy provisions are enforceable by public authorities.
There’s something of a privacy zeitgeist at the moment. Shoshana Zuboff’s notion of “surveillance capitalism” has resonated. In addition, privacy advocates have long struggled to articulate broadly threatening privacy harms, but that changed with the Cambridge Analytica scandal. Finally, I think artistic critique is important to shaping worldviews, and HBO’s Silicon Valley and Joshua Cohen’s Book of Numbers have provided credible, derogatory portrayals of how platforms think of users.
The CCPA is not the GDPR. But the CCPA has created privacy traction in California that even the GDPR could not. The reason is that the CCPA has more political credibility among tech companies: they see it as homegrown law, rather than rules imposed from afar. Thus, European businesses should expect to see CCPA compliance language in contracts.
The law applies to for-profit organizations that collect personal information about California residents and do business in California, when they determine the purposes and means of the processing and meet one or more of the following three criteria:
* annual gross revenues above $25 million;
* annually buying, receiving, selling, or sharing the personal information of 50,000 or more consumers, households, or devices; or
* deriving 50 percent or more of annual revenue from selling consumers' personal information.
The definition in the CCPA is: “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” The key difference here is household. This difference is in response to an old data-industry stratagem: to characterize information as “household level” and thus not subject to privacy rules. Into the 2000s, industry groups characterized data such as telephone numbers as non-personal, because at least some telephone numbers pertained to a household. We all know, however, that data brokers use telephone numbers as unique identifiers and even as authenticators. The CCPA forecloses this and other privacy sleights of hand, such as arguing that IP addresses or device identifiers are not personal information.
Reconceptualizing data “selling” was a major goal of the CCPA. The CCPA interprets sale to mean any kind of transfer of personal information to another business for “monetary or other valuable consideration.” The CCPA sets such a low bar for sales precisely because companies have used imprecise and misleading terms to mask data transfers. For instance, companies would claim that they do not sell personal data but only “share” it with “trusted partners.”
Some practitioners are still resisting the definition, attempting to read “valuable consideration” so narrowly that the information is merely tangential to the transaction and thus not part of a “sale.” But I think courts will see through this sleight of hand. As a strategy, it does not ring true to the nature of data transactions, which often involve lengthy negotiations about what data is alienated. Courts will ask, “if the data are not the consideration, why did you spend so much time negotiating over the terms of API access?”
Companies dislike being told how to communicate with their consumers. The command to “Provide a clear and conspicuous link on the business’s Internet homepage, titled ‘Do Not Sell My Personal Information,’” is thus one that will be avoided at almost any cost. So, no, I don’t think we will see a barrage of cookie banners.
Instead, companies will try to fit network advertisers (who indeed are “sold” personal information under the CCPA) under the legal exception that they do not sell the information but only transmit it to a service provider. This is the service provider exception. Companies can still use many network advertisers, but under the exception those service providers are subject to limitations and obligations similar to those of data processors. How this exception will be interpreted in practice remains to be seen. The CCPA embraces the idea that first parties can do behavioral advertising, but it attempts to limit the excesses of real-time bidding (RTB) and the promiscuous transfer of data to third parties. RTB shops don’t have many options and are likely to misinterpret the law in order to preserve their operations, so I think we’ll see the first enforcement actions against adtech.
The CCPA is opt out because of First Amendment concerns. Because the CCPA covers even non-sensitive personal information, there is a real risk that an opt-in or affirmative consent requirement for collection or sale of data would be challenged successfully as an impermissible infringement on commercial speech.
At the same time, the CCPA has an innovative opt-out provision, inspired by the environmental non-profit Catalog Choice. Catalog Choice centralizes opt-out procedures and makes it easy for people to opt out of junk mail. The CCPA goes further, allowing one to designate an agent to perform opt-outs. My vision for this is for non-profits and even for-profit entities to become assertive privacy agents. They’ll be able to opt people out at scale, and detect violations of those choices.
I was attempting to build a record to support the CCPA against future First Amendment challenges, and to emphasize the advantages of privacy markets. On the speech point, the CCPA will be challenged both on the requirement to state “Do Not Sell My Personal Information” and on the rights to opt out and to deletion. My point in that domain is that existing business representations have misled consumers systemically. Early in my career, with colleagues, I documented that most people falsely believe that companies are legally barred from selling data. Dramatic corrective disclosures, like “Do Not Sell My Personal Information,” are necessary to disabuse the majority of consumers who think that companies are, by default, statutorily banned from selling personal data.
The second point is anathema to Europeans, but I do think that the fastest way to privacy is to create commercial incentives to protect people. The ability to delegate opt-outs will give companies like LifeLock another service to sell. Tens of millions of people sign up for such services, which will mean massive opt-out activity once the CCPA is in effect.
The main point is in the fourth finding: “Even before the CCPA had gone into effect, the Legislature considered many bills in 2019 to amend the law, some of which would have significantly weakened it. Unless California voters take action, the hard-fought rights consumers have won could be undermined by future legislation.”
What Mactaggart is saying here is that tricky amendments have been introduced that purport to be “technical” but would eviscerate the whole law. Mactaggart is coming back with an initiative so that he does not have to fight a constant rear-guard battle with Google and a few other companies that are trying to eliminate privacy rules. Enacted as an initiative, the California privacy rights would be more difficult for lobbyists to dilute, because any amendment would have to be consistent with the initiative’s intent to promote privacy rights.
The states tend to copy each other. Nevada has already enacted a mini-CCPA. In the short term, we will see state laws that duplicate, and sometimes go further than, the CCPA. However, there is a real, legitimate concern that directly enforceable privacy rights will create nuisance litigation. Mactaggart fears this, and that is why private suits are confined to a kind of safe harbor concerning security breaches.
There is more movement in D.C. for a federal privacy bill, but it is still unlikely to be enacted. We shouldn’t rush the process, because one function of the process is to build a case for privacy as a value. Google and Facebook still see privacy as an illegitimate, even archaic cause. Thus, they’ll find technical means to defeat any privacy law that passes. We need the process to build a consensus that privacy is still a relevant value, one worth protecting, so that the Googles and Facebooks of the world feel an obligation to comply.