A Review of the “Final” CCPA Regulations from the CA Attorney General

On June 2, the California Attorney General’s office (the DOJ) released hundreds of pages of new material about its CCPA regulations, including 11,000+ words of its “final” regulations and a 59-page “final statement of reasons” purportedly explaining the DOJ’s thinking. This blog post recaps the regulations.

Timeline for Development of the CCPA Regulations

June 28, 2018: California enacted the CCPA. The law was initially scheduled to take effect January 1, 2020.

January to March 2019: the DOJ held 7 listening sessions around California to collect oral suggestions about the regulations.

October 11, 2019: more than 15 months following the law’s passage, and less than 3 months before the law took effect, the DOJ posted its first draft of the CCPA regulations. My blog post with comments on the first draft. On the same day, Governor Newsom signed a package of statutory amendments to the CCPA, a number of which negated or conflicted with the first draft of the regulations.

Early December 2019: the DOJ held 4 listening sessions around the state to collect oral feedback on the first draft of the regulations.

January 1, 2020: the CCPA went into effect, but the associated regulations were still 3 drafts and 5+ months away.

February 10, 2020: the DOJ posted the second draft of the CCPA regulations. My blog post with comments on the second draft.

March 27, 2020: the DOJ posted the third draft of the CCPA regulations. My blog post with comments on the third draft.

June 2, 2020 (the documents are dated June 1): the DOJ publicly released the regulations’ “final” version and submitted the regulations to the Office of Administrative Law (OAL) for their review and approval (a process that can take up to 90 days, though the DOJ requested expedited review).

July 1, 2020: even while the OAL review is pending, the DOJ may start enforcing the law–as soon as 28 days after it posted its final version of the regulations.

A Note About the Drafts. The drafts were a roller-coaster. The initial draft was unforgiving to businesses and created some costly requirements that weren’t in the CCPA. In a major shift in tone, the second draft mostly made business-friendly changes. Reverting to the first draft’s tone, the third draft walked back many of the second draft’s business-friendly changes. The fourth/final draft was identical to the third draft–meaning the DOJ ignored all comments to the third draft.

I submitted comments to all three drafts. For reasons that are unclear to me, the DOJ did not reference my third set of comments in its index, even though I properly submitted my comments before the deadline. It likely didn’t matter because they rejected all comments to the third draft. Still, it makes me wonder why my comments got munched and how many other comments suffered a similar fate.

Some Good Points

I’ll start with a few (minor) good points in the regulations:

Narrowing of “Household” Definition. The CCPA defined personal information to include “household” information, which it did not define. The GDPR does not cover household information, and this innovation was problematic (because multiple people, sometimes unrelated, share households). The regulations narrow the scope of the “household” provision and alleviate many of the concerns.

Limits on Searching for Personal Information. The CCPA nominally requires businesses to delete (upon request) personal information from archived materials, such as stored surveillance video. The regulations provide some limits on the business’ need to immediately delete personal information from archival materials.

Limits on Disclosures of Personal Information. The regulations specify sensitive categories of personal information that are not subject to disclosure requests. This helps reduce the risk of identity theft.

Deletion Requests Require Authentication. Businesses can ignore unauthenticated deletion requests. In prior drafts, the DOJ tried to treat unauthenticated deletion requests as data sales opt-outs.

Some Low Points

Some of my least-favorite parts of the regulations:

“User-enabled global privacy controls.” This terribly designed provision wasn’t in the CCPA. It contemplates that browser software (or plug-ins) will tell websites not to sell the consumers’ information, and businesses must honor this signal as a valid opt-out request.

This approach has many problems. The most obvious: these privacy controls do not exist today. That means the regulations envision hypothetical technology and then try to regulate it (emphasis added):

By requiring that a privacy control be designed to clearly communicate or signal that the consumer intends to opt-out of the sale of personal information, the regulation sets clear parameters for what the control must communicate so as to avoid any ambiguous signals. It does not prescribe a particular mechanism or technology; rather, it is technology-neutral to support innovation in privacy services to facilitate consumers’ exercise of their right to opt-out. The regulation benefits both businesses and innovators who will develop such controls by providing guidance on the parameters of what must be communicated. And because the regulation mandates that the privacy control clearly communicate that the consumer intends to opt-out of the sale of personal information, the consumer’s use of the control is sufficient to demonstrate that they are choosing to exercise their CCPA right….
This subsection is forward-looking and intended to encourage innovation and the development of technological solutions to facilitate and govern the submission of requests to opt-out. Given the ease and frequency by which personal information is collected and sold when a consumer visits a website, consumers should have a similarly easy ability to request to opt-out globally. This regulation offers consumers a global choice to opt-out of the sale of personal information, as opposed to going website by website to make individual requests with each business each time they use a new browser or a new device….
the CCPA is the first law to vest consumers with the right to stop the sale of their personal information. Fortifying this right so that it is meaningful for consumers requires that the OAG establish a robust set of rules and procedures.

In other words, the DOJ contemplates that someone will build technology to meet these specifications, at which point businesses must honor the technology. Why not impose the regulation after the technology exists? Otherwise, the DOJ is basically trying to regulate sci-fi. Consider: how often have regulators properly anticipated how non-existent technology will emerge? NEVER.

The regulations assume that businesses will know when privacy controls emerge. But how? There are dozens of browser software programs, each of which releases new versions all of the time. Add in plug-in releases, and businesses must constantly monitor a virtually infinite universe of potentially qualifying software in case one of them has become a privacy control. Instead, most businesses rationally will simply ignore this requirement until the DOJ confirms that a software program constitutes a privacy control. (The DOJ could have easily avoided this problem by creating a certification process to qualify software as a privacy control). Further, the DOJ did not provide a phase-in period for honoring new privacy controls, so every business–even the most conscientious–will necessarily be out of compliance the moment each new privacy control is launched.

Privacy controls will encounter other technological challenges, such as:

  • the privacy controls need to be fine-grained enough to reflect true consumer intent. A simple binary toggle (i.e., categorically opting-out of all data sales at every website) almost certainly won’t be sufficiently precise. Instead, such coarse toggles will cause unexpected problems for consumers given the CCPA’s overbroad definition of data sales (which can include ordinary and beneficial features).
  • it’s unclear how the privacy controls will authenticate which consumer requested the opt-out. If a browser is shared by more than one consumer, how should the businesses interpret that signal?
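Because no qualifying privacy controls exist yet, any implementation of this requirement is necessarily speculative. As a minimal sketch, suppose a future privacy control signals the opt-out via an HTTP request header that a business’s server then checks; the header name `Sec-GPC`, its “1” value, and the storage approach below are all my hypothetical illustrations, not anything the regulations (or any shipping browser) actually specify:

```python
# Hypothetical sketch of server-side handling of a browser-based opt-out
# signal. The "Sec-GPC" header name and "1" value are assumptions for
# illustration only; the regulations name no specific mechanism.

def is_global_optout(headers: dict) -> bool:
    """Return True if the request carries the assumed opt-out signal."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"

def handle_request(headers: dict, consumer_id: str, optout_store: set) -> None:
    """Record the signal as a valid opt-out, as the regulations would require."""
    if is_global_optout(headers):
        optout_store.add(consumer_id)
```

Even this toy version surfaces the authentication problem noted above: the server sees only a browser-level signal, not which of the browser’s users intended to send it.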

Transparency Reports. This is another terrible provision not mandated by the CCPA. The regulations require a business that collects 10M+ Californians’ personal information to generate detailed transparency reports about its privacy operations. The final statement of reasons justified this new obligation by claiming (1) the transparency reports would act as diagnostics to help the DOJ identify which businesses are failing consumers, and (2) researchers will benefit from the data.

It’s costly to produce transparency reports–costs that we as consumers ultimately bear. So the DOJ’s purported benefits–more data might help!–don’t come for free. The DOJ has other available diagnostics, including consumer complaints about recalcitrant businesses. And while researchers will study the CCPA, it’s not clear the transparency reports will provide useful data.

There is an extensive literature on how to properly design transparency obligations and why so many transparency laws fail. See, e.g., Full Disclosure. The DOJ apparently did not consult any of that literature, so it may have created a future case study.

Verifying Consumer Requests. The CCPA provides several powerful rights to consumers, but it intentionally punted on a key question: how to authenticate that consumers are exercising their rights for themselves? Any malefactor who successfully impersonates a consumer gets a CCPA-required treasure trove. The CCPA dumped responsibility for preventing that onto the DOJ.

The DOJ basically dumped this question directly onto businesses by relying heavily on standards, instead of rules, for verifying consumers. In the final statement of reasons, the DOJ says: “determining the appropriate verification standard is fact- and scenario-specific.” The DOJ provided a six-factor test to consider. This essentially turns each verification request into an individualized determination, which is not scalable for large businesses handling a high volume of incoming requests. The DOJ did provide a few efficiency-increasing “rules”:

  • Password-protected accounts often can qualify as reliable verification.
  • If a business can’t reasonably verify the consumer, it can decline the request.
  • For requests to know categories of collected information, businesses may match “at least two data points provided by the consumer with data points maintained by the business.”
  • For requests to know a consumer’s specific data, businesses may match three data points with their records and require consumers to sign a declaration of identity under penalty of perjury. The “penalty of perjury” sounds scary, but the DOJ hasn’t indicated it will actually prosecute perjured declarations, so it’s an empty threat. Consider the irrelevance of the “penalty of perjury” declaration in 17 USC 512(c)(3) takedown notices. Furthermore, for deletion requests (as opposed to right-to-know requests), the DOJ relies again on standards: whether to use the two- or three-data-point approach depends “on the sensitivity of the personal information and the risk of harm to the consumer posed by unauthorized deletion.”
  • In the final statement of reasons, the DOJ indicates that businesses cannot require notarized documents unless the business covers the notary costs.
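To make the data-point-matching rules above concrete, here is a hypothetical sketch of the two- and three-point thresholds. The field names, the normalization, and the decision logic are my assumptions; the regulations prescribe only the match counts (and, for specific-data requests, the signed declaration, which is not modeled here):

```python
# Illustrative sketch (not from the regulations): matching consumer-supplied
# data points against the business's records. Field names and normalization
# are hypothetical assumptions.

def count_matching_points(supplied: dict, on_file: dict) -> int:
    """Count fields where the consumer's submission matches stored records."""
    return sum(
        1
        for field, value in supplied.items()
        if field in on_file
        and on_file[field].strip().lower() == value.strip().lower()
    )

def verify_request(supplied: dict, on_file: dict, request_type: str) -> bool:
    """Apply the 2-point (categories) or 3-point (specific data) threshold.

    The signed penalty-of-perjury declaration required for specific-data
    requests is out of scope for this sketch.
    """
    required = 2 if request_type == "categories" else 3
    return count_matching_points(supplied, on_file) >= required
```

Even a sketch like this shows why the standards-based approach is hard to scale: the thresholds are mechanical, but deciding which data points are reliable enough to match against still requires the fact-specific judgment the DOJ declined to supply.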

No Verification of Data Sales Opt-Outs. The regulations require businesses to honor unverified data sale opt-outs. This makes no sense given the over-expansive definition of data sales, which counterintuitively may include valuable and sometimes essential services. As a result, malefactors can weaponize this obligation by submitting opt-out requests in a victim’s name, unexpectedly disrupting the victim’s services and forcing the victim to undo the damage. Businesses may reject opt-outs they think are fraudulent, but they will rely on that exception cautiously.

The Non-Discrimination Provisions. The CCPA’s non-discrimination provisions are baffling. No one really understands exactly what circumstances they govern. The regulations do not fix these problems. In circumstances where a business is engaged in allegedly discriminatory practices, it must document its math using one of 8 different methods. I’m not sure how expensive those methods will be, or how businesses will try to cut corners to save costs.

What’s Missing

Some of the things the regulations should have addressed but didn’t:

Opt-Out Button. The CCPA told the DOJ to develop a standardized button or logo for signaling to consumers how they could opt-out. The DOJ proposed a logo in draft 2 and abandoned it in draft 3 after criticism from all sides. The DOJ apparently has given up on the button.

Applicability to Non-California Companies. The CCPA purports to reach activity outside of California. That looks like extraterritorial reach (a Dormant Commerce Clause problem); and it creates the risk of inconsistent rules as other states adopt CCPA variants. The DOJ should have clarified its enforcement standards against non-California companies.

No Phase-in Period. The CCPA has numerous numerical thresholds for its obligations. The DOJ should have adopted rules permitting companies to phase-in their compliance after reaching the thresholds. Without a phase-in period, companies must anticipatorily comply with the rules and incur the associated costs, even if they never actually reach the thresholds.

What Constitutes “Personal Information.” The second draft clarified when IP addresses wouldn’t be personal information–a helpful clarification given that treating IP addresses as PI creates some of the stickiest compliance challenges. The third draft walked that back.

GDPR Compliance Safe Harbor. The DOJ has rejected any GDPR compliance safe harbor because it says the GDPR isn’t sufficiently protective of Californians. Instead, we pay extra for the cost of dual compliance.


Standards for Disability. The regulations require that online notices comply with the W3C Web Content Accessibility Guidelines, version 2.1 (June 5, 2018). While this adds to the compliance obligations, at least businesses know what they need to do (unlike the ADA jurisprudence for websites). Indeed, I expect websites will implement WCAG across their entire service, not just the privacy policy pages.

Lessons From the Rule-Making Process

It’s too much for any one state to properly develop sensible comprehensive privacy regulations–even a state as large and wealthy as California. We’ve learned that the California legislature and DOJ are overwhelmed by the challenge of properly superintending the CCPA. Consider: the legislature made relatively few and mostly minor amendments to the CCPA in 2019 and will make even fewer in 2020; the legislature can’t resolve key questions such as whether the CCPA applies to employee data; and the DOJ needed two years for its rule-making process (and skipped hard topics like the opt-out button). It’s naive to think that California will eventually get the CCPA “right”; and it’s even more naive to think that smaller states with fewer resources can do better than California. As a result, it’s clear that only Congress and the federal government can properly handle a project of this magnitude, despite their own obvious governance challenges.

The CCPA’s rollout also undermines confidence in our legal institutions more generally. It hurt the rule of law when the CCPA came into effect on January 1 without the required regulations. It further hurt the rule of law when the DOJ promulgated its final draft of massive regulations just a few weeks before enforcement. It also undermines respect for the law when the legislature and the DOJ refuse to acknowledge the pandemic’s devastation, imposing expensive and complex obligations on businesses who are just trying to keep the lights on. This is not the right way to manage the world’s fifth-largest economy.

What’s Next?

Enforcement Actions. After OAL approves the regulations, the DOJ surely plans to bring some enforcement actions ASAP. However, the DOJ must give businesses a 30-day cure period. Perhaps the DOJ has already issued noncompliance notices so that it can expedite the first round of enforcements. Either way, the first wave of public enforcements will be designed for maximum propaganda value. I think the second wave of enforcements will better indicate where the DOJ is investing its enforcement budget.

Court Challenges to the CCPA and Regulations. I’ve heard many rumors of potential legal challenges to the CCPA or the regulations. At this point, I doubt there will be any preemptive strikes. A legal challenge would be costly, time-consuming and not certain to win, and few businesses want to take the publicity hit of being “anti-privacy.” Instead, any court challenges are likely to come from contested enforcements, but most businesses will acquiesce to any DOJ demands rather than litigate. So I have no idea when a judge will review the CCPA.

Further Legislative Amendments. The legislature is working on another package of amendments. I doubt we will see many significant changes. For example, the legislature is likely to punt the employee data issue for another year.

The Next Ballot Initiative. The November ballot will include The California Privacy Rights and Enforcement Act of 2020 (CPRA), a 20k+ word beaut of legislative drafting from the same team that funded the CCPA ballot initiative. Yes, before we know how the CCPA works, and while we are in the middle of a pandemic and a related economic depression that is already devastating small- and medium-sized businesses and the California economy, and while we are focusing on a presidential election that may be our last chance to defeat authoritarianism, the ballot proponents think it’s a great time to abandon the CCPA “experiment,” waste a ton of investments that have been already made by both governments and businesses to accommodate the CCPA, create a brand new government agency from scratch, and make significant parts of consumer privacy law impossible to superintend except through additional ballot initiatives.

The decision to double down on a new initiative before the CCPA even became effective is a profound rejection of legislative governance. It’s also an F-U to the businesses that have worked hard and spent lots of money to jump through the CCPA’s often arbitrary and stupid hoops.

It’s possible that tens of millions of dollars will be wasted on trying to sway voters about the CPRA–especially to help voters understand that the question isn’t “privacy good/bad” but how much CPRA improves the existing CCPA baseline. Obviously the proponents think that voters are still so angry about privacy that they will vote yes on anything pitched as pro-privacy–even a massive dumpster-fire initiative that will make the CCPA look like a small trashcan fire, not the massive dumpster fire it is.

Federal Legislation. We are now guaranteed to live in a dystopian world of privacy regulation: the CCPA is bad enough, but when other legislatures clone-and-revise it, the multi-state regulatory thicket will be overwhelming. Furthermore, given the CPRA’s massive disrespect towards the legislative process, OF COURSE there will be additional privacy-related ballot initiatives in California–creating the possibility of a perpetually moving target and a usurpation of legislative governance.

As a result, the only remaining post-CCPA win left is preemptive federal legislation. Whether or not the CPRA passes, we have to stop the madness of nonstop ballot initiatives; we have to sideline state legislatures who can’t handle the challenges of making and maintaining comprehensive privacy laws; and we have to avoid the multi-state pileup from multiple CCPA variants. A new federal law with preemption is our only hope, and even a bad federal privacy law will be better than the current dystopia we’ve created for ourselves.

Unfortunately, Congress has too many other dumpster fire-level problems, so Congress won’t act until we experience more terrible privacy regulatory outcomes. So as bad as things are with the CCPA, we haven’t hit bottom yet.

Prior CCPA Posts

The CCPA Proposed Regs’ Data Valuation Calculation Provisions Provide Flexibility, But Raise Ambiguity & Transparency Concerns (guest blog post)
My Third Set of Comments to the CA DOJ on the CCPA Regulations
Comments on the DOJ’s Proposed Modifications to the CCPA Regulations
Eric Goldman’s Comments to the California DOJ Draft Regulations for the Consumer Privacy Act (CCPA) (Part 3 of 3)
Some Lessons Learned from the California Consumer Privacy Act (CCPA), 18 Months In (Part 2 of 3)
Resetting the California Consumer Privacy Act (CCPA)…with 2 Weeks To Go! (Part 1 of 3)
And At the End of the Day, the CCPA Remains Very Much the Same (Guest Blog Post)
A Recap of the Senate Judiciary Committee Hearing on Amending the California Consumer Privacy Act (Guest Blog Post)
Want Companies to Comply with the CCPA? Delay Its Effective Date (Guest Blog Post)
Recap of the California Assembly Hearing on the California Consumer Privacy Act
A Status Report on the California Consumer Privacy Act
41 California Privacy Experts Urge Major Changes to the California Consumer Privacy Act
California Amends the Consumer Privacy Act (CCPA); Fixes About 0.01% of its Problems
Recent Developments Regarding the California Consumer Privacy Act
The California Consumer Privacy Act Should Be Condemned, Not Celebrated
A First (But Very Incomplete) Crack at Inventorying the California Consumer Privacy Act’s Problems
Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional (Guest Blog Post)
A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet
An Introduction to the California Consumer Privacy Act (CCPA)