California Amends the Consumer Privacy Act (CCPA); Fixes About 0.01% of its Problems
Recently, Gov. Brown signed SB 1121, the first of possibly several amendments designed to fix and rehabilitate the California Consumer Privacy Act (CCPA). Here is the complete statute as revised. I prepared a redline showing the amendments.
The amendments make a number of beneficial changes, and on balance it’s a positive development. However, these amendments barely scratch the surface of the necessary changes, and I’m baffled by how the legislature prioritized which problems to address now vs. later. As you’ll see, there remain numerous outright typos in the amended law, plus a seemingly infinite number of conceptual problems.
This post will recap what changed, what still needs to change, and what’s next.
Highlights of the Changes
There are numerous tiny fixes in the bill, such as the correction of “business'” to “business’s” and the deletion of obvious surplusage after 1798.100(e). Here are some of the bigger changes, many of which are just clarifications:
- The amendments more expressly exclude noncommercial activity from the law (new 1798.145(k)). I think this was clear before, but I’m glad it’s even more express.
- The amendments more expressly say that there is no private cause of action for any CCPA violation other than data security breaches (1798.150(c)). I also think this was clear before, but still, I’m glad to make it extra-clear. Note that the AG’s office supports a wider private right of action, but this amendment didn’t address the request.
- The amendments refined the interplay between the CCPA and HIPAA (1798.145(c)). Among other things, the amendments excluded “business associates” under HIPAA, in addition to “covered entities,” from the CCPA. The uneasy fit with HIPAA was a known problem with the initial law, so this is a beneficial change.
- The amendments also clarified the CCPA’s inapplicability to the financial sector, also a positive change (1798.145(e)).
- The amendments remove the need for private litigants to tender their lawsuit to the AG before suing and the AG’s ability to nix the lawsuit (1798.155(b)). The AG’s ability to kill a lawsuit was likely unconstitutional, so removing that provision was a good idea. Among others, AG Becerra had asked for this change.
- The amendments made it clearer that the specifically identified list of things that qualify as personal information only count if they meet the general definition of personal information (1798.140(o)). This is a helpful clarification, but it doesn’t change much given the law’s massively overbroad definition of personal information. That definition remains my #1 priority for substantive improvements.
- The amendments deleted a reference to B&P 17206, which had details about damage-setting for violations (1798.150(b)). The deletion means that the law is self-contained for calculating damages for violations.
- The amendments changed who gets the revenue from CCPA enforcement actions (1798.155(c)). The law as passed put 20% of enforcement proceeds into a dedicated fund for CCPA enforcement, with the remainder presumably going to the general fund. The amendments allocate 100% of enforcement proceeds into the dedicated fund. This raises a number of questions. Will the fact that the AG’s office keeps all the money spur more enforcements and turn the law into a perpetual motion machine? Does this reallocation satisfy the AG’s concern that the legislature didn’t provide enough money to the AG’s office to do the work? Will the fact that 100% of enforcement revenues flow back to the AG’s office motivate it to prioritize enforcements over other ways it could promote privacy or support the community? And which government functions “lost” the 80% revenue stream, and what do they think of this change?
- The amendments preempt conflicting local laws effective immediately (new 1798.199) but delay AG enforcement as much as six months, to July 1, 2020 (or sooner, if the AG gets its regs out before January 1, 2020) (1798.185(c)). Delaying the law’s effectiveness up to 6 months is good, but this fix doesn’t help very much. Businesses deserve at least a full year to address the AG’s regs, which could affect the law’s meaning in substantial ways, but at most they will have 6 months. It’s possible (perhaps unlikely) that the AG will not complete the regs before July 1, 2020, but I suspect the legislature would intervene if that happens. Still, the potentially quick turnarounds could lead to lots of avoidable drama and unnecessary costs. Note that the CCPA requires a 12-month lookback on disclosures of certain data, which means that if the law comes into effect July 1, 2020, businesses will need to have their data architecture in place by July 1, 2019 to accommodate the lookback, so businesses are spending money today that might be mooted by future developments.
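The lookback arithmetic above can be made concrete with a trivial sketch (the function name is my own invention; the dates come from the scenario described in the bullet, and leap-day edge cases are ignored):

```python
from datetime import date

def lookback_start(effective: date) -> date:
    # The CCPA requires disclosures covering the 12 months preceding the
    # effective date, so data architecture must be live a full year early.
    return effective.replace(year=effective.year - 1)

# If enforcement begins July 1, 2020, the lookback reaches back to July 1, 2019.
print(lookback_start(date(2020, 7, 1)))  # 2019-07-01
```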
What Still Needs to Change
Here’s a very non-exhaustive list of changes the legislature still needs to address. This section is principally derived from my prior roundup of problems and additional points I made in my August post:
1798.120(c): The language is inconsistent about 16 year olds (or, if you read the restriction as applying only to 14 and 15 year olds, then it’s inconsistent about 13 year olds): “a business shall not sell the personal information of consumers if the business has actual knowledge that the consumer is less than 16 years of age, unless the consumer, in the case of consumers between 13 and 16 years of age, or the consumer’s parent or guardian, in the case of consumers who are less than 13 years of age, has affirmatively authorized the sale of the consumer’s personal information.” SB 1121 proposes to renumber this section as 1798.120(c) but doesn’t fix the drafting error.
1798.140(b): “an individual’s deoxyribonucleic acid (DNA)” isn’t data, it’s material from which data can be obtained.
1798.140(k): The term “health insurance information” is defined but never used.
1798.140(o)(2): I think the language “‘Publicly available’ does not include consumer information that is deidentified or aggregate consumer information” meant “personal information,” not “publicly available.” After all, it’s in the definition of “personal information” and the exclusion of deidentified or aggregate consumer information from “publicly available” is too narrow an exclusion.
SCU Privacy Law Fellow Lydia de la Torre disagrees with me. She points out that 1798.145(a)(5) excludes deidentified and aggregate consumer information from most obligations in the CCPA, so an exclusion in 140(o)(2) is redundant/unnecessary. But this doesn’t make any sense either, because if 145(a)(5) excludes “deidentified and aggregate consumer information,” then why would it matter if they are in or out of “publicly available”? Either way, 145(a)(5) would moot that characterization. So if my initial assessment isn’t correct, I have no idea what the language is supposed to be.
Again, 1798.140(o)(2): Publicly available information is defined as “information that is lawfully made available from federal, state, or local government records, if any conditions associated with such information.” The final clause (“if any conditions associated with such information”) is clearly missing words or should be deleted.
1798.105(d)(7) allows a business to reject a deletion request to “enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.” Will any business risk relying on this exception, or will everything need to be based on consumer consent?
This one is from AB 375’s legislative history: “Section 1798.125, where these anti-discrimination and incentive provisions reside, is internally inconsistent to a certain extent. It specifically prohibits “charging different prices or rates for goods or services.” But, it also specifically authorizes in the following paragraph “charging a consumer a different price or rate.” The same tension exists for “providing a different level or quality of goods or services.””
1798.125(b)(1) includes the sentence: “A business may also offer a different price, rate, level, or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data.” Is the first “consumer” in the phrase “the value provided to the consumer by the consumer’s data” supposed to be “business”?
More generally, I am uncertain whether 1798.125 applies to price discrimination in all contexts or only restricts pay-for-privacy arrangements. Even charging different prices for different geographies bases the price distinction on “personal information,” i.e., location information. There are exclusions when such price discrimination is allowed, but if the law nominally regulates every form of differential pricing, that will stun the entire business community.
1798.130(a)(2) is supposed to be the data portability provision. However, it only covers the prior 12 months, so anyone holding an account for more than 12 months will only get partial portability. Lydia thinks consumers can take advantage of 1798.100(d), which isn’t limited to 12 months. Does 130(a)(2) qualify 100(d), or is each an independent disclosure obligation?
1798.140(c)(1) establishes numerical thresholds for enforcement: $25M/yr revenue, 50%+ revenue from data brokerage, or personal information for 50k/yr consumers. Given that in most cases these numbers are not publicly reported, how will the California AG’s office, the enforcement entity, know if a business has cleared these thresholds or not? If the AG’s office has no idea, how will that affect its enforcement discretion?
Also on 1798.140(c)(1), the law doesn’t specify if the $25M threshold or 50,000 consumers must all take place in California, or involve California consumers, or if they are enterprise-wide. If it’s the latter, a business with $25M revenue worldwide theoretically must comply with the law the moment it makes a single sale to California (there may still be personal jurisdiction limits).
Also on 1798.140(c)(1), once a business clears one of the thresholds, the law doesn’t provide any further grace period for compliance and the business must instantly comply with the entire law. The law could say that compliance must be done at some future date (e.g., by the end of the next calendar year after a business clears a threshold), but that’s not what it currently says. And if the business’s numbers spiked one year and it has good reasons to believe that the threshold will not be met again in the foreseeable future, then the business must still implement the law for the possibly-very-brief spike period. I’m sure the AG’s office will exercise some discretion in some of these circumstances, but this seems like an excellent place for the AG’s office to provide some regulations.
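The 1798.140(c)(1) coverage test reduces to a simple disjunction of the three thresholds. Here is a minimal sketch (illustrative only, not legal advice; the function name and parameter names are my own, and the statute leaves the California-vs.-worldwide scope of the figures ambiguous, as noted above):

```python
def meets_ccpa_threshold(annual_revenue: float,
                         brokerage_revenue_share: float,
                         consumers_per_year: int) -> bool:
    """A business is covered if it clears ANY one of the three tests."""
    return (annual_revenue > 25_000_000          # $25M/yr gross revenue
            or brokerage_revenue_share >= 0.50   # 50%+ revenue from selling personal info
            or consumers_per_year >= 50_000)     # personal info of 50k+ consumers/yr

print(meets_ccpa_threshold(30_000_000, 0.0, 0))   # True: revenue alone triggers coverage
print(meets_ccpa_threshold(1_000_000, 0.1, 100))  # False: clears no threshold
```

Note that nothing in the sketch answers the enforcement question: the AG’s office generally cannot observe any of these three inputs from public records.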
1798.140(c)(2) defines business to include: “Any entity that controls or is controlled by a business, as defined in paragraph (1), and that shares common branding with the business. “Control” or “controlled” means ownership of, or the power to vote, more than 50 percent of the outstanding shares of any class of voting security of a business; control in any manner over the election of a majority of the directors, or of individuals exercising similar functions; or the power to exercise a controlling influence over the management of a company. “Common branding” means a shared name, servicemark, or trademark.”
This definition seemingly means that parents and subsidiaries of a company must honor opt-in/opt-out/erasure/disclosure requests across the entire enterprise if they have a “shared name, servicemark, or trademark.” What does that mean? If a company has three different subsidiaries, named Brand California, Brand New York, and Brand Europe, must all of these entities honor opt-in/opt-out/erasure/disclosure requests as if they were a single integrated entity? If so, that will blindside the many companies that do not have an integrated data architecture across subsidiary lines. Or will the fact that the brand isn’t identical (Brand California isn’t an identical brand as Brand New York) avoid this issue?
Lydia raised the question of whether the definition reaches non-profit subsidiaries of for-profit businesses. For example, if a for-profit company owns a non-profit company that acts as its charitable arm, it appears the non-profit company may qualify as an “entity.” Fixing this latter problem would be as simple as changing “entity” to “business,” but that wouldn’t fix the underlying concerns about cross-enterprise obligations.
Lydia also raised the question of whether 140(c)(2) treats intra-enterprise data transfers as transfers to third parties, or if the definition of “business” includes the entire enterprise as an integrated unit. The statutory language on this point is inscrutable.
1798.140(e) defines “collects,” “collected,” or “collection” as “buying, renting, gathering, obtaining, receiving, or accessing any personal information pertaining to a consumer by any means. This includes receiving information from the consumer, either actively or passively, or by observing the consumer’s behavior.” Let’s consider an example. 140(o)(1)(H) says personal information includes “olfactory information.” So, in theory, when an employee passively smells a customer in the ordinary course of business (if you’ve ever worked retail, you know exactly what I’m talking about), this would constitute a “collection” that needs to be disclosed and would possibly trigger other obligations. Really? I think the definition was meant to say that it’s collected only when it’s stored in a tangible medium, such as paper or electronic records. Even then, applying the law to paper records would be miserable for the applicable business.
1798.140(o)(1) says personal information can refer to “households,” but the term “households” is never defined and could be located outside California or include non-California residents. This would be a good example of how the law could reach extra-territorially for Dormant Commerce Clause purposes. Lydia thinks all references to “household” should be deleted (or at least tied more closely to California). The same concerns arise with 140(x)’s references to “family” and the numerous references to “devices.”
1798.140(o)(2) says government data “used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained” drops out of the exclusion for “publicly available” data and just becomes “personal information.” How will third party transferees of this data recognize this?
1798.140(t)(2) unhelpfully cross-references 1798.110 and 1798.115 for information that they don’t contain and raises some questions about how the provisions would be implemented on acquisition.
1798.140(v) defines “service providers” to include only for-profit businesses. Does this give non-profit businesses an advantage, or put them at a disadvantage? For example, non-profit service providers are classified as “third parties,” with different rights and obligations.
1798.145(a) says “The obligations imposed on businesses by this title shall not restrict a business’s ability to…[numerous exclusions].” These exclusions should apply to everyone, not just “businesses” as defined by the statute. For example, the law imposes restrictions on “service providers” and other third parties who should equally qualify for the exclusions.
1798.150(a)(1) creates the private cause of action for data security breaches in the case of “unauthorized access and exfiltration, theft, or disclosure as a result of the business’ violation of the duty to implement and maintain reasonable security procedures and practices.” This grammar is hard to parse. Is “unauthorized access” a precondition for all elements, or only the exfiltration? Does the unauthorized modifier extend to disclosure, or could the cause of action apply to authorized disclosures?
Again in 1798.150(a)(1), who defines what constitutes a “reasonable security procedure and practice”? Will it be AG regulations? Will there be safe harbors based on industry standards or other published specs? Or will it be completely judge-made? Judge-made legal standards pose numerous problems. Plaintiffs can allege unreasonable security procedures/practices after every data security breach and, in theory, at least survive a motion to dismiss. Furthermore, precedent will take years to develop and even longer to change, so by the time a court declares a specific practice reasonable or unreasonable, there will be years of accumulated breaches that may become newly actionable–giving plaintiffs their choice of defendants to pursue.
Again with 1798.150(a)(1): does the statute create a cause of action in some circumstances where there’s actually no obligation to notify consumers of the breach?
1798.150(a)(1)(A) imposes statutory damages of $750 “per consumer per incident.” What is a security “incident”? If a hacker is able to get into a network multiple times, does each intrusion count as an “incident”? If the exfiltrated data is sold to multiple different buyers, are those multiple “incidents”?
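The stakes of the “incident” ambiguity are easy to quantify. A sketch of the statutory-damages arithmetic (the function name is my own; only the $750 figure comes from the statute, which also allows actual damages if greater — that branch is omitted here):

```python
def statutory_damages(consumers: int, incidents: int) -> int:
    # 1798.150(a)(1)(A): $750 per consumer per incident.
    return consumers * incidents * 750

# For a breach touching 100,000 consumers, whether three intrusions count as
# one "incident" or three changes the exposure from $75M to $225M.
print(statutory_damages(100_000, 1))  # 75000000
print(statutory_damages(100_000, 3))  # 225000000
```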
1798.155(a) still requires the DOJ to give an opinion to anyone who seeks guidance about compliance, something California AG Xavier Becerra objects to.
General: for consumers who never signed up with the business, how will the business provide them with opt-out/disclosure/erasure rights?
General: in a related issue, it’s unclear when the law applies to adtech companies or how they would comply with it. This Digiday article explains more.
General: does the erasure obligation give students unhappy with a course grade the ability to erase it from edtech companies?
General: in several places, the law provides instructions for resolving “conflicts” with other laws. But when do laws “conflict”? This isn’t a new legal question or specific to CCPA, but it will create plenty of uncertainty and countless potential edge cases.
In light of the overlap, the legislature should repeal the private right of action in the existing data breach law in 1798.80. Similarly, the legislature should repeal the existing Shine the Light law (1798.83 and associated sections) due to the overlap and inconsistencies.
According to Bloomberg, AG Becerra’s office says it hopes to release regulations in June 2019. I love that optimism, but I’ll believe it when I see it. I still have no idea who has been tasked with leading this project in the DOJ; and last I heard, the DOJ was still hiring for the team. If the regs are actually released in June 2019, the law would start Jan. 1, 2020, and the amendment’s extra 6 month delay would be negated. The Bloomberg article also indicated that A.B. 1828 and S.B. 862 were expected to provide the DOJ with $700k and 5 new staff members to work on the regulations, but I couldn’t parse those bills to confirm or refute this.
There has been a lot of discussion about the possibility of other states cloning-and-revising the CCPA, but I haven’t heard any specific initiatives yet. It would be a terrible decision for any state to start with this garbage, especially in its current status before California does the hard work to substantively improve it.
There have been a lot of conversations in Congress and the federal executive branch about dealing with the mess made by the CCPA. I’m not sure if anything can get through this Congress, but California’s mistakes give the effort a lot more impetus than I would have imagined.
Shortly after passage, there was some backroom chatter about constitutional challenges to the law. At least in my circles, that conversation seems to have dried up. As far as I can tell, some of the major companies have decided to take the fight to Congress and/or to acquiesce and comply with the CCPA as best they can while working to improve it. I think the brand risks of challenging the law are just too high. As a result, it’s possible the most likely constitutional plaintiffs have been co-opted into the CCPA and moved their attention elsewhere. That leaves the less-well-funded businesses to fend for themselves.
* Recent Developments Regarding the California Consumer Privacy Act
* The California Consumer Privacy Act Should Be Condemned, Not Celebrated
* A First (But Very Incomplete) Crack at Inventorying the California Consumer Privacy Act’s Problems
* Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional (Guest Blog Post)
* A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet
* An Introduction to the California Consumer Privacy Act (CCPA)