A First (But Very Incomplete) Crack at Inventorying the California Consumer Privacy Act’s Problems
If you haven’t seen it, I summarized the California Consumer Privacy Act in a 3,000-word primer. If you aren’t familiar with the law, read that first.
This post addresses the law’s multitudinous errors and major ambiguities. The list in this post represents a mix of errors I’ve found plus many crowdsourced suggestions (finding all of the law’s issues is more than any one person could do in a lifetime). However, this post isn’t a comprehensive inventory of the law’s problems. Feel free to send additional suggestions my way.
A technical corrections bill, SB 1121, is pending in the California legislature. Also see the associated legislative history, which is more insightful than the online bill text. I’ll be contacting the legislature to identify the needed changes enumerated in this post. If you’re involved in the backroom conversations, I’d be grateful for extra help getting this post in front of the legislators.
1798.100(e) has two subclauses, (1) and (2), that are missing an introductory sentence. SB 1121 would delete the subclauses.
1798.120(d): The language is inconsistent about 16 year olds (or, if you read the restriction as applying only to 14 and 15 year olds, then it’s inconsistent about 13 year olds): “a business shall not sell the personal information of consumers if the business has actual knowledge that the consumer is less than 16 years of age, unless the consumer, in the case of consumers between 13 and 16 years of age, or the consumer’s parent or guardian, in the case of consumers who are less than 13 years of age, has affirmatively authorized the sale of the consumer’s personal information.” SB 1121 proposes to renumber this section as 1798.120(c) but doesn’t fix the drafting error.
1798.140(b): “an individual’s deoxyribonucleic acid (DNA)” isn’t data, it’s material from which data can be obtained.
1798.140(o)(2): I think the language “‘Publicly available’ does not include consumer information that is deidentified or aggregate consumer information” meant “personal information,” not “publicly available.” After all, it’s in the definition of “personal information” and the exclusion of deidentified or aggregate consumer information from “publicly available” is too narrow an exclusion.
SCU Privacy Law Fellow, Lydia de la Torre, disagrees with me. She points out that 1798.145(a)(5) excludes deidentified and aggregate consumer information from most obligations in the CCPA, so an exclusion in 140(o)(2) is redundant/unnecessary. But this doesn’t make any sense either, because if 145(a)(5) excludes “deidentified and aggregate consumer information,” then why would it matter if they are in or out of “publicly available”? Either way, 145(a)(5) would moot that characterization. So if my initial assessment isn’t correct, I have no idea what the language is supposed to be.
Again, 1798.140(o)(2): Publicly available information is defined as “information that is lawfully made available from federal, state, or local government records, if any conditions associated with such information.” The final clause (“if any conditions associated with such information”) is clearly missing words.
1798.180 says: “This title is a matter of statewide concern and supersedes and preempts all rules, regulations, codes, ordinances, and other laws adopted by a city, county, city and county, municipality, or local agency regarding the collection and sale of consumers’ personal information by a business.” I believe parts of this language, such as the overlapping list of local government entities, are redundant…?
SB 1121’s legislative history has an extensive list of other typos.
1798.105(d)(7) allows a business to reject a deletion request to “enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business.” Will any business risk relying on this exception, or will everything need to be based on consumer consent?
This one is from AB 375’s legislative history: “Section 1798.125, where these anti-discrimination and incentive provisions reside, is internally inconsistent to a certain extent. It specifically prohibits ‘charging different prices or rates for goods or services.’ But, it also specifically authorizes in the following paragraph ‘charging a consumer a different price or rate.’ The same tension exists for ‘providing a different level or quality of goods or services.’”
1798.125(b)(1) includes the sentence: “A business may also offer a different price, rate, level, or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data.” Is the first “consumer” in the phrase “value provided to the consumer by the consumer’s data” supposed to be “business”?
More generally, I am uncertain whether 1798.125 applies to price discrimination in all contexts or only restricts pay-for-privacy arrangements. Even charging different prices for different geographies bases the price distinction on “personal information,” i.e., location information. The law contains exclusions permitting some price discrimination, but if it nominally regulates every form of differential pricing, that will stun the entire business community.
1798.130(a)(2) is supposed to be the data portability provision. However, it only covers the prior 12 months, so anyone holding an account for more than 12 months will only get partial portability. Lydia thinks consumers can take advantage of 1798.100(d), which isn’t limited to 12 months. Does 130(a)(2) qualify 100(d), or is each an independent disclosure obligation?
1798.140(c)(1) establishes numerical thresholds for applicability: $25M+ in annual revenue, 50%+ of revenue from selling consumers’ personal information, or handling the personal information of 50,000 or more consumers per year. Given that in most cases these numbers are not publicly reported, how will the California AG’s office, the enforcement entity, know whether a business has cleared these thresholds? If the AG’s office has no idea, how will that affect its enforcement discretion?
Also on 1798.140(c)(1), the law doesn’t specify if the $25M threshold or 50,000 consumers must all take place in California, or involve California consumers, or if they are enterprise-wide. If it’s the latter, a business with $25M revenue worldwide theoretically must comply with the law the moment it makes a single sale to California (there may still be personal jurisdiction limits).
Also on 1798.140(c)(1), once a business clears one of the thresholds, the law doesn’t provide any further grace period for compliance and the business must instantly comply with the entire law. The law could say that compliance must be done at some future date (e.g., by the end of the next calendar year after a business clears a threshold), but that’s not what it currently says. And if the business’ numbers spiked one year and it has good reasons to believe that the threshold will not be met again in the foreseeable future, then the business must still implement the law for the possibly-very-brief spike period. I’m sure the AG’s office will exercise some discretion in some of these circumstances, but this seems like an excellent place for the AG’s office to provide some regulations.
1798.140(c)(2) defines business to include:
Any entity that controls or is controlled by a business, as defined in paragraph (1), and that shares common branding with the business. “Control” or “controlled” means ownership of, or the power to vote, more than 50 percent of the outstanding shares of any class of voting security of a business; control in any manner over the election of a majority of the directors, or of individuals exercising similar functions; or the power to exercise a controlling influence over the management of a company. “Common branding” means a shared name, servicemark, or trademark.
This definition seemingly means that parents and subsidiaries of a company must honor opt-in/opt-out/erasure/disclosure requests across the entire enterprise if they have a “shared name, servicemark, or trademark.” What does that mean? If a company has three different subsidiaries, named Brand California, Brand New York, and Brand Europe, must all of these entities honor opt-in/opt-out/erasure/disclosure requests as if they were a single integrated entity? If so, that will blindside the many companies that do not have an integrated data architecture across subsidiary lines. Or will the fact that the brands aren’t identical (Brand California isn’t the same brand as Brand New York) avoid this issue?
Lydia raised the question of whether the definition reaches non-profit subsidiaries of for-profit businesses. For example, if a for-profit company owns a non-profit company that acts as its charitable arm, it appears the non-profit company may qualify as an “entity.” Fixing this latter problem would be as simple as changing “entity” to “business,” but that wouldn’t fix the underlying concerns about cross-enterprise obligations.
Lydia also raised the question of whether 140(c)(2) treats intra-enterprise data transfers as transfers to third parties, or if the definition of “business” includes the entire enterprise as an integrated unit. The statutory language on this point is inscrutable.
1798.140(e) defines “collects,” “collected,” or “collection” as “buying, renting, gathering, obtaining, receiving, or accessing any personal information pertaining to a consumer by any means. This includes receiving information from the consumer, either actively or passively, or by observing the consumer’s behavior.” Let’s consider an example. 140(o)(1)(H) says personal information includes “olfactory information.” So, in theory, when an employee passively smells a customer in the ordinary course of business (if you’ve ever worked retail, you know exactly what I’m talking about), this would constitute a “collection” that needs to be disclosed and would possibly trigger other obligations. Really? I think the definition was meant to say that it’s collected only when it’s stored in a tangible medium, such as paper or electronic records. Even then, applying the law to paper records would be miserable for the applicable business.
1798.140(o)(1) says personal information can refer to “households,” but the term “households” is never defined, and a household could be located outside California or include non-California residents. This would be a good example of how the law could reach extra-territorially for Dormant Commerce Clause purposes. Lydia thinks all references to “household” should be deleted (or tied more clearly to California). The same concerns arise with 140(x)’s references to “family” and the numerous references to “devices.”
1798.140(o)(2) says government data “used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained” drops out of the exclusion for “publicly available” data and just becomes “personal information.” How will third party transferees of this data recognize this?
1798.140(v) defines “service providers” to include only for-profit businesses. Does this give non-profit businesses an advantage, or put them at a disadvantage? For example, non-profit service providers are classified as “third parties,” with different rights and obligations.
1798.145(a) says “The obligations imposed on businesses by this title shall not restrict a business’s ability to…[numerous exclusions].” These exclusions should apply to everyone, not just “businesses” as defined by the statute. For example, the law imposes restrictions on “service providers” and other third parties who should equally qualify for the exclusions.
1798.150(a)(1) creates the private cause of action for data security breaches in the case of “unauthorized access and exfiltration, theft, or disclosure as a result of the business’ violation of the duty to implement and maintain reasonable security procedures and practices.” This grammar is hard to parse. Is “unauthorized access” a precondition for all elements, or only the exfiltration? Does the unauthorized modifier extend to disclosure, or could the cause of action apply to authorized disclosures?
Again in 1798.150(a)(1), who defines what constitutes a “reasonable security procedure and practice”? Will it be AG regulations? Will there be safe harbors based on industry standards or other published specs? Or will it be completely judge-made? Judge-made legal standards pose numerous problems. Plaintiffs can allege unreasonable security procedures/practices after every data security breach and, in theory, at least survive a motion to dismiss. Furthermore, precedent will take years to develop and even longer to change, so by the time a court declares a specific practice reasonable or unreasonable, there will be years of accumulated breaches that may become newly actionable, giving plaintiffs their choice of defendants to pursue.
Again with 1798.150(a)(1): does the statute create a cause of action in some circumstances where there’s actually no obligation to notify consumers of the breach?
1798.150(a)(1)(A) imposes statutory damages of $750 “per consumer per incident.” What is a security “incident”? If a hacker is able to get into a network multiple times, does each intrusion count as an “incident”? If the exfiltrated data is sold to multiple different buyers, are those multiple “incidents”?
General: for consumers who never signed up with the business, how will the business provide them with opt-out/disclosure/erasure rights?
General: in a related issue, it’s unclear when the law applies to adtech companies or how they would comply with it. This Digiday article explains more.
General: does the erasure obligation give students unhappy with a course grade the ability to erase it from edtech companies?
General: in several places, the law provides instructions for resolving “conflicts” with other laws. But when do laws “conflict”? This isn’t a new legal question or specific to CCPA, but it will create plenty of uncertainty and countless potential edge cases.
For more issues, see CDT’s annotated statute.

Related posts:
* Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional (Guest Blog Post)
* A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet
* An Introduction to the California Consumer Privacy Act (CCPA)