Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional (Guest Blog Post)

By guest blogger Jeff Kosseff

[Jeff Kosseff is an assistant professor of cybersecurity law at the U.S. Naval Academy. The views in this post are only his, and do not represent the Naval Academy, Department of Navy, or Department of Defense. He can be reached at jkosseff@gmail.com.]

When it comes to new technology regulations, I try to avoid alarmism. In many cases, the new regulations are necessary or, at the very least, helpful on balance. For instance, I hope that Congress passes comprehensive cybersecurity legislation that addresses modern threats, such as the Internet of Things. And last year, I testified to the House Judiciary Committee that narrowly amending the platform immunity of Section 230 of the Communications Decency Act to address online sex trafficking would not destroy the Internet.

So when I heard that the California legislature was considering a data protection bill last month, I wasn’t worried. Then I read the 10,000-plus-word bill, and my initial reaction was that nothing this unclear, burdensome, and constitutionally problematic could ever become law. (Eric laid out similar concerns.) Yet the California legislature passed it, and Governor Brown signed it into law. More than any other state or federal technology law, the California Consumer Privacy Act (CCPA) will upend the operations of not only technology companies, but nearly any company that processes data of California residents (i.e., the vast majority of U.S. companies).

This post summarizes 10 of my top concerns with CCPA. This is far from an exhaustive list; we all are still digesting the impacts of this massive statute.

  1. Companies just spent two years – and billions of dollars – on GDPR

In 2016, the European Union passed a sweeping new law, the General Data Protection Regulation (GDPR). Companies have spent billions of dollars assessing whether they have to comply and, if so, how to do so. If companies did not have operations in Europe, they still had to abide by the GDPR’s 99 articles if they monitored the behavior of people in Europe or sold goods or services to people located in Europe (whether a company triggers either of those prongs requires application of a legal rule that has created great uncertainty for companies based outside of Europe).

GDPR went into effect on May 25, 2018. The CCPA, which goes into effect on Jan. 1, 2020, requires companies to engage in 18 more months of data flow analysis and regulatory compliance. While I suspect that large companies like Google can shoulder this burden, small and mid-sized companies will have a tough time finding yet more legal and information technology resources for another data protection law. Even if a company is complying with GDPR (or at least attempting to comply), there is no guarantee that the California requirements will entirely overlap.

  2. Many small companies must comply with CCPA

Perhaps recognizing the tremendous burden of CCPA compliance, the drafters attempted to exclude small businesses. However, the exemption applies only to businesses with gross annual revenues under $25 million. Many start-ups that are in growth mode will not fall under the exemption.

Moreover, the “small business” exception will not apply if a company buys, receives, sells, or shares for commercial purposes the personal information of 50,000 California residents in a year, nor will it apply if 50 percent or more of the company’s revenues come from selling personal information. Consumer-facing small businesses, such as retailers, may well receive the personal information of at least 50,000 California residents a year. The 50,000-resident carve-out ignores the reality that many small companies that have nothing to do with consumer marketing or analytics may receive records of tens of thousands of Californians as a matter of course.

  3. Many companies’ systems are not set up for CCPA

The core of CCPA requires companies to allow California residents to request that businesses disclose “the categories and specific pieces of personal information the business has collected.” If the business receives such a request, CCPA requires the business to “promptly take steps to disclose and deliver, free of charge to the consumer, the personal information” required by CCPA. CCPA also requires companies to allow residents to request deletion of personal information that the companies collected from the residents.

As companies experienced during the hectic months before GDPR went into effect, disclosure, access, and deletion rights do not simply materialize once companies declare those rights in their privacy policies. Disclosures require companies to conduct a comprehensive analysis of their data inventories and data flows. GDPR’s access and deletion rights require companies to configure their systems and databases to access all information about a particular individual. Companies often have a single individual’s information across many legacy systems. CCPA imposes similar requirements for data that a company has collected from the individual. As companies learned with GDPR, such rights may require complete overhauls of their systems. For a mid-stage growth company with, say, $50 million in revenues that is not yet in the black, a comprehensive overhaul of its consumer information systems simply is not feasible.

Even for companies that currently comply with GDPR (and I am skeptical when a company claims to be in full compliance, as many terms remain unclear), CCPA will impose additional burdens, because U.S. data often is handled differently from European data, processed by different contractors than in the European Union, and stored on different systems.

  4. CCPA fails to clearly define the roles of controllers and processors

For every company with whom a consumer has a relationship, there might be dozens of companies that access the consumer’s personal data, such as cloud providers and backend office outsourcers. GDPR recognizes this reality and distinguishes between a controller (the company that “determines the purposes and means of the processing of personal data”) and a processor (a company that “processes personal data on behalf of the controller”).

GDPR delegates duties to both controllers and processors, and requires contracts between the controllers and processors to specify particular duties, such as the processor’s obligation to maintain appropriate data security safeguards and to assist the controller with regulatory inquiries.

CCPA, on the other hand, refers not to processors, but “service providers.” CCPA requires, for example, companies to “direct any service providers to delete the consumer’s personal information from their records.” Unlike GDPR, CCPA does not contain many explicit requirements for a contractual relationship with a service provider; a contract between a company and service provider need only specify the purpose of processing and prohibit the provider from using personal information for other commercial purposes.

Because CCPA requires companies to “direct” their service providers to comply with the law, companies will need to engage in a new round of contractual negotiations with every service provider that processes California resident information. This comes after spending two years negotiating GDPR terms. Ask any controller or processor that has negotiated GDPR contracts: it is a massive task that results in significant legal fees, as controllers and processors allocate risks and indemnity. Negotiating CCPA contracts will be even more difficult, as the CCPA fails to provide many of the baseline contractual requirements of GDPR. And companies likely cannot rely on their GDPR addenda, as those contracts typically apply only to EU data and operations. CCPA surely will be a windfall for privacy and corporate lawyers.

  5. CCPA’s sales restrictions raise First Amendment concerns

In Sorrell v. IMS Health (2011), the Supreme Court held that Vermont violated the First Amendment by restricting the sale or disclosure of records of a doctor’s prescription habits for marketing purposes without the doctor’s consent. The Court found the Vermont law to be a content-based restriction of commercial speech because it prohibited the disclosure for marketing, but not for other purposes, such as “educational communications.” The Court pointed to the statute’s legislative history, and Vermont lawmakers’ stated goal of cracking down on “detailing,” a marketing practice in which drugmakers attempt to promote their prescription medications to doctors in person. “The capacity of technology to find and publish personal information, including records required by the government, presents serious and unresolved issues with respect to personal privacy and the dignity it seeks to secure,” Justice Kennedy wrote for a six-justice majority. “In considering how to protect those interests, however, the State cannot engage in content-based discrimination to advance its own side of a debate.”

CCPA is more expansive than the Vermont law in Sorrell, covering personal information across industries. Among its many requirements, CCPA requires companies to notify consumers of the sale of their personal information to third parties, and to allow consumers to opt out of the sale. However, CCPA exempts “third parties” from coverage if they agree in a contract to process the personal information only for the purposes specified by the company and do not sell the information. Although CCPA restricts a wider range of third-party activities than the Vermont statute, it still leaves the door open for some third parties to be excluded from the disclosure restrictions, provided that their contracts with companies are broadly written and they do not sell the data. For instance, imagine a contract that allows a data recipient to conduct a wide range of “analytics.” Because the recipient is not selling the data, the company might be able to disclose personal information to that recipient without honoring an opt-out request.

Under Sorrell, such distinctions might lead a court to conclude that CCPA imposes a content-based restriction on speech. Moreover, the findings and declarations section of the bill cites the revelations about Cambridge Analytica’s use of Facebook user data, and states “[a]s a result, our desire for privacy controls and transparency in data practices is heightened.” This could cause a court to conclude that the legislature was targeting a particular type of business arrangement when it passed CCPA.

To be sure, the Vermont law was narrowly focused on detailers, and CCPA has a broader sweep. Therefore, the Vermont law on balance probably had more constitutional defects than the CCPA. But there is a reasonable argument that CCPA targets large platforms such as Facebook, and in doing so imposes substantial burdens on speech.

  6. CCPA opens the door to 49 more overlapping (and conflicting) state data protection laws, raising Dormant Commerce Clause concerns

Perhaps the most significant constitutional problems with CCPA come not from the First Amendment, but from the Dormant Commerce Clause, which prohibits states from imposing “undue burdens” on interstate commerce.

Don’t let the name fool you; the California Consumer Privacy Act will have sweeping ramifications in every state. Even small businesses that are located outside of California likely process some data of California residents.

Imagine what would happen if the other 49 states passed their own data protection laws. Now imagine if each of these laws differed just a bit from the others; they might require different disclosures in privacy policies, or the procedures for accessing and requesting personal information might vary. Companies would need to establish separate compliance systems for each state.

Such minor variances sound ridiculous, but there is precedent. Every state and the District of Columbia have enacted data breach notification laws. Although the laws are quite similar, many states have added their own flourishes, resulting in a patchwork of requirements that not only vary, but often conflict with one another. For example, Massachusetts prohibits companies from describing the circumstances of data breaches in their notices to consumers, while other states require descriptions.

Variations in data breach notification laws are inconvenient, particularly because they matter most in the critical days after a company learns of a breach. Variations in data protection laws would be exponentially more troubling. These laws apply to companies at all times, not just after data breaches. And they require far more significant operational changes than breach notification, because data protection laws dictate how data is handled and processed.

CCPA is so burdensome that there is a very real risk that a court would conclude that it violates the Dormant Commerce Clause. In a 1970 case, Pike v. Bruce Church, the Supreme Court held that a state law that does not discriminate against other states and only incidentally affects interstate commerce will violate the Dormant Commerce Clause only if “the burden imposed on such commerce is clearly excessive in relation to the putative local benefits.” This is a high bar, but state laws that require companies to overhaul their operations might not pass muster under the Dormant Commerce Clause.

In a 1997 case, American Library Association v. Pataki, the federal court in Manhattan found that the New York state legislature violated the Dormant Commerce Clause by prohibiting the online communication of content that is “harmful to minors.” The Court concluded that subjecting a single content creator to the rules of up to 50 different states was unduly burdensome. “Further development of the Internet requires that users be able to predict the results of their Internet use with some degree of assurance,” the Court wrote. “Haphazard and uncoordinated state regulation can only frustrate the growth of cyberspace. The need for uniformity in this unique sphere of commerce requires that New York’s law be stricken as a violation of the Commerce Clause.”

A recent article concluded it was an “open question” whether the New York Department of Financial Services’ comprehensive cybersecurity regulations violated the Dormant Commerce Clause. CCPA is even more onerous than the New York regulations because its requirements extend far beyond specific safeguards such as encryption and multi-factor authentication. In many cases, CCPA will require companies to entirely overhaul their data processing and storage processes. Under Pike and American Library Association, it is difficult to see how the Dormant Commerce Clause would allow a state law that requires an out-of-state company to make such sweeping changes to its internal systems merely because it has some California customers.

Even if other states do not follow suit, and California is the only state to pass a data protection law, CCPA still would raise Dormant Commerce Clause concerns. If GDPR compliance provides any indication, companies will need to spend significant amounts of money to meet CCPA’s many requirements. Companies would have a strong argument that California has imposed an undue burden on companies that have only an incidental connection to California.

  7. CCPA provides another unclear cause of action for data breach victims

When a U.S. company experiences a large data breach, it probably will be the defendant in lawsuits alleging a hodgepodge of claims under the common law (such as negligence, breach of contract, and breach of warranty) and state consumer protection statutes that prohibit unfair and deceptive trade practices. For instance, the complaint against Target for its 2013 data breach was 123 pages. Unfortunately, these laws do not provide clear guidelines for companies that want to adequately secure their information. The claims often are dismissed, as courts find various flaws in their application to data breach cases.

CCPA contains its own cause of action for victims of data breaches and other security incidents. Rather than clarifying the security obligations of businesses, it merely adds another vague cause of action. The act allows class action lawsuits with statutory damages of $100 to $750 per victim (or actual damages if greater) if a company experiences “exfiltration, theft, or disclosure” of unencrypted and unredacted personal information that is caused by the company’s “violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information.” When determining the amount of damages to award to the class, a court considers “the nature and seriousness of the misconduct, the number of violations, the persistence of the misconduct, the length of time over which the misconduct occurred, the willfulness of the defendant’s misconduct, and the defendant’s assets, liabilities, and net worth.” The California Attorney General has the authority to block the lawsuit, though CCPA does not prescribe any standards for this decision.

Perhaps the most confusing aspect of this section is its requirement that a plaintiff give the business 30 days’ notice before suing, and that the plaintiff cannot recover statutory damages if the business “actually cures the noticed violation and provides the consumer an express written statement that the violations have been cured and that no further violations shall occur.” At first blush, this sounds reasonable. However, it is difficult to imagine how a company could “cure” a data breach that already has occurred. Let’s say that 100,000 customers’ credit card data was stolen from a retailer because an employee’s password was “password,” and hackers easily infiltrated his account. If the company changed the employee’s password, would that “cure the noticed violation” even though the 100,000 credit card numbers were in the hands of hackers? What if the employer began requiring multifactor authentication? That might help to prevent future incidents, but that move would not mitigate the harm to the 100,000 customers whose credit card data was in the hands of hackers. Or can a company only “cure” a security incident by somehow ensuring that the hackers no longer have access to the customer information?

There is no way to confidently answer these questions because CCPA does not define “cure.” Nor does the statute specify the types of security procedures that are “reasonable.” Rather than clarifying companies’ responsibilities and encouraging responsible security practices, CCPA will merely add another cause of action to the laundry list for post-data breach class actions.

The availability of statutory damages will make the CCPA a particularly attractive tool for class action plaintiffs’ lawyers. (Just look at the litigation under the Video Privacy Protection Act or Telephone Consumer Protection Act). Even with statutory damages of $100 per individual, a large-scale data breach could result in millions of dollars in total damages (much of which likely would go to the lawyers).

  8. Where are the benefits?

Our laws need to do a better job protecting individual privacy. CCPA may very well accomplish this task. Or it might not. Because of the rushed legislative process, we can’t look to a debate over the costs and benefits of the CCPA, as we might with another law that was the product of traditional deliberation. How many Californians will request disclosures of data processing? How many will request deletion of their personal data? Will Californians read the expanded privacy policies that are required under the law? (Research has found privacy policies to be ineffective and unread). Before requiring companies to overhaul their information technology systems, it would be useful to know if consumers would avail themselves of these new rights.

I would expect to find a persuasive rationale in the law’s findings and declarations section. Instead, the drafters merely offered general statements, such as a laundry list of the potential “devastating effects” of unauthorized information disclosure and the unsupported conclusion that “[p]eople desire privacy and more control over their information.”

  9. CCPA will affect all industries, not just “the Internet”

“California Passes Sweeping Law to Protect Online Privacy,” read the headline to the New York Times article about CCPA’s passage. “California Just Passed the Strictest Online Privacy Bill in the Country,” Slate declared. While CCPA affects online privacy, it is not limited to Internet-related companies or services. CCPA’s restrictions apply to all covered businesses, whether they are Facebook or a chain of retail stores that doesn’t even have a website. If the recent experiences with GDPR provide any lesson, it is that many businesses likely will be caught off guard in late 2019, when they begin to learn that CCPA applies to their operations.

  10. The ‘wait and see’ approach is impractical

CCPA’s supporters have suggested that the legislature might amend the law to address any concerns. And the law allows the California Attorney General to “solicit broad participation to adopt regulations” that address a number of issues such as clarifying the definition of personal information and developing procedures for opt-out requests.

CCPA goes into effect on Jan. 1, 2020. To even come close to complying with the law, companies will need to get to work immediately. The European Union released the final text of GDPR two years before it went into effect, and many companies still are not compliant. The same will be true for CCPA. Companies cannot wait months or an entire year to get further clarification on key provisions of CCPA.

* * *

So where do we go from here? I understand that the legislature rushed a bill into law to avoid a ballot measure. As a non-Californian who is not tied into the state’s politics, I don’t know what is politically possible. But I sure know that as a matter of legal and business reality, the CCPA as passed is flawed to its core. Minor amendments or Attorney General guidance will not cure the operational defects or potential constitutional infirmities. At the very least, California should consider delaying the implementation date until it can hammer out a more thoughtful and practical law, and provide companies with a reasonable time frame in which to comply.

I do not mean to suggest that data protection has no place in the United States. Indeed, Congress should consider passing an effective and reasonable national data protection law. But such a law should set a uniform national standard, and it should be the product of careful and informed discussion.

Thanks to Lydia De La Torre and Eric Goldman for helpful feedback.
