Recap of the California Assembly Hearing on the California Consumer Privacy Act
Yesterday, the California Assembly Committee on Privacy and Consumer Protection held a hearing on the California Consumer Privacy Act. I believe this is the first legislative hearing ever on the law. The initial passage took place in a frenetic week, too short to allow for a hearing; and I don't believe there was a hearing for the technical amendments bill.
About 100 people attended. Legislator attendance was so-so. Only 3 legislators stayed most of the time; and by the end, only Rep. Chau (the chair) and some staff members remained.
Note: I heard there will be a California Senate Judiciary Committee hearing on the law on March 5, but I don’t have any details. Yes, that would be the same day as the final AG listening session on rule-making at Stanford Law.
Video and audio from the hearing are available.
This was my first time testifying in a California legislative hearing (though I did give remarks a few years ago at a California legislative event on privacy that we held on campus). A few comments on how the process differs from testifying before Congress:
- the committee members aren’t segregated by party. In Congress, seats are strictly divided on party lines, which I feel only reinforces partisanship. I liked California’s seating arrangements much better!
- I wasn’t required to submit any written testimony in advance. Nevertheless, I did write up my remarks, and I am sharing them below.
- Congress runs time management very, very tightly. Usually, every committee member and witness gets 5 minutes each. There is a prominent clock, with color warnings, right in front of each person; and when a person is out of time and the red light goes on, usually the chair waits only a bit before prompting the speaker to wrap up. This also means that committee members expect short, succinct answers to their questions, because every witness answer eats into their 5 minutes. In contrast, there was no visible time management at all in the Assembly hearing. Most witnesses were quite respectful of the committee's time anyway, and the chair prompted a speaker to wrap up only once. The looser time management made the entire hearing feel a little more relaxed.
This post recaps what I heard at the event, with some of my commentary, usually indicated as my comments. As usual, most of the recap is not verbatim what the speaker actually said; it’s my impression of what I heard. If you want to confirm any speaker’s remarks, please refer to the tapes. After the recap, I’ve included the prepared written remarks from some of the speakers. These are also not verbatim because most speakers deviate, at least slightly, from the prepared remarks, but they may be useful references nonetheless.
(As usual with CCPA posts, I’ve included the dumpster fire visual metaphor. This is my view and no one else’s).
___
Opening Remarks
Rep. Chau (chair): He extolled the law’s virtues. He explained the animating goals: your data, your privacy, your choice. He said it is clear that the legislature’s work is not done. He cautioned against prejudging requests from “business” and “tech.” He wants stakeholders to find consensus where possible. He doesn’t want to erode the law’s rights in the name of cleanup or clarity. Instead, he wants surgical, precise changes that don’t roll back protections.
[Eric’s general note: this law is a great example of the endowment effect. Now that the law is on the books, any change–even what seem to me to be obvious and non-controversial cleanup changes–is treated as a denigration of consumer rights. Thus, virtually every change that doesn’t expand the bill will have a price tag associated with it, even in situations where everyone knew before the bill’s passage that changes would be needed. It’s a fascinating case study of human psychology as applied to legislative drafting.]
Rep. Obernolte: The law passed on an abbreviated timeline, and everyone knew it would need substantial cleanup. He thinks it will take years to claw our way through the cleanup. [Eric's comment: I liked his remarks the best. Note he's a former software programmer.]
___
First Panel
Alastair MacTaggart: He gave his standard dog-and-pony show, which is a slick mix of lucid and cogent analysis and atextual fantasy. As usual, he focused almost exclusively on the law’s impact on Internet giants, not small offline retail operations. For example, he claimed the law should only cover large companies, not mom-and-pop businesses, even though the law reaches millions of mom-and-pop operations. He thinks web browsers will adopt “do not sell” extensions that will automate opt-outs of data sales.
Veronica Abreu, Square: She raised concerns about the law’s reach to small businesses and its deleterious effects on fraud prevention. She endorsed the GDPR’s definition of “personal information” and noted that the GDPR authorizes some data uses irrespective of consumer consent. She expressed concerns about the challenges of authenticating access/deletion/porting requests, especially via the 1-800 phone number all covered businesses must have.
Tanya Forsheit, Frankfurt Kurnit Klein & Selz: See her remarks below. She compared the long windup to the GDPR with the short ramp for the CCPA. She also explained the expense and complexity of getting ready for GDPR compliance.
Comment from Rep. Gallagher: He thinks there may be issues that need to be clarified, but he wants to guard against undermining substantive rights. He thinks the worry about aiding fraudsters is a stretch, because businesses already know more about their customers than those customers expect. He thinks it's unfair that the data industry is less regulated than the farm industry.
Q from Rep. Wicks: she asked MacTaggart if he preferred an opt-in for data sales. MacTaggart explained that, based on the advice of Berkeley Prof. Chris Hoofnagle, they chose opt-out to avoid the Sorrell v. IMS Health case and the risk of constitutional problems. He thinks the browser extension will get where he wants to go anyway.
Q from Rep. Irwin about the data breach private right of action. MacTaggart answered that encryption is off-the-shelf technology. [Eric's comment: it's flip to say that encryption is off-the-shelf technology. Putting aside that this is another increased cost of doing business, properly deploying an encryption program is well beyond the technological expertise of many small and even medium-sized businesses. Irrespective of whether businesses comply with the technical letter of the law, online security is a process–a demanding, time-consuming, relentless, expensive, and often ultimately doomed process.]
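To make that concrete, here is a minimal sketch of what "off-the-shelf" encryption looks like in code, using Python's widely used cryptography package (the package choice and the toy record are purely illustrative). The encryption itself is a few lines; everything that makes security a process lives in the comments.

```python
# Minimal illustration: encrypting a single record with the `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # The hard questions start here: where is this key
                              # stored, who can read it, and how is it rotated?
cipher = Fernet(key)

record = b"customer: Jane Doe, card ending 4242"   # invented example data
token = cipher.encrypt(record)                      # encrypting is the easy part
assert cipher.decrypt(token) == record

# Key management, access controls, backups of keys and ciphertext, rotation, and
# re-encrypting legacy data are the ongoing work that no three-line snippet shows.
```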
Q from Rep. Irwin about the percent of businesses that were/are GDPR compliant. Forsheit responded that there may be no such thing as 100% GDPR compliance because it’s an ongoing process. She emphasized that fake data subject requests are a thing.
Q from Rep. Irwin about GDPR enforcement in the US. Forsheit explained that she expected US businesses without an EU establishment to fight any enforcement efforts. Still, the EU claims it has jurisdiction.
Q from Rep. Irwin to Abreu about whether Square is GDPR-compliant. Abreu said that some of Square’s products aren’t offered in the EU. Otherwise, yes, and Square is trying to help its customers comply too. She said she doesn’t want to reduce the CCPA’s consumer rights, but they do provide a security attack vector without any hacking required.
Q from Rep. Irwin about the 50,000 consumer/devices standard for “businesses” regulated by the law. Abreu said 50,000 IP addresses reaches small businesses. Rep. Irwin then invited a response from MacTaggart. He said the AG’s office will issue regs on the steps required to authenticate access/deletion/porting requests, but he thinks that there will not be enough data to verify a person based on their IP address. [Eric’s comment: this is a crucial point, and I don’t think MacTaggart’s guess about what the AG rule might say should provide us much comfort.] He reminded everyone that the initiative’s definition of “business” was 100,000 consumers/year [and, I’ll note, $50M in revenue], and it was the legislature that reduced the numbers. [Eric’s comment: I don’t know how or why those numbers changed in the backroom negotiations, but that was a terrible mistake that should be fixed.]
Q from Rep. Berman: The law's passage was rushed and the legislature didn't understand all of the law's implications. He wants to address the provisions that are overly onerous for businesses, such as the requirement that businesses have a 1-800 phone number. MacTaggart explained that the 1-800 number requirement helps provide access to lower-income people who may not have computer access.
___
Second Panel
Stacey Schesser, AG's office. Multiple times, she asked the legislature for more money, even though the office will keep 100% of the fines it generates. The AG's office really, really does not want to issue individualized advisory guidance to companies. She also warned that legislative changes may further slow down the AG's rule-making. The AG's office is calling the 30-day cure period a "get out of jail free" card, and the AG's office doesn't want to be in the business of giving out "fix it" tickets. [Eric's comment: I felt these remarks were pretty condescending. Most compliance obligations cannot be satisfied in 30 days, so no sensible business will deliberately violate the law and just wait for the AG's call, because there won't be enough remaining time to fix any serious problem. Furthermore, most legit businesses will be striving mightily to do the right thing even though they have no fucking clue what to do and currently no guidance from the AG about how to do it. Finally, there are so many detailed obligations that it's inevitable that many well-meaning businesses will have minor compliance violations that are indeed best handled by, you guessed it, fix-it tickets.] She also reiterated that the AG favors opening up the entire law to a private right of action.
The most important takeaway: she said the office is shooting to meet its June 30, 2020 deadline for completing rule-making. [Eric’s comment: That means the AG’s office has no intent to issue final rules before Jan. 1, 2020 (not that there was ever a realistic chance), so the AG’s enforcement authority will kick in July 1, 2020. It also seems guaranteed that final rules will be available only a few months or weeks before the July 1, 2020 enforcement commencement, so businesses will have a lot of expensive scrambling to do to accommodate the final rules. Bummer. Given this, the legislature should slip the July 1, 2020 enforcement commencement; if not, I hope the AG’s office will announce its own enforcement delays to give businesses adequate time to adjust to the final rules.]
Q from Rep. Chau: he cited another area of the law where the AG’s office issues advisory opinions.
Qs from Rep. Irwin about local enforcement (no) and the possibility of issuing written guidance.
___
Third Panel
Sarah Boot, CalChamber: The AG's office needs adequate resources, but the goal should be compliance, not enforcement. Businesses need the opportunity to cure–we should view it as part of their education. A private right of action would be frightening; see the bad PAGA lawsuits. CCPA is a hard law to comply with. The majority of affected businesses are not tech giants. The news coverage emphasizes the law's application to big tech, but the reality is that this law reaches small businesses and treats them the same as the tech giants. We should set businesses up for success.
Prof. Eric Goldman, Santa Clara University School of Law: see my remarks below. I explained some problems with the definition of “personal information.”
Kevin McKinley, Internet Association: see his remarks below. He discussed the overbroad definition of “sell.” He mentioned how the law interferes with ad-based businesses and hinders the aggregation of niche audiences.
Tanya Forsheit: see her remarks below. She discussed the definition of "consumer," especially how it reaches employees and business contacts. She also discussed the private right of action for data security breaches; that litigation is inevitable because every business has had, or will have, a data breach. The high statutory damages have a diminishing marginal return on deterrence.
Margaret Gladstein, California Retailers Association: see her remarks below. She discussed the application of the non-discrimination provisions to loyalty programs. The restrictions on data sales may hinder cross-promotions with other companies.
Sarah Boot: she discussed how the restrictions on data sales could hinder government entities’ ability to do things like fraud prevention or law enforcement.
Alastair MacTaggart responded to some of the remarks. He cited the increasing ability of companies to associate data. [Eric’s comment: this is something he and I actually agree on. We disagree about the policy implications. The law was not properly designed to treat every scrap of data as personal information. It makes the disclosure, access, deletion, and portability rights all impossible.] He also made a number of mixed truth/atextual statements about what the law does or doesn’t do.
Elizabeth Galicia, Common Sense: Her main point seemed to be: why should Californians have fewer rights than Europeans? [Eric's comment: it's a nice soundbite, but it completely obscures massive divergences in California and European legal foundations and approaches to regulation (including, of course, our Constitution), cultural norms, economic drivers, and historical experiences. But other than that, we're almost the same.]
___
Fourth Panel
Nicole Ozer, ACLU: She waxed poetic about the California Constitution’s protection for privacy. She brought up Cambridge Analytica a lot, even though the law applies to millions of businesses that have nothing in common with Cambridge Analytica. She wants more granular disclosures about who receives a company’s shared data. She also wants more consumer control over data sharing, opt-in, stronger anti-discrimination provisions, and a private right of action.
Lee Tien, EFF: He said rights don't protect themselves. He expressed confusion about all of the love for the GDPR, when it also hammers ad-supported businesses. [Eric's comment: I share the skepticism over the new-found love for the GDPR, but if California is going to adopt privacy protections that deviate from the GDPR and impose duplicative compliance costs, the new CA approach should be that much better to justify those extra costs.] He expressed concerns about the lack of a private right of action: he cited how the federal government shutdown hindered the FTC's investigation of Facebook, he's worried about regulatory capture, and he likes the GDPR's decentralized enforcement. [Eric's comment: this last point is confusing. If other states enact their own privacy laws that mimic CCPA, that will unleash many other state prosecutors onto the privacy beat, which will then look a lot like the GDPR's decentralized enforcement. So it does not make sense to consider California's privacy enforcement in isolation.]
Todd Weaver, Purism: he did a sales pitch for his company's product. [Eric's comment: along the way, he said several ridiculous things, including that the GDPR didn't put people out of business–a statement that I'm sure will be disproven by forthcoming empirical papers, including one draft I've seen–and that the CCPA is easy to comply with, which instantly eliminated any remaining credibility he had.]
Prof. Scott Jordan, UC Irvine Computer Science Department: he was the only speaker who went on so long that Rep. Chau asked him to wrap up. He talked about the differences between data uses that are consistent with users' expectations in context and new data uses that would surprise users. [Eric's comment: he lost a lot of credibility with me when he praised the statutory definition of deidentified information, which is garbage.]
Ashkan Soltani: He talked about how data brokerage is scary. [Eric's comment: He said the CCPA isn't that complicated, so I guess I'd love to know what it takes for a law to be complicated to him. From my perspective, a 10,000-word law that cuts across 40M people and millions of businesses is per se complicated.]
Sarah Boot: The law reaches businesses with low profit margins. How much will the law raise consumer prices? A private right of action would be devastating. She reminded the legislature that tech giants are a trivial fraction of the affected businesses.
Tanya Forsheit: She noted the panoply of existing laws that already protect California consumers.
Ashkan Soltani: he said that it would be rare for any offline business to meet the definition of "business" without having an online presence. [Eric's comment: needs citation. This statement is unenlightening because most businesses have some online presence, but many small offline businesses, such as high-volume retailers, will cross the statutory threshold without counting their online presence at all. Adding the numbers from their online presence just expands the law's reach further.] He also said that cloud services will provide compliance tools. [Eric's comment: again, needs a citation, and again, this assumes the problem is exclusively with online data.]
___
Public Comments
I counted 15 audience members who spoke. Only 1, from Consumer Reports, wanted an expansion of the law. The other 14 included representatives from the Association of National Advertisers (ANA), Allclear, California school administrators, the Nonprofit Alliance, national payroll companies, the California News Publishers Association, the securities industry, the California Latino Chamber of Commerce, a small business association, TechNet, the Consumer Data Association, the CTIA, and the Employment Council. Some of the more interesting points:
- California school administrators are concerned that the law overrides the existing privacy rules.
- Nonprofits are excluded from the law, but the law impacts them anyway because it will starve them of data. The speaker claimed that nonprofits will be the canaries in the coalmine for the law’s deleterious effects.
- The California Latino Chamber of Commerce is worried that the law will limit or eliminate free online services that help new businesses get started cheaply, and this will disproportionately hurt their members.
____
[Written Remarks: I’m sharing the prepared written remarks that I was able to get.]
California State Assembly Committee on Privacy and Consumer Protection
Informational Hearing: Understanding the Rights, Protections, and Obligations Established by the California Consumer Privacy Act of 2018: Where should California go from here?
Testimony of Tanya Forsheit, Partner and Chair, Privacy & Data Security Group, Frankfurt Kurnit Klein & Selz
Panel One
February 20, 2019
Mr. Chair and Members, thank you for the opportunity to testify here today. My name is Tanya Forsheit. I am a partner with the law firm Frankfurt Kurnit Klein & Selz, Chair of the Firm’s Privacy & Data Security Group, and I am based in the Los Angeles office. I am also an adjunct professor at Loyola Law School. I have been practicing law in California since 1997 and I have been practicing privacy and data security law since 2006. I am certified by the International Association of Privacy Professionals as a CIPP/US and a CIPT.
I am here today in my capacity as a privacy lawyer who has represented hundreds of businesses in privacy and data security matters. I am not here representing any particular client. I am here on this panel to provide you with some background on the burdens and challenges companies face in complying with or attempting to comply with the EU General Data Protection Regulation (“GDPR”).
The GDPR has extraterritorial application under Article 3, so companies doing business in California that offer goods or services to individuals located in the EU are already required to implement very significant compliance measures to protect consumer privacy. These include: data mapping (which cannot be fully automated, and requires discussions with each business unit and IT); amending contracts with all service providers (which requires legal advice); updating privacy policies (which also requires legal counsel); setting up and maintaining mechanisms for consumers to make requests for access, portability, and deletion, and to exercise other rights; and training personnel.
If done correctly, this requires significant investment in technology and personnel resources, including either new hires or diverting existing personnel away from their regular jobs. Businesses in California have already spent anywhere from tens of thousands to hundreds of thousands or even millions of dollars getting ready for GDPR and complying since it took effect on May 25, 2018. According to CSO magazine, 88% of companies spent more than $1 million on preparing for the GDPR.
The GDPR does not have a private right of action with statutory damages. It can be enforced by EU regulators (data protection authorities) or in representative actions (for example, brought by an advocacy group). But there is no class action available in Europe and nothing like statutory damages available without any showing of harm.
The GDPR became law in April 2016 and companies had two years to prepare before enforcement began on May 25, 2018. Despite that longer runway under the GDPR and these considerable investments in time and expense, many if not most US companies are still not in compliance because the undertaking is so significant. According to a Capgemini survey released one week before the law took effect in May 2018, 85 percent of firms in Europe and the United States were not ready to fully comply on time and one in four would not be fully compliant by the end of 2018.
The CCPA is even more complex and challenging for reasons that you will hear described today. For example, even the inclusion of the word household in the definition of personal information, which is not part of the GDPR, changes the entire scope of compliance. As a consequence, despite having done significant work on GDPR, companies doing business in California are not ready for CCPA. As currently drafted, many businesses will have to reinvent their compliance programs for CCPA, with an even shorter timeline for implementation. The CCPA became law on June 28, was already amended once on September 23, and will take effect on January 1, 2020.
Experience with the GDPR has demonstrated that we are going to need some changes to the CCPA, and to this timeline, for compliance to be realistic.
Thank you.
___
Assembly Privacy and Consumer Protection Committee Hearing on the California Consumer Privacy Act
Testimony of Prof. Eric Goldman, Santa Clara University School of Law
February 20, 2019
Mr. Chair and members of the committee, thank you for the opportunity to speak at this hearing. I am a professor at Santa Clara University School of Law, where I co-direct the school’s High Tech Law Institute and its Privacy Law Certificate. I have been teaching and writing about Internet Law for a quarter-century.
My remarks address the definition of “personal information.” This definition is a critical piece of the law because it distinguishes regulated from unregulated information. An overbroad definition of “personal information” creates problems throughout the entire law, and that is the situation we have now.
The definition of “personal information” includes “information that…is [reasonably] capable of being associated with…a particular consumer or household,” with about a dozen different examples of what that might include. Personal information isn’t limited to sensitive or obviously identifiable information, such as name, phone number, or Social Security number. It sweeps in any scrap of data that has the theoretical *capability* of being associated with a consumer.
Unfortunately, the definition of personal information creates many problems. I’ll highlight four:
- First, effectively any information related to consumers counts as personal information, no matter what steps the business takes to avoid that. The law excludes some consumer data from the definition of personal information, but the exclusions do not work. The reason why is that computer scientists increasingly can find ways to associate innocuous bits of data with individual consumers. As a result, virtually every piece of data from or about consumers technically meets the statutory requirement of being reasonably capable of being attributed to a particular consumer or household. So, despite the law’s clear intent otherwise, the law treats all data in a business’ possession as personal information and subject to the law’s strict regulation. Most of the law’s obligations—ranging from portability to disclosures to deletion—become impossible to administer when all information is personal information.
- Second, the law creates an untenable situation when a consumer’s personal information resides in multiple databases and a consumer takes advantage of his/her statutory right to access or delete his/her personal information. Here’s an example of how that might arise (a short sketch of this scenario follows this list): an online retailer allows consumers to create a personal profile, while it maintains a separate database to customize special deals to a person’s IP address. Assume a consumer asks the retailer to delete all of his/her personal information. It may be easy to delete the consumer’s profile, but how will the retailer delete the consumer’s IP address and associated data in the separate offers database? The law seemingly requires the retailer to determine the consumer’s IP address (if possible—many IP addresses are shared, like on wi-fi at a retailer), confirm it belonged to the consumer, and then delete it. This request becomes even more unrealistic with offline data—say, security camera footage of the consumer shopping in the store, or a salesperson’s notes of their interactions with the consumer. The law says linking across databases isn’t required [1798.145(i)], but this only applies to data that isn’t maintained as “personal information.” As I just mentioned, virtually all consumer data meets the broad definition of personal information, so this exclusion does not help. Instead, the law may counterproductively encourage online services to digitize more data and proactively link databases to make future access/deletion requests easier.
- Third, the definition of personal information excludes some government-published data, but even that data is not freely usable. Broadly restricting the flow of government-published information is almost certainly unconstitutional.
- Fourth, as Sarah discussed, the definition of “household” means that a person can see and affect the information of other people. This is the opposite of “personal.”
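To make the second point more concrete, here is a minimal sketch of the multi-database deletion problem; the retailer, the schema, and the IP addresses are all hypothetical.

```python
# Hypothetical retailer: profiles are keyed by the consumer's email address,
# while a separate offers database is keyed only by IP address, with no link
# between the two.
profiles = {
    "jane@example.com": {"name": "Jane Doe", "purchases": ["shoes", "hat"]},
}
offer_history = {
    "203.0.113.7": {"deals_shown": ["10% off shoes"], "last_seen": "2019-02-01"},
    "203.0.113.8": {"deals_shown": ["free shipping"], "last_seen": "2019-02-03"},
}

def delete_consumer(email: str) -> None:
    """Honor a deletion request for the consumer identified by `email`."""
    profiles.pop(email, None)  # Easy: this record is keyed by the consumer.
    # Hard: which entries in offer_history were this consumer's? The retailer
    # never recorded the link, and an IP address may be shared (household
    # router, in-store wi-fi), so complying would require building exactly the
    # kind of cross-database linkage the law did not mean to encourage.

delete_consumer("jane@example.com")
```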
The CCPA’s definition of “personal information” reaches beyond the analogous GDPR definition. The GDPR defines “personal data” as “any information relating to an identified or identifiable natural person.” I have numerous reservations about the GDPR, but this is an area where the California legislature could benefit from adopting the GDPR’s approach. The GDPR definition isn’t perfect, but it would solve some of the problems I’ve identified today.
Again, I appreciate the opportunity to speak with you.
___
Assembly Privacy and Consumer Protection Committee Hearing
February 20, 2019
“Understanding the Rights, Protections, and Obligations Established by the California Consumer Privacy Act of 2018: Where should California go from here?”
Testimony of Kevin McKinley, Internet Association
Mr. Chairman and members of the Committee, thank you for the opportunity to speak today. My name is Kevin McKinley, and I am the Director of California Government Affairs for Internet Association, which represents over 45 leading internet companies many of which are headquartered here in California.
Internet Association members support the spirit of CCPA to provide people with meaningful transparency and control over their data, including the ability to control, access, delete, and move it. In granting these rights, CCPA redefined some commonly understood concepts. One example is the concept of “sell.” We understand the will of the Legislature was to make sure consumers know when, how, and to whom their information is sold. This is something internet companies also support. However, in redefining “sell” we believe that CCPA unintentionally jeopardizes the privacy-protective practices used in online advertising.
I’ll talk briefly about how online ads work, why we’re concerned with the new definition, and how the Legislature can fix it.
Online advertising was designed with privacy in mind. Instead of carrying personally identifiable information around the web, online advertising uses identifiers that are not directly tied to individuals. Advertisers do not need to know who you are—your name, address, or other personally identifiable information—in order to show you an ad online. But they do need some basic data to flow across the web in order to do so. Loading a web page sends information that is needed for the web page to populate. A web page may draw from numerous sources – one database showing the latest news, another showing a photo, another showing stock prices, still another showing advertisements which help support and pay for the content the user views on the web page. More specifically, loading a page sends the IP address, time and date stamp, browser type or mobile operating system type, some additional technical information about the actual device being used to format the content properly, and any cookie information. These all direct which content will populate the page, including which ad best fits. We do not believe these bits of non-identifiable technical information, which are basic mechanics of the internet, were intended to be covered by CCPA’s definition of sell. In common understanding, this information is neither personal nor is it being sold in this process. Treating this as a sale of personal information jeopardizes the underpinnings of the internet.
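For illustration, the technical metadata accompanying a single page request might look something like the following simplified, hypothetical example (every value is invented):

```python
# Simplified, hypothetical illustration of the technical fields a page request
# carries. None of these is a name, postal address, or other directly
# identifying information.
page_request = {
    "ip_address": "198.51.100.23",              # where to send the response
    "timestamp": "2019-02-20T10:42:17-08:00",   # time and date stamp
    "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_1 like Mac OS X)",
    "device_hints": {"screen_width": 375, "language": "en-US"},
    "cookie": "uid=a91f3c0e7b",                 # pseudonymous identifier
}
# The same fields that tell the page which headline, photo, or stock quote to
# load also determine which ad best fits the available slot.
```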
In addition to the common-sense nature of not considering these data flows as sales, I want to spend a moment on what it would mean if they were considered sales. Many of the internet services we rely on every day are supported by ad revenue. This model also empowers creators and allows them to support themselves through their content. Small publishers that make a living by writing their own blogs on everything from cooking and gardening to community news are generally supported by online advertising. Large publishers and content providers also rely on ad revenue to support their business.
Further, many small brick-and-mortar businesses have been able to find their niche audiences efficiently through advertising online. In short, this model is foundational to the internet sector and critical for California to preserve. Mischaracterizing advertising as a sale of data would confuse those concerned about real sales of their actual personal data – data that is tied to their name and identity. If the advertising data flow were put in the same basket as actual sales of personal information, an opt-out of sales would end revenue for these services and publishers and limit the ability of businesses to find their customers. It would endanger a business model fundamental to the internet without any privacy benefit.
Again, we do not believe it was the goal of CCPA to jeopardize this practice, and the co-sponsors have confirmed as much. However, the statute is unclear at the moment. In order to capture the intent of CCPA and to protect this important business model, we will be asking for an amendment to make clear that this practice is not considered a sale under the Act. We look forward to working with the Legislature on this solution.
___
Assembly Privacy and Consumer Protection Committee Hearing
February 20, 2019
“Understanding the Rights, Protections, and Obligations Established by the California Consumer Privacy Act of 2018: Where should California go from here?”
Testimony of Margaret Gladstein, Capitol Advocacy
on behalf of the California Retailers Association
Mr. Chair and members, thank you for the opportunity to present today. My name is Margaret Gladstein. I’m here on behalf of the California Retailers Association. I concur with the testimony by my fellow panelists and want to emphasize that CRA is deeply concerned with the liability in CCPA.
CRA is also concerned that Section 1798.125 – known as the non-discrimination section – raises doubts about the legality of loyalty and rewards programs. We appreciate that Mr. MacTaggart and some consumer groups believe these programs can continue. But we are not confident the law is clear on this point.
What IS clear is that consumers embrace these programs. 80% of Americans belong to at least one, and the most popular are supermarket and drugstore programs. CRA’s members offer rewards programs because we want to encourage customer loyalty. Just as you use data about individual voters to help earn a vote, we use data to help earn our customers’ business.
The law currently says a business CANNOT discriminate against a consumer for exercising their rights under the CCPA by:
(A) Denying goods or services to the consumer.
(B) Charging different prices or rates, including through the use of discounts or other benefits.
(C) Providing a different level or quality of goods or services to the consumer.
The same section goes on to say: “A business MAY offer financial incentives, including payments to consumers as compensation for the collection, sale or deletion of personal information. A business may also offer a different price, rate, level, or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data.”
Confusing? I would say so.
I appreciate the hard work and countless hours that went into drafting this law, but this section and its interplay with other parts of the law leaves retailers, grocers, hotels and airlines questioning what we can and cannot do with our rewards programs. While I understand the legislative intent is that we can continue to offer them – and CRA’s members want to do so without running afoul of the law – ultimately, unless this section is clarified, neither this Committee nor any other legislative body will determine the meaning of these words. Instead, it will be up to the courts to reconcile the various sections of the law that could bear on these programs. That’s the uncertainty we want to end – before it ends loyalty programs.
Perhaps many of you are thinking: are loyalty programs selling my data? CRA’s members are not in the business of selling data. We want to sell goods and services. But some of the most popular parts of our members’ loyalty programs could potentially be considered a sale under the sweeping definition in CCPA. For example, Safeway Club Card members can earn discounts at Chevron. JCPenney’s rewards members can use their points for purchases at Sephora in JCPenney. We simply want to continue to offer benefits like these to consumers.
We look forward to working with you to protect these programs while protecting the framework of CCPA.
___
California State Assembly Committee on Privacy and Consumer Protection
Informational Hearing: Understanding the Rights, Protections, and Obligations Established by the California Consumer Privacy Act of 2018: Where should California go from here?
Testimony of Tanya Forsheit, Partner and Chair, Privacy & Data Security Group, Frankfurt Kurnit Klein & Selz
Panel Three
February 20, 2019
Good morning, again, Mr. Chair and Members. For my time on this panel, I will discuss the concerns raised by the current definition of consumer in the CCPA, and the overreach of the private right of action in its current form.
The definition of consumer is any California resident. Thus, a “consumer” need not have a customer relationship with a business in order to exercise their rights under the CCPA. Without clarification, this could be interpreted to include employees and present a number of concerns. Keeping in mind that the CCPA applies to any business that collects even 137 unique credit cards or IP addresses a day, the operational costs for small businesses of including employees (past and current), job applicants, and other related individuals who do not have a true “consumer” relationship with the business will be exorbitant. For example, a small family-owned restaurant that serves 150 tables a day will also have to operationalize CCPA for its kitchen and wait staff in a business with high turnover.
Access to personal information in the employment context is already established in California law. But the CCPA would allow a separated spouse who is part of a household to gain access to payroll records. That cannot be what the legislature intended.
The definition of consumer is also problematic in that it currently encompasses business representatives in the context of business-to-business interactions. The opportunity to delete or opt out of the disclosure of business data in a business-to-business transaction could result in fraud and make it impossible to comply with third-party due diligence requirements. Imagine also what this means for, as just one example, a small medical practice that already complies with HIPAA for patient data, but that will now be required to honor opt-out and deletion requests from suppliers with whom it corresponds by email for business purposes but who never have access to patient data.
In Washington State, Senate Bill 5376 explicitly excludes employees or contractors of a business acting in their roles as employees or contractors from the definition of consumer in Section 3.
California should do the same.
I will now turn to the private right of action. The CCPA allows a consumer to seek statutory damages ranging from $100 to $750 per person, per incident without any demonstration of harm if that consumer’s personal information is subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices. As we understand it, the CCPA was meant to exclude encrypted data as well as redacted data from this private right of action – but the current language does not do so.
What do we know about data breaches? We know that every business has either had a data security breach or will have one even if they have implemented reasonable security procedures and practices. It is the nature of the world we live in today. The Ponemon Institute and Balbix just issued the results of a global survey of 600+ cybersecurity leaders and professionals showing that 67% of organizations are not confident that they can avoid a data breach.
One of the most pervasive threats to companies is phishing attacks, which have become increasingly sophisticated. Even if an organization encrypts all of its personal information, that will be no shield to sophisticated phishing attacks, which appear to be legitimate business requests.
Let’s go back to the small family-owned restaurant. The restaurant’s night manager discovers that his email has been compromised through a phishing scam that appeared to come from his outsourced IT guy and asked him to enter his credentials. Now he also suspects that the bad guys probably had access to personal information of restaurant customers, such as credit card numbers. California law already says the restaurant must notify those affected individuals. Under the CCPA, each of those affected individuals can sue the restaurant for $100-$750 per incident, or bring a class action on behalf of same.
You can see how the situation quickly spirals out of control. If that small family-owned restaurant serves even just 1,000 unique customers per week who pay by credit card, or slightly more than 50,000 unique credit card customers per year, the damages attributable to the class for that restaurant alone would range between $5,000,000 and $37,500,000. Even the low end of that range, plus attorneys’ fees, would be enough to bankrupt any small business. Faced with that potential liability, these business owners will be forced to enter into an immediate settlement – even if they think they have a strong defense based on reasonable security – because the cost of litigation is so high. The net result is that the trial attorneys win but consumers won’t be any more protected than they are today. As noted already, it is a question of when, not if, a company will experience a breach. And the attacks are becoming even more sophisticated, with organized crime and nation states in the game. California businesses are the victims but are punished by this law. This is a sledgehammer, not a stick, and there comes a point where deterrence has diminishing returns.
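The arithmetic behind that range, as a simple back-of-the-envelope check (the customer counts are the hypothetical figures above):

```python
# Back-of-the-envelope check of the class-wide exposure described above.
affected_customers = 50_000              # ~1,000 card-paying customers per week
low_end = affected_customers * 100       # statutory minimum per consumer
high_end = affected_customers * 750      # statutory maximum per consumer
print(f"${low_end:,} to ${high_end:,}")  # $5,000,000 to $37,500,000
```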
Even one of the nation’s leading plaintiffs’ lawyers in the privacy space, Jay Edelson, thinks that $750 is way too much money, as he noted in an August 10, 2018 episode of the Privacy Advisor podcast.
You have the opportunity to make this legislation even more protective of consumers (as opposed to their lawyers) and less damaging to the California economy by creating a safe harbor to this onerous liability or by simply removing the minimum amount of statutory damages.
Thank you for your time and consideration.
___
Related Posts
* A Status Report on the California Consumer Privacy Act
* 41 California Privacy Experts Urge Major Changes to the California Consumer Privacy Act
* California Amends the Consumer Privacy Act (CCPA); Fixes About 0.01% of its Problems
* Recent Developments Regarding the California Consumer Privacy Act
* The California Consumer Privacy Act Should Be Condemned, Not Celebrated
* A First (But Very Incomplete) Crack at Inventorying the California Consumer Privacy Act’s Problems
* Ten Reasons Why California’s New Data Protection Law is Unworkable, Burdensome, and Possibly Unconstitutional (Guest Blog Post)
* A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet
* An Introduction to the California Consumer Privacy Act (CCPA)