Is Sacramento The World’s Capital of Internet Privacy Regulation? (Forbes Cross-Post)

Photo credit: An imitation California license plate // ShutterStock

It’s only two hours between Sacramento, California’s state capital, and Silicon Valley, the world’s technology capital, but when it comes to regulating the Internet, philosophically they are worlds apart. The two worlds collided in 2013 when Sacramento enacted a fleet of new Internet privacy laws telling Silicon Valley Internet companies how to conduct their affairs (see my posts 1, 2, 3, 4). The breathtaking sweep of Sacramento’s regulatory efforts made it clear that the two communities have a lot to talk about. To facilitate that conversation, in December, Santa Clara University hosted an informational hearing of three California Assembly committees. The ensuing discussion raised a host of questions about Sacramento’s role in regulating the Internet for Californians—and the rest of the world.

[Note: These notes aren’t a comprehensive recap of the day, and I mostly summarized speakers’ remarks rather than quoting them. As a result, please don’t attribute these remarks to the speakers without checking the video recordings from the day.]

Panel 1

Prof. Paul Schwartz from UC Berkeley started the day by touting the “California Effect,” California’s ability to adopt policy innovations that propagate through the country and the world. He cited the (notorious, in my opinion) example of data breach notification laws, which started in California in 2002 and have propagated into 45 other states, federal health law and European law. He said that historically federal lawmakers would respond to California’s policy innovations through a productive dialogue (what he called “flight to DC” and “defensive preemption”). [My comment: although Paul and some other Californians extol this propagation of regulatory ideas originating in California, it contributes to the perception among many non-Californians that California is crazy and Congress must clean up California’s policy messes].

Paul explained that the state-federal dialogue has broken down because the current Congress is gridlocked. Meanwhile, California continues enacting “a tidal wave of California privacy laws.” He cautioned against waiting for a “federal Godot,” i.e., expecting Congress to reengage productively on privacy regulation.

What should California do in light of this broken dialogue between federal and state legislatures? He thinks California should keep regulating because California can make a significant contribution to the international dialogue. However, California should be extra cautious given the likely lack of a federal response.

Thus, Paul advocated that the California legislature consolidate its efforts and amend existing statutes. Fall 2013 was a busy time for the legislature, so it should take stock of what it’s done and learn lessons from its successes and failures. He called attention to two laws that deserve legislative attention: (1) the Song-Beverly Credit Card Act, which inhibits retailers from taking reasonable steps to prevent fraud and identity theft (see my recent post criticizing Song-Beverly), and (2) the Confidentiality of Medical Information Act, which has been eclipsed by federal legislative developments.

He also wants to make it easier for people to electronically access materials about California’s privacy laws, such as legislative history. This would help interested parties across the globe better understand what’s going on in California.

[Prof. Schwartz posted a version of his testimony.]

David Lieber, Google privacy policy counsel, said everyone agrees that consumers care about privacy and security, but there’s less consensus about what that means. For 13 years in a row, identity theft has been consumers’ most frequent complaint to the FTC, and consumers consistently tell Google they are concerned about account integrity. Thus, he sees privacy and security issues as intertwined for consumers. He said there’s a cloud around the Internet industry due to the domestic surveillance programs. He also said Google supports ECPA reform to require a warrant to obtain user content.

Prof. Deirdre Mulligan of UC Berkeley, like Prof. Schwartz, praised California’s longstanding reputation for privacy leadership. She said regulators outside California look to the state as a laboratory of experimentation, and those experiments have ripple effects across the globe. Critics of California’s efforts should look hard at the history: not all initiatives succeed, either in business or in policy, but California has been respectful in its regulatory experiments. California is often an early policy leader, which makes getting it right especially complicated, but California’s policy experiments can pay off for people across the globe.

She cited three emerging privacy challenges:

1) The Internet of Things.

2) Fewer distinctions between public and private actors, such as the NSA using Google’s cookies to track people. All government actors want more information to do their jobs, but we need to be sensitive to how government actors will leverage the private sector’s infrastructure. We need to avoid creating the perfect surveillance state.

3) Big Data. Big data may be reifying impermissible discrimination. We need to sensitize computer scientists to the ethics of their coding choices.

Michael Beckerman of the Internet Association observed how a strong Internet sector translates into a strong California economy, with the added benefit that users get valuable free services. Other states are competing to lure away California start-ups. He noted the Internet industry already provides more transparency to users about their privacy practices than any other industry.

Assemblymember Mark Stone expressed skepticism that privacy policies properly inform consumers. He wondered if the legislature should take stronger measures to improve consumers’ confidence, rather than just letting companies “paper over” their practices, especially when disclosures waive consumer privacy protections. He’s concerned that consumers don’t have enough control and that most consumers really don’t know what’s going on. He believes California can drive consistent rules across the globe.

[My comment: Assemblymember Stone’s final assertion deserves careful scrutiny. The idea that a single state legislature’s efforts will result in consistent global standards assumes that everyone will embrace its solution. Can you imagine how the California legislature would respond if, say, a North Dakota or Mississippi legislator expressed the same vision—that it should enact new regulations because California will surely adopt its result? Of course, if other regulatory bodies don’t follow the lead of the California legislature, then California’s regulatory idiosyncrasies ensure globally inconsistent regulatory treatment. Worse, if every state legislature acts on a vision of setting global standards, then anarchy ensues—which is basically what we’re seeing as states compete to out-tough each other about regulating privacy.]

Prof. Schwartz responded that privacy policies are designed to accomplish three objectives simultaneously: (1) give actionable information to consumers, (2) get companies to think about how they manage information, and (3) manage legal liability. Unfortunately, these objectives pull companies in different directions. To be understandable, consumer notices need to be simple, but companies are squeezed by the adverse legal consequences of drafting imprecisions.

David Lieber cited the different ways Google has experimented with informing consumers and giving them choices, including videos, just-in-time notices, contextual notices (a prominent disclosure at specific places where notice would help), a dashboard for making granular privacy choices, and an ad settings tool where users can see the categories Google associates with them.

Prof. Mulligan responded that we need to move past privacy as a legal formality and make privacy into an intuitive practice (she didn’t use the phrases “Privacy by Design” or “values-in-design,” but I believe she was invoking those concepts). She gave the example of how the iPhone shared address book information with all app developers, and how this didn’t comport with her mental model as a consumer. It would have been better if Apple had seamlessly integrated privacy principles into its product framework.

Assemblymember Wagner said his constituents aren’t asking for protection against Google and Amazon; they are asking for protection from identity theft and government overreach. He asked: what concrete harms can’t be addressed without legislation?

Prof. Mulligan responded that it’s hard to define the harm. People understand the real economic costs associated with identity theft. Many people incorrectly believe that they don’t lose control when they share their information with companies. Consumers don’t expect anyone—government or private company—to be rifling through their private information.

Panel 2

Aleecia McDonald, Director of Privacy at Stanford’s Center for Internet & Society, led the second panel. She argued that online privacy policies don’t actually provide notice and choice. She pointed to three flaws:

1) Consumers often don’t have real choice. Only 2 of the top 100 websites don’t track their users, and Mozilla couldn’t find a way to run a national advertising campaign without tracking individual users.
2) The time burden of reading privacy policies (see the back-of-the-envelope arithmetic after this list).
3) Companies don’t expect users to read their privacy policies. She gave examples where users were timed out while reading the policies.
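
[My comment: to put rough numbers on the time burden, McDonald’s earlier research with Lorrie Cranor estimated that the average privacy policy runs about 2,500 words. At a typical reading speed of 250 words per minute, that’s roughly 10 minutes per policy; multiplied across the roughly 1,462 unique websites the average American visits each year, reading every policy once would consume about 244 hours per person per year.]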

Companies can’t tell consumers about their collection practices because the companies themselves don’t know what they are doing. She gave an example of a company that admitted it couldn’t disclose what cookies it set because it didn’t know (in part because it had acquired numerous companies). She said this is analogous to PG&E saying it doesn’t know where its pipelines are located. [My comment: what do you think of that analogy? Do cookies raise the same safety concerns as a regulated utility not knowing where its highly flammable fuel is going?]

She said we need to educate companies about privacy before they can educate consumers. For example, companies still publicly release datasets mistakenly believing the datasets have been anonymized, because they don’t understand re-identification principles.
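
[My comment: to make the re-identification point concrete, here’s a minimal sketch, using entirely made-up data and field names, of how a “de-identified” dataset can be linked back to named individuals by joining it against a public dataset on quasi-identifiers:]

```python
# Hypothetical illustration: the "anonymized" dataset drops names but
# keeps quasi-identifiers (ZIP code, birth date, sex). A public dataset,
# such as a voter roll, maps those same fields back to names, so a
# simple join re-identifies the records.

anonymized_health_records = [
    {"zip": "95053", "birth_date": "1970-03-14", "sex": "F", "diagnosis": "diabetes"},
    {"zip": "95112", "birth_date": "1985-07-02", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "Jane Doe", "zip": "95053", "birth_date": "1970-03-14", "sex": "F"},
    {"name": "John Roe", "zip": "95112", "birth_date": "1985-07-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def join_key(record):
    """Build a join key from the quasi-identifier fields."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

names_by_key = {join_key(voter): voter["name"] for voter in public_voter_roll}

for record in anonymized_health_records:
    name = names_by_key.get(join_key(record))
    if name:
        print(f"Re-identified {name}: {record['diagnosis']}")
```

[Latanya Sweeney famously estimated that ZIP code, birth date and sex alone uniquely identify roughly 87% of Americans, which is why merely dropping names isn’t the same as anonymizing.]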

She also observed that the notice-and-choice model predates the ecosystem of third-party trackers. Visit the NY Times website twice, and it may present different ad trackers to the same user. The NY Times doesn’t know who is delivering those ad trackers because its ad spots are allocated via third-party auction.
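
[My comment: a stylized sketch of why the publisher can’t enumerate its trackers in advance. In real-time ad auctions, the winning ad, along with whatever tracking pixels it bundles, is chosen per impression by an outside exchange. Everything below, including the bidder names, is hypothetical:]

```python
import random

# Hypothetical ad exchange: each bidder returns a bid plus an ad creative
# that bundles its own third-party tracking pixels. The publisher never
# sees this list ahead of time; a winner is picked for each impression.

BIDDERS = {
    "ad-network-a": {"bid_range": (0.50, 2.00), "trackers": ["pixel-a.example", "sync-a.example"]},
    "ad-network-b": {"bid_range": (0.25, 3.00), "trackers": ["pixel-b.example"]},
    "ad-network-c": {"bid_range": (0.10, 1.50), "trackers": ["pixel-c.example", "sync-c.example"]},
}

def run_auction():
    """Simulate one impression: the highest bid wins, and its trackers load."""
    bids = {name: random.uniform(*info["bid_range"]) for name, info in BIDDERS.items()}
    winner = max(bids, key=bids.get)
    return winner, BIDDERS[winner]["trackers"]

# The same user reloading the same page can get different trackers each time.
for page_view in range(3):
    winner, trackers = run_auction()
    print(f"Page view {page_view + 1}: {winner} wins; trackers loaded: {trackers}")
```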

Consumers are bearing the costs of the wild west of privacy, with the benefits accruing to companies. We have a system designed to create market failure, and that prompts government regulation.

The EU perceives that the architecture of US law isn’t safe for privacy. This has led to competitive harms and Balkanization of the Internet. She said she can’t overstate the level of anger and frustration in the EU today at the United States for its lax privacy standards.

Jules Polonetsky of the Future of Privacy Forum said that as a former state legislator, his initial inclination was to regulate privacy. Now, he realizes it’s hard for policy-makers to get things right, especially when the field is moving rapidly.

He enjoys reading privacy policies because they are amusing. Clearly they aren’t a principal mechanism to educate consumers. But privacy policies create accountability, and companies often make promises that exceed their legal obligations. We’ve barely scratched the surface of making privacy user-friendly. Companies should bring in designers to improve the disclosures.

He discussed the feeling that new technologies are “creepy.” Portable cameras were initially creepy, but we’ve adjusted. Initially it was creepy that people with cameraphones might take photos in locker rooms, but people have adjusted. Similarly, we’ll adjust to Google Glass.

Chris Hoofnagle of UC Berkeley said notice-and-choice is based on rational choice theory, but consumers don’t always act rationally. He’s frustrated that defenders of notice-and-choice constantly move the goalposts on how to measure its success. First, the goal was that all consumers would read privacy policies; then it became that experts would read them; then it became that the FTC would read them.

Consumers misunderstand privacy, and this undermines markets for privacy. Almost everyone fails an online privacy quiz he administers. Digital natives do the worst on it.

He believes consumers see the words “privacy policies” as seals, i.e., certifications of minimum protections. He favors correcting this by establishing minimum substantive legal standards for anyone who uses the term “privacy policy.”

[My comment: If consumers are systematically misunderstanding the phrase “privacy policy,” that is a good argument to regulate the term so that its use comports with consumer expectations. But California law requires most online companies to have a “privacy policy” and call it a “privacy policy.” I could imagine a different regulatory solution that would let companies decide what to call their privacy disclosures, subject to minimum protections if the companies want to use the term “privacy policy.” I imagine most companies would choose alternative nomenclature (if the law let them) rather than deal with the hassle of satisfying regulatorily required minimum standards. Also, if consumers consistently misinterpret the term “privacy policy” as a seal, it raises interesting questions about why the California Attorney General’s office is trying to make more online companies comply with California’s requirement to display “privacy policies.”]

Jules Polonetsky pointed out that he prefers the term “data use policy” over “privacy policy.” For example, many people are trying to manage their personal brands, and they may not think about that in “privacy” terms.

Aleecia McDonald said that it’s hard to get people to read privacy policies when they mistakenly think they are already protected. Companies rarely lie outright, but often there are omissions or ambiguous disclosures. Companies have trained people not to read privacy policies because consumers think privacy policies are useless.

Panel 3

The third panel specifically considered California’s “Shine the Light” law, which requires companies that share user data with third-party direct marketers to tell consumers about those practices upon request.

Joanne McNabb of the California Attorney General’s Privacy Enforcement and Protection Unit said the Shine the Light law has not increased consumer awareness of, or consumer control over, the sharing of their personal information. This may be due to the complexity of the law’s mechanisms.

Jim Halpert of DLA Piper said that providing good notice is an operational and coding challenge. Legal regulation isn’t the best solution in those situations.

The Shine the Light law made privacy policies longer and more complicated, and companies can be sued if they don’t make adequate disclosures. His association’s members get one or two Shine the Light requests from consumers per year. When thinking about how to encourage better privacy, there are better approaches than Shine the Light. Companies are trying to provide user-friendly solutions; this happens without any legislation because it’s good business practice. He favors short-form notices that provide a better user experience. Plus, the FTC’s enforcement actions are requiring clear notice to prevent intrusive company practices.

Chris Conley of the ACLU said transparency is a catalyst. He gave the example of how the California Attorney General’s effort to get app developers to disclose exactly what data they collect has actually changed consumer behavior. Similarly, the Shine the Light law has impacted companies—they considered and sometimes changed their practices in response to the law.

Transparency should allow consumers to access their information: what’s collected and who it’s shared with. The most relevant information to consumers is information about them. He’s not concerned about a deluge of notices about privacy practices because tools will develop to help manage the incoming notices. However, those tools won’t develop unless companies make the disclosures.

Panel 4

Prof. Mulligan, making a second appearance, led off the fourth panel. She said California has a strong track record of “light touch” regulation that has yielded great results. Internet-related laws need ongoing refinement because they don’t age well as technology evolves. Even so, California should pursue more “light touch” regulations.

She spoke about California’s data breach notification law. Companies view security as a cost center, not a profit center. She said corporate Chief Information Security Officers and Chief Privacy Officers welcomed the breach notification law because it provided metrics for the costs of insecure systems and helped them get resources. The notification law works especially well, she argued, because it doesn’t impose any costs on the state and doesn’t impose any costs on companies if they do their job well.

[My comment: though I didn’t engage Prof. Mulligan on this assertion at the hearing, I nearly went apoplectic about her method of cost accounting. California’s data breach notification law “didn’t cost California anything” only if we count just the out-of-pocket costs paid by the California treasury, and only if we assume the state does zero enforcement. Meanwhile, the law imposes potentially enormous compliance costs on companies even when companies “do their job well” but nevertheless make innocent mistakes or are themselves the victims of a crime.]

She further lauded California’s leadership on privacy for the Internet of Things, such as its law protecting automobile black boxes. Black boxes are useful, and she thinks the privacy protection law spurred adoption of the new technology.

Eric Goldman (that’s me) spoke last. I’ve posted my complete remarks. To summarize, I mentioned three problems with California’s regulation of Internet privacy:

1) Unlike with environmental issues, there are no “local” conditions on the Internet that require California-specific solutions.
2) California’s laws regulating the Internet inherently affect individuals outside California. I know the regulators view that as a feature, not a bug, but it’s a big Dormant Commerce Clause problem.
3) Congress has handcuffed some state legislative endeavors via 47 USC 230.

I offered four ways that the California legislature could address these concerns:

1) Regulators should solicit input from the entire Internet community, not just incumbents with lobbyists.
2) Any Internet regulation adopted by the California legislature should limit its effects to California, i.e., it should apply only to California service providers and only when they know they are dealing with a California user.
3) Every Internet law should contain an automatic sunset provision and order an empirical study of the law’s effects.
4) Instead of creating new restrictions, the legislature should enact more safe harbors and immunities.

For more coverage of the hearing, see coverage from Bloomberg BNA and Huffington Post.