California’s New ‘Online Eraser’ Law Should Be Erased (Forbes Cross-Post)

By Eric Goldman

People mocked Google CEO Eric Schmidt for his 2010 suggestion that teenagers should change their names when they turn 18 to distance themselves from the indiscreet and ill-advised Internet posts they made as youths. The California legislature thought it had a better solution for this problem and enacted a law, SB 568 (California Business & Professions Code Sec. 22581), that allows kids to use an “online eraser” to wipe away some of their past posts. Unfortunately, California’s solution is no less mockable than Schmidt’s.

What the Law Does

The new law says that websites and apps “directed” to minors, or that have actual knowledge that a user is a minor, must allow registered users under 18 to remove publicly posted content (or to ask the provider to remove or anonymize it), and must make certain disclosures to those users. A website/app is “directed” to minors when it is “created for the purpose of reaching an audience that is predominately comprised of minors, and is not intended for a more general audience comprised of adults.”

The law is riddled with ambiguities, so let me explore just three:

First, it may not be clear when a website/app is “directed” to teens rather than adults. The federal kids’ privacy law (the Children’s Online Privacy Protection Act, or COPPA) applies only to children under 13, so this will be a new legal analysis for most websites and apps.

Second, the law is unclear about when the minor can exercise the removal right. Must the choice be made while the user is still a minor, or can a centenarian decide to remove posts that are more than eight decades old? I think the more natural reading of the statute is that the removal right applies only while the user is still a minor. If that’s right, the law would counterproductively require kids to make an “adult” decision (which content they want to stand behind for the rest of their lives) while they are still kids.

Third, the removal right doesn’t apply if the kids were paid or received “other consideration” for their content. What does “other consideration” mean in this context? If the marketing and distribution inherently provided by a user-generated content (UGC) website is enough, the law will almost never apply. Perhaps we’ll see websites/apps offering nominal compensation to users to bypass the law.

The law takes effect January 1, 2015. That gives plenty of time for court challenges or for the legislature to rethink its errors, though I’m not sure either is likely.

Why It’s A Bad Law

I don’t believe any state has ever passed a beneficial Internet regulation, so it’s hardly surprising that I dislike this law too. Some of my objections:

Dormant Commerce Clause. My position is that states categorically lack authority to regulate the Internet because the Internet is a borderless electronic network, and websites/apps typically cannot make their electronic packets honor state borders. In this case, unless they ask for geographic information, many websites/apps don’t know what state a registered user comes from. Now what? Do all websites/apps around the country have to comply with California’s law on the chance that some users may come from California? That would violate the Dormant Commerce Clause, a constitutional doctrine that reserves the regulation of interstate commerce to Congress. Or does this law only apply to websites/apps with physical operations in California? The law doesn’t clarify its jurisdictional nexus, leaving that question open for future fights and a potential constitutional challenge.

The Illusion of Control. The law only allows minors to remove their content from the site where they posted it, and the removal right doesn’t apply where someone else has copied or reposted the content on that site. Removing the original copy typically accomplishes the minor’s apparent goal only when it’s the only copy online; otherwise, the content will live on and remain discoverable. Given how often publicly available content gets copied elsewhere on the Internet, especially when it’s edgy or controversial, minors’ purported control over the content they post will be illusory in most circumstances.

Collateral Damage. Removing content from the Internet can create collateral damage. Many UGC websites encourage users to engage each other in conversations through comments and threaded discussions. Removing a piece of the discussion can make the entire thread nonsensical. To avoid this bad user experience, the website/app might choose to delete the whole thread, in which case the minor’s decision to remove his/her content will detrimentally affect other people’s content too. Even if the website/app preserves other people’s contributions, the content removal breaks incoming links from around the web and may render those remote discussions nonsensical.

Admittedly, these adverse consequences are currently possible when websites/apps voluntarily allow users to remove their content, as many UGC websites/apps do. Indeed, I think UGC industry best practices give users substantial control over publishing, editing and removing their content because users demand such controls (see, e.g., this recap of a 2001 user revolt against Epinions when it tried to restrict users’ edit/delete rights).

But websites sometimes justifiably restrict users’ content removal, especially when other users have responded to or built upon the content. For example, Tumblr restricts its users’ removal rights when other users have “reblogged” the content. (For what it’s worth, I believe Tumblr’s reblogging functionality fits within an exception in the new law, so perhaps more websites/apps will replicate that functionality.) This law reduces websites/apps’ discretion over how to maintain the editorial integrity of their databases, and odd consequences will surely follow.

More generally, consistent with the “right to be forgotten” meme, this law mandates that minors can try to rewrite history. But rewriting history hinders society’s ability to understand where we came from and why things are the way they are.

Extending COPPA. For over a decade, we’ve known how to deal with COPPA: if at all possible, avoid dealing with kids 12 and under, in which case COPPA doesn’t apply. This law creates a new class of websites/apps that can ignore COPPA but must comply with this law because they deal with teens. Thus, the law burdens a large swath of websites with the obligation to research whether the law applies to them, and it will impose compliance costs on some of them.

The First Amendment. The law completely ignores the possibility that UGC websites/apps have their own First Amendment interests independent of their users’ First Amendment interests. By forcing UGC websites/apps to stop publishing users’ content when the websites/apps might view that content as contributing to their own expressive statements, the law creates a potential First Amendment collision. Two examples should illustrate the point:

Example 1: A newspaper prepares a collection of stories, written by teens, about their first-hand experiences with cyber-bullying. These stories are combined with other content on the topic: articles by experts on cyberbullying, screenshots of cyberbullying activity online, and photos of victims and perpetrators. After the newspaper publishes the collection, one of the teenagers changes his/her mind and demands that the newspaper never reprint the collection, and seeks a court order blocking republication. Does the newspaper have a potential First Amendment defense to the court order? Yes, and I don’t think the question is even close.

Example 2: A UGC website creates a topical area on cyberbullying and asks its registered users, including teens, to submit their stories, photos, screenshots and videos on the topic. The website “glues” the materials together with several articles written by its employees. Does the website have a First Amendment interest in continuing to publish the entire collection? Yes, and as with the newspaper example, I don’t think it’s close.

The First Amendment analysis gets more complicated because we’re dealing with teenagers, who typically have the legal right to void their contracts if they choose. So, even if a website gets an irrevocable copyright license from the teen, the teen should be able to change his/her mind. However, once the contract is “complete,” it’s no longer voidable. For an example of how publication of a copyrighted work might “complete” a website’s contract with a teen, see the 2008 AV v. iParadigms case. Alternatively, the website could obtain parental ratification for the copyright license (required under COPPA for under-13 users). It’s interesting that the online eraser law doesn’t address the possibility that parents may supervise, approve or ratify their kids’ publications that are subject to the removal right.

[Copyright geeks: I haven’t researched how 17 U.S.C. 201(c) applies to contributions from minors.]

As a practical matter, most websites/apps won’t assert First Amendment protection for continuing to publish their users’ content. (It’s clear Google and Facebook won’t, because they acquiesced to the law.) Indeed, the constitutional issue will only come up if (1) the website/app seeks an irrevocable license from users, which most websites/apps don’t do, (2) minor users can’t void that license, and (3) the website/app doesn’t provide technological tools that permit users ongoing access to edit or delete their content. It will be a rare situation in which all three of those requirements are satisfied. Plus, those websites/apps could avoid the issue by paying the kids a nominal amount for their contributions.

Still, by disregarding the possibility that the website/app may have its own First Amendment interests, the law should be vulnerable to a First Amendment challenge in some circumstances.

What Should Businesses Do?

Let’s assume the law survives any court challenges, and the California legislature doesn’t backtrack. What should UGC websites do?

Don’t Collect Age Information. It’s already a standard recommendation that websites shouldn’t collect age information unnecessarily or casually, so they can avoid triggering COPPA obligations. This law reinforces that advice. Websites/apps should NEVER ask for age information unless they have a good business reason for doing so, in which case they must be prepared to deal with the consequences of knowing users’ ages (such as bouncing under-age users).

Unfortunately, the law doesn’t address the possibility that websites/apps might learn a user’s age involuntarily (another sign of sloppy drafting). For example, a user might self-report his/her age to customer service representatives, or one user might reveal that another user is under-age. What then? Apparently, the legal obligation will spring into effect in any of those circumstances, which leads to my next suggestion.

Content Removal Doesn’t Have to Be Automated. The law implicitly anticipates, but doesn’t require, that most websites/apps will provide automated removal tools to their users. Instead, websites/apps can require users to request content removal manually, such as by sending the request via physical mail along with adequate information to authenticate their age. Content removal needs to be available to minors, but it doesn’t have to be easy.

Conclusion

This law is just the latest attempt by legislatures to tell content database managers how to manage their databases, an endeavor at which legislatures have repeatedly proven terrible (see, e.g., the Fair Credit Reporting Act). Given how substantially the law overlaps with current industry best practices, it’s mostly an annoyance that imposes extra compliance costs for little benefit. However, to the extent it overrides the limited cases where websites/apps would justifiably choose to restrict content removal, the law may harm the information ecosystem. It’s a legislature’s prerogative to privilege the individual interests of minors over these social considerations, though it’s probably a poor choice. Given that the law won’t actually provide minors with a well-functioning digital eraser, the choice appears even more puzzling.

This law also reminds us that regulators cannot resist loving the Internet to death. California alone considered an astounding 215 bills containing the word “Internet” this legislative session. The sheer volume of this regulatory frenzy, combined with the sloppy drafting we see all too frequently in state legislation, will undoubtedly harm the Internet industry, even if any single proposal might be beneficial on its own.