Is the California Legislature Addicted to Performative Election-Year Stunts That Threaten the Internet? (Comments on AB2408)

It’s an election year, and like clockwork, legislators around the country want to show they care about protecting kids online. This pre-election frenzy leads to performative bills that won’t actually help any kids. Today I’m blogging about one of those bills, California AB 2408, “Social media platform: child users: addiction.” (For more on how the California legislature is working to eliminate the Internet, see my posts on the pending bills AB 587 and AB 2273).

This bill assumes that social media platforms are intentionally addicting kids, so it creates business-ending liability to thwart those alleged addictions. The consequences depend on how the platforms choose to play it.

The platforms are most likely to toss all kids overboard. This is almost certainly what the legislators actually want given their antipathy towards the Internet, but it’s not a good outcome for anyone. It hurts the kids by depriving them of valuable social outlets and educational resources; it hurts adults by requiring age (and likely identity) verification to sort the kids from adults; and the age/identity verification hurts both kids and adults by exposing them to greater privacy and security risks. I explain all of this in my post on AB 2273 (the AADC), which redundantly also would require platforms to authenticate all users’ ages to avoid business-ending liability.

If platforms try to cater to kids, they would have to rely on an affirmative defense that hands power over to a censor (euphemistically called an “auditor” in the bill) who can declare that any feature is addictive, requiring the platform to promptly remove the feature or face business-ending liability. Handing control of publication decisions to a government-designated censor is as disrespectful to the Constitution as it sounds.

What the Bill Says

Who’s Covered? 

The bill defines “social media platform” as

a public or semipublic internet-based service or application that has users in California and that meets all of the following criteria:
(A) A substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application.
(B) A service or application that provides email or direct messaging services shall not be considered to meet this criterion on the basis of that function alone.
(C) The service or application allows users to do all of the following:
(i) Construct a public or semipublic profile for purposes of signing into and using the service.
(ii) Populate a list of other users with whom an individual shares a social connection within the system.
(iii) Create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.

I critiqued similar language in my AB 587 blog post. Putting aside its clunky drafting, I assume this definition reaches all UGC services, subject to the statutory exclusions for:

  • email and direct messaging services (the bill doesn’t define either type).
  • tools that allow employees and affiliates to talk with each other (Slack, perhaps?).
  • businesses earning less than $100M/year in gross revenues. See my article on defining Internet service size for the pros and (mostly) cons of revenue metrics.
  • “A social media platform whose primary function is to allow users to play video games.” This is so interesting because video games have been accused of addicting kids for decades, but this bill would give them a free pass. Or maybe the legislature plans to target them in a bill sequel? If the legislature is willing to pass this bill, no business is safe.

What’s Restricted?

This is the bill’s core restriction:

A social media platform shall not use a design, feature, or affordance that the platform knew, or which by the exercise of reasonable care should have known, causes child users to become addicted to the platform.

Child = under 18.

Addiction is defined as: “(A) Indicates preoccupation or obsession with, or withdrawal or difficulty to cease or reduce use of, a social media platform despite the user’s desire to cease or reduce that use; and (B) Causes physical, mental, emotional, developmental, or material harms to the user.”

The restriction excludes third-party content and “passively displaying” that content (as we’ve discussed repeatedly, “passively publishing content” is an oxymoron). Parents cannot waive the bill’s liability for their kids.

The Affirmative Defense. The bill provides an affirmative defense against civil penalties if the platform: “(1) Instituted and maintained a program of at least quarterly audits of its practices, designs, features, and affordances to detect practices or features that have the potential to cause or contribute to the addiction of child users. [and] (2) Corrected, within 30 days of the completion of an audit described in paragraph (1), any practice, design, feature, or affordance discovered by the audit to present more than a de minimis risk of violating this subdivision.” Because the defense would negate some, but not all, potential remedies, it doesn’t help as much as it should.

Problems with the Bill

Social Media Benefits Minors

The bill enumerates many “findings” about social media’s evilness. The purported “findings” are mockably sophomoric because each fact claim is easily rebutted or disproven. However, they are a tell about the drafters’ mindset. The drafters approached the bill as if social media is never legitimate, which explains why the bill would nuke social media. Thus, with zero self-awareness, the findings say: “California should take reasonable, proportional, and effective steps to ensure that its children are not harmed by addictions of any kind.” The bill’s response is neither reasonable nor proportional–and it would be “effective” only in the sense of suppressing all social media activity, good and bad alike.

Of course, everyone (other than the bill drafters) knows that social media has many benefits for its users, adults and children alike. For example, the infamous slide showing that Instagram harmed the self-image of 20% of teenage girls also showed that it benefited 40% of them. Focusing on the 20% by eliminating the 40% is a policy choice, I guess. However, millions of unhappy Californian voters will be shocked by the legislature’s casual disregard for something they value highly and care about passionately.

The Age Authentication Problem

The bill imposes liability for addicting children, but it doesn’t define when a platform knows that a user is a child. As I’ve discussed with other performative protect-kids-online bills, any attempt to segment kids from adults online doesn’t work because there’s no great method for age authentication. Any age authentication solution will set up barriers to moving around the Internet for both adults and children (i.e., welcome to our site, but we don’t really want you here until we’ve authenticated your age), will make classification errors, and will expose everyone to greater privacy and security risks (which counterproductively puts kids at greater risk). If users have a persistent identity at a platform (necessary to avoid redundantly authenticating their ages on each visit), then age authentication requires identity authentication. Identity authentication expands the privacy and security risks (especially for minors) and subverts anonymous/pseudonymous Internet usage, which hurts users with minority characteristics and discourages critical content and whistleblowing. So protecting “kids” online comes with a huge package of unwanted consequences and tradeoffs, none of which the bill acknowledges or attempts to mitigate.

Another option is that the platform treats adults like kids, which I’m sure the bill drafters would be just fine with. However, that highlights the bill’s deceptive messaging. It isn’t really about protecting “kids.” It’s really about censoring social media.

Holding Manufacturers Liable for Addiction

This bill would hold platforms liable for addicting their customers–a very, very rare liability allocation in our legal system. Consider other addictions in our society. Cigarette manufacturers and retailers aren’t liable for the addictive nature of nicotine. Alcohol manufacturers and retailers aren’t liable for alcohol addiction. Casinos aren’t liable for gambling addiction. Those vices may be restricted to adults (but remember parents can’t waive 2408 for their kids), but virtually every marketplace product or service can “addict” some of its most fervent customers without facing liability. This bill seemingly opens up a major new frontier in tort law.

The Causation Problem

The bill sidesteps a key causation problem. If a practice is standard in the industry and a user uses multiple platforms, how do we know which platform caused the addiction? Consider something like infinite scrolling, which is used by many platforms.

This problem is easy to see by analogy. Assume that a gambling addict started gambling at Casino A, switched loyalty to Casino B, but occasionally gambles at Casino C. Which casino caused the addiction?

One possible answer is to hold all of the casinos liable. Or, in the case of this bill, hold every platform liable so long as the plaintiff can show the threshold condition of addiction (“preoccupation or obsession with, or withdrawal or difficulty to cease or reduce use of, a social media platform despite the user’s desire to cease or reduce that use”). But this also means platforms could be liable for addictions they didn’t “cause,” at least not initially.

The Impossibility of Managing the Liability Risk

There’s a fine line between standard product marketing–where the goal is to increase consumer demand for the product–and causing customers to become addicted. This bill erases the line. Platforms have no idea which consumers might become addicted and which won’t. There’s no way to segregate the addiction-vulnerable users and treat them more gently.

This means the platform must treat all of its customers as eggshell victims. Other than the affirmative defense, how can a platform manage its legal exposure to a customer base of possibly millions of California children, any one of whom may be an eggshell? The answer: it can’t.

The unmanageable risk is why platforms’ dominant countermove to the bill will be to toss children off their service.

The Affirmative Defense

Platforms that don’t toss children overboard will rely on the affirmative defense. The affirmative defense is predicated on an audit, but the bill provides no details about the auditor’s credentials. Auditor-censors don’t need any specific certification or domain expertise. In theory, this permits self-auditing. More likely, it sets up a race to the bottom where platforms retain auditor-censors based on their permissiveness. This would turn the audit into a form of theater: everyone plays their statutory part, but a permissive auditor-censor nevertheless greenlights most features. In other words, auditing without certification doesn’t create any benefits for anyone.

If the auditor-censor’s report comes back clean, the platform has satisfied the defense. If the report doesn’t come back clean, the 30-day cure period is too short to fix or remove many features. As a result, platforms will necessarily run all potential site changes by their auditor-censor before launch to preempt getting flagged in the next quarterly report. Thus, every quarterly report should come back clean because any potential auditor-censor concerns were resolved beforehand.

The affirmative defense mitigates civil penalties, but it does not address any other potential remedies created by the bill, including injunctive relief and criminal sanctions. As a result, the incomplete nature of the affirmative defense doesn’t really provide the legal protection that platforms need. This will further motivate platforms to toss kids overboard.

Section 230 Preemption

The bill has a savings clause excluding any claims covered by Section 230, the First Amendment, and the California Constitution’s equivalent. That’s great, but what’s left of the bill after Section 230’s preemption? At their core, platforms are remixing third-party content, and any “addiction” relates to the consumption of that content. The bill tries to carve out third-party content, but presenting third-party content is essentially all that platforms do. Thus, the bill should squarely fall within Section 230’s preemption.

Constitutionality

If platforms conduct the audit theater, the auditor functions as a government-designated censor. The auditor-censor’s report is the only thing potentially protecting platforms from business-ending liability, so platforms must do whatever the auditor-censor says. This gives the auditor power to decide what features the platforms publish and what they don’t. For example, imagine a government-designated censor at a newspaper, deciding if the newspaper can add a new column or feature, add a new topical section, or change the size and layout of the paper. That censor overrides the publisher’s editorial choices of what content to present and how to present it. This bill does the same.

There are also the standard problems about who is and isn’t covered by the bill and why they were included/excluded, plus the typical Dormant Commerce Clause concern.

I’ll also note the serious tort doctrine problems (like the causation problem) and questions about whether the bill actually benefits any constituency (especially with the audit theater). Even if the bill gets lesser constitutional scrutiny, it still may not survive.

Conclusion

Numerous lawsuits have been filed across the country premised on the same theory underlying this bill, i.e., that social media addicts kids. Those lawsuits will run into tort law, Section 230, and constitutional challenges very soon. It would make sense for the California legislature to see how that litigation plays out and discover what room, if any, is left for the legislature to regulate. That would save taxpayers the costs of the inevitable, and quite possibly successful, court challenge to this bill if it passes.