Everything You Wanted to Know About the Moody v. NetChoice Supreme Court Opinion

Normally, when a major Internet Law development occurs, I write two posts. First, I write up a quick hit for the media. See my short statement on the Moody v. NetChoice decision. I then follow up with a comprehensive blog post–often a 5k+ word post that can take me 20+ hours to write.

In advance of the Moody ruling, the editors of the Cato Supreme Court Review asked me to submit a book chapter based on my inevitable comprehensive post. This led to a 10,000-word chapter, “Speech Nirvanas” on the Internet: An Analysis of the U.S. Supreme Court’s Moody v. NetChoice Decision, that I’m now excited to share with you.

I turn in the chapter to the book editors next week, so it’s still in draft form. I would be grateful for any feedback you have. Please email me your comments.

* * *

Outtakes on Justice Alito’s Concurrence

The chapter has a strict word count cap of 10,000 words. That led to many cuts from my rough draft (in contrast, I don’t have any self-imposed word count caps on my blog posts!). Here are some outtakes from my analysis of Justice Alito’s concurrence. TL;DR: his concurrence sucked (I have more criticisms in the full paper).

Justice Alito favorably cites Packingham v. North Carolina, 582 U.S. 98, 107 (2017), for the proposition that social media has become the “modern public square.” However, in his concurrence in that case, he called that phrase “undisciplined dicta.” Credit to Daphne Keller for pointing out his inconsistency.

Justice Alito writes that “research suggests that social media are having a devastating effect on many young people, leading to depression, isolation, bullying, and intense pressure to endorse the trend or cause of the day.” Alito slip opinion at 2. However, Texas and Florida did not justify their social media censorship laws on the basis that they would benefit “young people.” For example, Florida’s law doesn’t contain a single reference to “child,” “minor,” or “teen.” Texas’ law contains a single reference to children, but only with respect to platforms’ ability to honor takedown requests from organizations with “the purpose of preventing the sexual exploitation of children” and other sexual abuse victims.

If Florida, Texas, and Justice Alito are really concerned about children, they’ve chosen a terrible “solution.” The social media censorship laws would severely hinder or eliminate social media platforms’ ability to engage in the content moderation necessary to protect young people from depressing or bullying online interactions.

Justice Alito writes: “‘Content moderation’ is the gentle-sounding term used by internet platforms to denote actions they take purportedly to ensure that user-provided content complies with their terms of service and ‘community standards.’ The Florida law eschews this neologism and instead uses the old-fashioned term ‘censorship.’” Alito slip opinion at 5.

“Censorship” traditionally refers to content restrictions by the government, not private entities, so Florida’s use of the term to describe private editorial decisions is less precise than the “neologism.” It’s unclear why Justice Alito thinks “content moderation” is “gentle-sounding” or why content moderation only “purportedly” ensures compliance with the platform’s editorial standards, but he uses these gratuitous qualifiers to imply that content moderation is somehow nefarious instead of essential to the Internet’s proper functioning.

The opinion initially contained a painful typo where Justice Alito referred to “§203 of the Communications Decency Act of 1996” when he meant Section 230. Alito slip opinion at 7. I emailed this correction to the court, which has since made the fix. (I doubt I was the only person to submit the correction).

The opinion says “NetChoice appears to represent all—or nearly all—regulated parties,” but Justice Alito made that statement without any supporting citation. Alito slip opinion at 14 n.14. In fact, no one (including Justice Alito) knows for certain who is governed by the laws. Indeed, given the size-based thresholds for who qualifies as a regulated social media platform, the list of regulated entities could change daily. For more on the challenges with size-based definitions of Internet services, see this piece.

The opinion says “The typical newspaper regulates the content and presentation of articles authored by its employees or others, but that same paper might also run nearly all the classified advertisements it receives, regardless of their content and without adding any expression of its own.” Alito slip opinion at 18. This is an afactual hypothetical. Old-school newspapers didn’t run nearly all of the ads they received. They promulgated and enforced their own house rules for classified advertising. We discuss this in Chapter 17 of our Advertising Law book.

With respect to the compelled editorial disclosures, the opinion says “Various platforms already make similar disclosures [about their content moderation practices]—both voluntarily and to comply with the European Union’s Digital Services Act—yet the sky has not fallen.” Alito slip opinion at 27. Justice Alito’s celebration of the DSA is premature at best. The DSA just went into effect in February 2024. Furthermore, not all entities regulated by the Florida and Texas laws are required to comply with the DSA in full (or at all); the details of the legal compulsions (or voluntary disclosures) differ in important ways; and the EU and US legal contexts (such as enforcement mechanisms) vary substantially in ways that can lead to different outcomes even if the substantive obligations are similar.

* * *

Case library (see also NetChoice’s library) [note: I have not comprehensively maintained this list following the 11th Circuit ruling]