The Ninth Circuit Finds Two New Ways to Undermine Section 230: Doe v. Twitter
The Ninth Circuit swiss-cheesed Section 230 with two new exceptions. 🧀
* * *
The district court dismissed the victims' case against Twitter. My prior blog post.
The Ninth Circuit panel summarizes its ruling:
Though expansive, there is nuance to § 230 immunity. Here, we conclude that Twitter is immune from liability on Plaintiffs' claim that it knowingly benefitted from sex trafficking and on their product-defect claim based on Twitter's failure to remove posts under review as being child pornography and its creation of search features that amplify child-pornography posts. These claims hinge on Twitter's role as a publisher of third-party content, which triggers § 230. But Plaintiffs' claims for negligence per se and their product-liability theory based on defective reporting-infrastructure design are not barred by § 230 immunity because they do not arise from Twitter's role as a publisher.
The FOSTA/230 Interplay
In general, the FOSTA claims are preempted by Section 230, as indicated by Doe v. Reddit and Doe v. Grindr. The court explains:
Plaintiffs' thinking is there can be no monitoring duty if the posts are already on Twitter's radar. But contrary to Plaintiffs' reasoning, their theory for liability imposes a monitoring obligation. Plaintiffs allege that, given Twitter's advertising structure and other revenue-generating activities, "[a]s long as content on Twitter's platform remains live, Twitter monetizes that content." With that alleged one-to-one relationship between posted content and Twitter monetizing that content, the only way for Twitter to avoid the unlawful benefit from hosting child pornography would be to remove third-party posts – a quintessential publishing activity.
In my previous post, I wrote "FOSTA was not designed as an anti-CSAM law, so the plaintiffs' claims don't really fit the legal doctrine." The court explains:
Twitter's failure to respond to demands to remove the videos is not the type of "affirmative conduct" that constitutes "assistance, support, or facilitation" of sex trafficking for which § 1591 attaches criminal (and, correspondingly, civil) liability… While we understand the logic of Plaintiffs' argument that continuing to make available known child pornography is tantamount to facilitating sex trafficking, that reasoning fails under our prior holding that merely turning a blind eye to illegal revenue-generating content does not establish criminal liability under § 1591.
The court also notes a causation problem:
We require "a causal relationship between affirmative conduct furthering the sex-trafficking venture and receipt of a benefit." Generic advertising revenue schemes that apply sitewide do not satisfy that causal demand. But that is Plaintiffs' theory of knowing benefit here: Twitter profits from all the posts on its website, it knew the posts at issue here contained child pornography, and therefore it knowingly benefited from a child-pornography trafficking venture.
"Reporting Mechanism Architecture" Duty
After running through a standard FOSTA/230 analysis and concluding that Section 230 applied to the FOSTA claims, the court then turns to the various ancillary negligence claims that the plaintiffs alleged.
"Plaintiffs allege that Twitter makes it too difficult to report child pornography that is posted on Twitter." The court says this claim isn't preempted by Section 230 due to Lemmon v. Snap:
This aspect of Plaintiffs' design-defect claim relates solely to product design, and in that way, is analogous to the products-liability claim that we allowed to proceed in Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021). The plaintiffs in Lemmon challenged a Snapchat filter that showed the speed a user was traveling, which the plaintiffs alleged encouraged reckless driving. When a young driver used the speed filter shortly before a fatal crash, his parents sued for negligent design. We allowed the claim to proceed, reasoning that it turned on Snap's design architecture rather than the publication of any content. We underscored that the plaintiffs' claim treated Snap as a "product designer" rather than a "publisher or speaker" because "Snap could have satisfied its 'alleged obligation' . . . without altering the content that Snapchat's users generate."
So too here. Twitter could fulfill its purported duty to cure reporting infrastructure deficiencies without monitoring, removing, or in any way engaging with third-party content. This claim thus does not seek to hold Twitter responsible as a publisher or speaker. Increased removal of third-party content may well be the outcome of a more robust reporting structure. But a claim alleging a duty that does not treat a defendant as a publisher is not barred by § 230, even if that legal duty "might lead a company to respond with monitoring or other publication activities." HomeAway.com, 918 F.3d at 682. "We look instead to what the duty at issue actually requires: specifically, whether the duty would necessarily require an internet company to monitor third-party content." Here, Twitter's improvement of its reporting mechanism – for example, by allowing people to report child pornography sent via private messaging – would not necessarily require Twitter to monitor third-party content.
I can't believe what I'm reading. This is so confused. Some of the OBVIOUS problems with this line of thinking:
- Why would plaintiffs be reporting CSAM? To have it depublished. It's bafflingly tendentious to say that the reporting mechanism is somehow independent of the publishing mechanism. The whole point of submitting notices is to override the publisher's decisions.
- This is especially true with CSAM, where notice of CSAM publication FORCES the service to remove the content (and report it to NCMEC) to avoid CRIMINAL liability. So in practice virtually every notice of CSAM will cause content to come down, even items that are not necessarily CSAM.
- Every plaintiff will ALWAYS claim that the service had a "deficient reporting infrastructure." After all, they are suing because the service didn't take down the targeted content, and the plaintiffs can always claim that the service would have responded differently if only the service had a proper "reporting infrastructure." In other words, the Ninth Circuit has created another guaranteed fast lane for getting around Section 230 that every plaintiff can use/misuse.
- The court improperly summarizes the Lemmon case, saying that Section 230 stepped aside because of "Snap's design architecture." The phrase "design architecture" appears nowhere in the Lemmon opinion, and that phrase twists what the court said. The opinion says that the content authoring tool motivated users to engage in risky behavior, even if they never published content. Here, the lawsuit comes about only because content has already been published, so the court has turned the Lemmon precedent on its head.
It's possible the court thought it was doing one of the famous Ninth Circuit Section 230 switcheroos, i.e., providing a fast lane around Section 230 based on a claim that will fail anyway, in which case the Ninth Circuit can claim justice was served by getting the dismissal based on a different legal doctrine but still achieve the same result as if Section 230 applied. (See the appendix below for a roundup of the Ninth Circuit's history of Section 230 switcheroos).
On the surface, that switcheroo motivation seems plausible. In general, services don't have a legal "duty" to provide any reporting mechanism at all. Therefore, if their "reporting infrastructure" is allegedly deficient, the claim should fail for a lack of duty.
The claim should also fail because there's no breach of any duty. "Doe #1 filed a complaint through Twitter's content-reporting interface, and Twitter instructed him to send a copy of his identification to confirm he was the person in the reported video." So it sounds like the plaintiff was able to report the CSAM sufficiently to trigger Twitter's acknowledgement. What more was required from Twitter to fulfill the so-called "reporting mechanism architecture" duty? In other words, were the plaintiffs really complaining that the reporting mechanism wasn't properly designed, or that Twitter didn't act properly in response to the report? The evidence casts doubt on the former, and the court expressly says Section 230 precludes the latter.
Further, the putative Section 230 switcheroo encounters a new complexity: the Take It Down Act. The CSAM at issue surely would be covered by the Take It Down Act's notice-and-removal scheme, plus the Take It Down Act requires regulated services to provide a reporting mechanism for Take It Down Act-governed materials. Here's the statutory language:
(2) NOTICE OF PROCESS.–A covered platform shall provide on the platform a clear and conspicuous notice, which may be provided through a clear and conspicuous link to another web page or disclosure, of the notice and removal process established under paragraph (1)(A) that–
(A) is easy to read and in plain language; and
(B) provides information regarding the responsibilities of the covered platform under this section, including a description of how an individual can submit a notification and request for removal.
For negligence purposes, does this law establish a "duty" to provide a reporting mechanism and define the design attributes required to satisfy the duty? Again, it's unclear that Twitter didn't already do enough of this to satisfy any alleged duty. Of course, plaintiffs can always tendentiously argue over things like whether the disclosure was "easy to read" or "in plain language," so there's plenty to fight over if there's a private right of action enforcing the Take It Down Act's reporting mechanism.
But… the Take It Down Act does not provide for a private right of action. Violations of the statutory reporting mechanism requirements are enforced exclusively via the FTC. Though it remands the case, this ruling implies that plaintiffs might use a negligence claim to establish a private right of action for the Take It Down Act AND get around Section 230.
What did the Ninth Circuit say about the intersection of Twitter's allegedly negligent reporting mechanism architecture and the Take It Down Act? Not a word. The Ninth Circuit opinion DOESN'T MENTION THE TAKE IT DOWN ACT AT ALL. In effect, the Ninth Circuit implies there may have been a common law duty to provide easy CSAM reporting irrespective of the Take It Down Act, but maybe that duty got overridden by the Take It Down Act? So many questions, no answers. Instead, we have no idea how the panel thinks these doctrines interact, or even if it understood the collisions. The district court will have to sort through the mess this panel created.
[Note: there may be other laws that require services to provide reporting functions, including the DMCA 512 safe harbor, the EU's DSA, the Florida and Texas social media censorship laws, and more. I don't think this panel really thought through or understood the implications of opening the door to all of those.]
This issue also looks like it would be appropriate for en banc review. However, if en banc review is granted, the en banc panel could potentially overturn the Reddit and Grindr opinions and throw Section 230 into even more turmoil.
"Removal Pending Review" Duty
Plaintiffs also complain that Twitter "failed to block reported [child pornography] while [it] was investigated and enabled reported [child pornography] to continue to be massively disseminated." This product-defect theory is barred by § 230. Plaintiffs are attempting to hold Twitter liable for its failure to remove (even if automatically) harmful third-party content.
True, and yet, in light of the liability triggered by notice of CSAM and the automatic removal that ought to follow every notice, is this really different from the "reporting mechanism architecture" duty?
Also, note that framing this as a design defect is disingenuous. The plaintiffs were just repackaging publication decisions as "design" choices. The court sees through it here, but this is a sign of the doctrinal sleight-of-hand built into virtually every design claim against publishers.
Amplification
Plaintiffs "allege that sex traffickers create and use certain hashtags to signal child pornography. Plaintiffs allege that Twitter's search function defectively responds to these nefarious hashtags, aiding consumers of child pornography in finding the illegal content that they are looking for." Citing Dyroff and MP v. Meta (from the Fourth Circuit; interesting to see the Ninth Circuit acknowledge it), the court responds:
we hold that Twitter is immune from liability for the alleged third-party abuses of its hashtag and search functions. Distinguishing between innocent #ParisOlympics-type hashtags and the more nefarious ones would require Twitter to act as a publisher. Notwithstanding Plaintiffs' allegation that "Twitter has the ability to, and in fact does, block certain hashtags," deciding when to take that step is a publisher decision. Otherwise, upon designing neutral hashtag and search tools, Twitter would be required to monitor and act when its users adapt them to illicit ends.
NCMEC Reporting Duty
The plaintiffs claimed that Twitter committed negligence per se by failing to make a timely report to NCMEC. The court says Section 230 doesn't preempt this claim:
Twitter argues that its reporting duty arises because its platform allows third parties to upload content. That may well be true. Twitter "is an internet publishing business" and "publishing content is 'a but-for cause of just about everything' [Twitter] is involved in." But that is not the test. And the facts alleged here, coupled with the statutory "actual knowledge" requirement, separates the duty to report child pornography to NCMEC from Twitter's role as a publisher.
Plaintiffs do not claim that Twitter must scour its platform for content triggering its NCMEC-reporting duty. They do not even claim that Twitter must review reported child pornography. Rather, they allege that once Twitter has obtained actual knowledge of such content, as evidenced by its representation that it had "reviewed the content," it had a legal duty to promptly report that content to NCMEC. Because that duty neither requires Twitter to monitor content nor take any action associated with publication (e.g., removal) once it learns of the objectionable content, § 230 does not immunize Twitter from Plaintiffs' negligence per se claim.
To be fair, the NCMEC reporting duty can exist independently of any publisher/speaker duty. For example, a service would have an obligation to report CSAM if it discovered the file in private cloud storage, even if the file had never been shared or distributed to anyone. That provides some justification for excluding the claim from Section 230.
However, this ruling is problematic on two fronts. First, it encourages plaintiffs to bring lawsuits based on any irregularities with the NCMEC filing, including any delays. (The statute requires reporting "as soon as reasonably possible," and plaintiffs can always claim that services' filings took longer than that). Reporting mistakes happen, and plaintiffs now have another vector to litigate over mistakes.
Second, the statute is unambiguous: "The Attorney General shall enforce this section." In other words, by framing their claim as an allegedly negligent design "defect," the plaintiffs are trying to manufacture a private right of action for a statute that clearly doesn't provide one. Like with the so-called "reporting mechanism architecture" duty, the panel didn't deal with that potential conflict and simply declared the statute categorically excluded from Section 230.
Implications
I wonder if the panel thought it was issuing a decisive win for Twitter. After all, Section 230 preempted all of the plaintiffs' claims except for two, and the panel probably anticipated that the two claims it left open would fail for lack of merit on remand.
But what the panel really did was: (1) add two more fast lanes that virtually every plaintiff can use to get around Section 230 and potentially get into expensive and troublesome discovery; (2) further undermine Section 230's structural integrity; (3) mischaracterize its Lemmon v. Snap opinion in ways that make it easier for plaintiffs to miscite and misuse; and (4) ensure collisions between this ruling and federal statutory remedial schemes that don't appear to authorize private rights of action. With respect to the last point, many services have proceeded on the idea that negotiating with government enforcers is manageable, but private rights of action (PRAs) are not. If plaintiffs can end-run such remedial schemes by claiming negligence per se based on any statutory duty, then there are more PRAs than services think there are.
Given all of those problems, I think this case would benefit from en banc review.
Despite the awful new Section 230 exclusions, this ruling otherwise casts a shadow over the social media addiction cases. The plaintiffs in those cases will celebrate the mischaracterization of the Lemmon v. Snap opinion, but the court also expressly shuts down claims over "product design" in ways that might implicate their claims as well.
Case Citation: John Doe 1 v. Twitter, Inc., 2025 WL 2178534 (9th Cir. Aug. 1, 2025)
* * *
APPENDIX ON SECTION 230, THE NINTH CIRCUIT, AND SWISS CHEESE
__
The Ninth Circuit's Ongoing Swiss-Cheesing of Section 230. Section 230 has been on a long declining trend in the Ninth Circuit. Take a look at this history of the Ninth Circuit adding exclusions to Section 230 (I'm sure I'm forgetting some):
- Batzel v. Smith: exclusion for third-party content not intended for publication
- Roommates.com: exclusions for (1) encouraging illegal content, (2) requiring the input of illegal content, and (3) materially contributing to content illegality. Recall the tortured history of that case: the initial panel decision prompted en banc review and a new en banc opinion, which was later rendered dicta by further proceedings.
- Barnes v. Yahoo: exclusion for promissory estoppel (expanded in this case to all contract claims). The panel had to amend its initial opinion.
- Doe v. Internet Brands and Beckman v. Match.com: exclusion for failure-to-warn. The panel had to completely replace its initial Internet Brands opinion.
- HomeAway v. Santa Monica: exclusion for consummating third-party transactions.
- Enigma v. Malwarebytes: exclusion when the plaintiffs allege "anti-competitive animus." This case went back to the Ninth Circuit.
- Gonzalez v. Google: exclusion for funding third-party content. I'm not sure what's left of this exclusion after the Supreme Court ruling.
- Lemmon v. Snap: exclusion for "negligent design" (when not based on third-party content, though the plaintiffs are conveniently ignoring that).
- Vargas v. Facebook: exclusion for discriminatory ad targeting. Non-precedential opinion.
- Quinteros v. Innogames: exclusion for moderatorsâ activities. Non-precedential opinion.
- Diep v. Apple: exclusion for first-party marketing representations (did this case survive the Calise case?). Non-precedential opinion.
__
Since I wrote that blog post, I would add the following additional swiss-cheese holes to the list:
- Calise v. Meta: Exclusion for contract-based claims (though not all courts agree with that exclusion).
- Estate of Bride v. YOLO: Exclusion for any site disclosure.
- Doe v. Twitter: Exclusions for alleged breaches of a "reporting mechanism architecture" duty and NCMEC reporting duty.
🧀
* * *