A Roundup of Recent Section 230 Decisions Involving Sex Abuse or CSAM

Another lengthy blog post rounding up cases from the past few months involving CSAM or commercial sex and Section 230/FOSTA.

Doe #1 v. MG Freesites, Ltd., 2025 WL 1314179 (N.D. Ala. May 6, 2025)

Prior blog posts (1, 2).

Previously in this case, Judge Coogler denied Pornhub’s motion to dismiss, certified a class of plaintiffs, and denied summary judgment to Pornhub. The case got reassigned to a new judge, and Pornhub requested permission to pursue an interlocutory appeal. The new judge denied the request.

Pornhub argued that Judge Coogler incorrectly relied on the (non-controlling) Third Circuit’s Anderson v. TikTok ruling. The court says this isn’t a dispositive issue because “Judge Coogler would have come to the conclusion that Defendants were content providers and thus not entitled to immunity under Section 230 even if he had not considered Anderson.” In the summary judgment ruling, the evidence of Pornhub’s Section 230-disqualifying “material contributions” to the content was that Pornhub creates video thumbnails, it creates and edits CSAM-related tags, and its algorithms help match CSAM distributors with CSAM viewers.

The court also says that there is not a substantial difference of opinion about the Anderson case and there is no circuit split. I disagree strongly with this court’s assessment. The Anderson case expressly acknowledged conflicts with 7 other circuits. Plus, see the K.B. v. Backpage ruling (discussed below), explaining why the Anderson ruling is wrong and not persuasive at all. The Doe v. Webgroup Czech Republic case discussed below also rejects the Anderson ruling.

Pornhub also challenged Judge Coogler’s distinction between disseminating “information,” which is protected by 230, and possessing “contraband,” which is not. (“Judge Coogler reasoned that CSAM ‘is itself a violent crime, a record of that crime, and contraband – not information.’”) The new judge says that Judge Coogler’s ruling was based on the premise that being a receiver or possessor of CSAM isn’t covered by 230, so the “contraband” characterization wasn’t a dispositive ruling. Compare the In re Social Media Addiction case discussed below, which rejects the argument that CSAM possession can be distinguished from third-party content.

Judge Coogler’s rulings have been extremely dubious, and the new judge has to contort to preserve them. This case remains an outlier in Section 230 jurisprudence. I wonder if this case will get a different reception when it finally makes it out of the Alabama district court and goes to an appellate court.

Doe v. Apple Inc., 2025 WL 1266928 (N.D. Cal. May 1, 2025)

The plaintiffs allege that Apple impermissibly stores CSAM in its iCloud storage. Users can store iCloud files using “Advanced Data Protection,” which encrypts the files such that Apple can’t access them. The plaintiffs allege that Apple is “willfully blind” to any encrypted user-uploaded CSAM.

The plaintiffs tried to fit their claims into FOSTA’s Section 230 exception. The court responds: “Plaintiffs’ single conclusory allegation that the perpetrators at the center of this case were commercial sex traffickers is not enough to transform Apple’s alleged conduct into a Section 2421A violation.”

The plaintiffs also tried “Masha’s Law,” 18 USC 2255, which provides a civil remedy for CSAM violations. The court responds: “Does 1-6 v. Reddit, Inc. is directly on point; the Court held that willful ignorance of a provider, without more, cannot sustain a violation of a sex trafficking statute such as §§ 2252 or 2252A….Plaintiffs here have not—and it appears they cannot—allege that Apple knew of the specific five CSAM videos sent to Doe, nor of the CSAM videos that Doe sent back.” And even if the plaintiffs established the requisite statutory knowledge, Section 230 would apply to 2255 claims, i.e., 230 preempts civil claims based on federal criminal violations. Cite to Doe v. Twitter.

The court sidesteps the heavy policy issues raised by the plaintiffs’ lawsuit. The plaintiffs are essentially trying to eliminate online encryption by exposing services to strict liability for any torts or crimes users commit using encryption, on the theory that the service couldn’t “see” the encrypted materials. We should be encouraging the broader use of encryption, not imposing unbounded liability for offering it.

Doe v. Grindr, LLC, 2025 N.Y. Misc. LEXIS 1942 (N.Y. Supreme Ct. April 1, 2025)

This case involves a Grindr user, Weinreb, who allegedly sexually abused an underage Grindr user. The core of the complaint is that Grindr could have age-authenticated its users but didn’t, so it should be liable for harms suffered by underage users. This is an easy Section 230 dismissal:

While plaintiff argues that he, as a minor, should not have been allowed onto the Grindr app and should have been given warnings as to its use, the creation and maintenance of a Grindr profile, in and of itself, did not injure plaintiff. The harm only ensued because plaintiff responded to content posted by a third-party…A claim raising “failure to implement basic safety measures to protect minors … treat[s] [d]efendant[] as a publisher of information … [because it] is ‘inextricably linked’ to [d]efendant[]’s publication of [the sexual assailant’s] messages to [the minor victim].” Thus, CDA § 230 precludes plaintiff’s claims predicated on negligence, design defect and failure to warn since liability cannot exist without implicating Grindr’s role as the publisher or speaker of Weinreb’s content.

Plaintiff’s claims herein are analogous to those rejected by federal courts, including the Court of Appeals for the Ninth Circuit, in matters involving common law claims brought against Grindr by underage users [Cites to Doe v. Grindr 9th Cir. 2025; Doe v. Grindr CDCal 2024; Doe v. Grindr MDFla 2023]. In addition to these federal court decisions specifically involving minors’ use of the Grindr app, many well-reasoned cases dismissed, as precluded by CDA § 230, comparable claims brought by minor victims against other interactive computer services based on negligence, design defect and/or failure to warn theories [Cites to Doe (K.B.) v. Backpage (discussed below); Lama v. Meta; VV v. Meta; Doe v. Snap (SD Tex 2022); LW v. Snap; Doe v. Kik].

The court distinguishes TV v. Grindr, saying it is “unpersuasive in light of the several court decisions mentioned above.”

The court concludes the Section 230 discussion: “plaintiffs claims against Grindr for negligence (fourth cause of action), strict liability (fifth cause of action), failure to warn (seventh cause of action) and breach of implied warranty (sixth cause of action, in part) are precluded by CDA § 230.”

The plaintiff also pointed to Grindr’s TOS, which contains this provision:

NO PERSONS UNDER THE AGE OF EIGHTEEN (18) YEARS (OR TWENTY-ONE (21) YEARS IN PLACES WHERE EIGHTEEN (18) YEARS IS NOT THE AGE OF MAJORITY) MAY DIRECTLY OR INDIRECTLY VIEW, POSSESS OR OTHERWISE USE THE GRINDR SERVICES.

The plaintiff tries to treat this as a warranty that underage users cannot access the service. The court responds: “the plain and obvious meaning of this provision is that no persons under the age of 18 are permitted to view, possess or use the platform.”

As I’ll discuss with the next case (the K.B. case), plaintiffs’ efforts to create an online age authentication obligation via the common law are both unconstitutional and bad policy.

Doe (K.B.) v. Backpage.com, LLC, 2025 WL 719080 (N.D. Cal. March 3, 2025)

The plaintiff claims she was sex-trafficked on Instagram. The court dismissed the Second Amended Complaint in 2024. The Third Amended Complaint fares no better. The court summarizes: “§ 230 immunizes Meta from Doe’s claims because…each of the duties at issue in Doe’s Complaint ultimately seeks to hold Meta responsible based on its role as a publisher of third-party content that it failed to moderate or otherwise remove.”

Given that most of the arguments already failed in the second amended complaint, the most interesting part of the decision is where the court skillfully rejects the Third Circuit’s Anderson v. TikTok ruling’s attempt to override Section 230:

the Court does not read NetChoice as overruling Dyroff. NetChoice does not address Section 230 liability. Instead, NetChoice holds only that a platform’s “editorial judgments” about “compiling the third-party speech it wants in the way it wants” reflects the platform’s “own views and priorities”—and therefore warrants First Amendment protection. Doe’s argument therefore appears to presuppose that editorial decisions cannot be both an expression of a publisher’s point of view (protected under the First Amendment) and a publication of a third-party’s content (protected under Section 230). However, Doe provides no basis on which the Court should conclude that Section 230 immunity is mutually exclusive with First Amendment protection. Indeed, the undisputed core of Section 230 immunity protects a website’s moderation decisions, in its role as a “publisher,” about which third-party content to remove and which to permit. That those moderation decisions are also protected by the First Amendment does not strip them of their Section 230 immunity. To hold otherwise would effectively render the core of Section 230 a nullity, contrary to Congress’s intent and the plain language of the statute. In sum, NetChoice does not provide a basis for revisiting the prior ruling on Doe’s civil sex trafficking claims.

This all seems so obvious to me. I don’t understand how the Third Circuit could have reached any other conclusion. Kudos to this judge for thinking clearly in the face of the terrible precedent.

The plaintiff argued that Instagram should have age-authenticated its users. The court says that the Ninth Circuit Doe v. Grindr ruling forecloses that argument:

Doe’s theory of liability derives from Meta’s alleged failure to include adequate age and identity verification measures that would prevent the creation of fake accounts facilitating sex trafficking communications. The duty to include these challenged features is “not independent” of Meta’s role as the “facilitator and publisher of third-party content” published on the platform. To the contrary, the duty cuts to the core of Meta’s role as a publisher: that is, determining who can and cannot speak on its platform. Doe’s proposed verification requirements are designed with the particular purpose of limiting who will create an Instagram account, and more specifically, limiting the types of content likely to be posted from those accounts (i.e., sex trafficking content). A platform’s decision as to whether to allow anonymous speech, and the consequent effects on the content of the speech that proliferates on the platform, is a classic publication decision. It is a determination of who may speak, and thus, whose content may be barred on the platform.

Again, this is so obvious, I don’t see how anyone can reach a contrary conclusion. Indeed, my entire Segregate-and-Suppress paper is based on this premise: the point of doing the segregation is to suppress. If it’s a UGC service doing the segregating-and-suppressing, it’s taking away the ability of authors to publish their content. This is both a Section 230 AND First Amendment issue. I think other defendants will want to highlight this passage in response to every plaintiff’s segregate-and-suppress argument.

[Also, a reminder that online age authentication requirements have been unconstitutional for decades per the Supreme Court rulings in Reno v. ACLU and Ashcroft v. ACLU. We’ll see if the FSC v. Paxton ruling changes that thinking. Until then, plaintiffs are asking courts to manufacture unconstitutional obligations via the common law. And as my Segregate-and-Suppress paper explains, such obligations are bad policy. They would hurt children, adults, and the Internet generally.]

The court says the Lemmon v. Snap workaround is also foreclosed by the Ninth Circuit’s Doe v. Grindr decision. “Doe is seeking to hold Meta liable for its determination of who can and cannot access its platform to speak in the first place, which cuts to the core of Meta’s role as a publisher. Thus, this claim is barred by Section 230.”

The plaintiff also argued that Instagram should have made it easier to report sexual predators on the site. The court responds:

This duty is again directed at Meta’s role as a publisher. The right from which “the duty springs” is Meta’s role as the alleged publisher of the sex trafficking posts. Doe does not contend, for example, that the manufacturer of the phone she used to access Instagram is required to provide her with a notification mechanism, like a button, that would allow her to report sex trafficking. Instead, Doe alleges that Meta is the one who owes her that duty because Meta “knew or should have known” that Instagram “attracts, enables, and facilitates sex traffickers, and sex traffickers use its Instagram product to recruit and sexually exploit other Instagram users like Jane Doe.” In other words, Meta owes the duty because of the third-party content that Meta allows to be published on its platform, and because Meta has failed to effectively monitor and remove it. As such, Doe is seeking to impose liability against Meta based on its publication of the third-party content, which is precisely the type of claim precluded by Section 230.

The court rejects the plaintiff’s failure to warn arguments based on Bride v. YOLO. E.g., “Doe is seeking to hold Meta liable as a publisher for its failure to warn of the generalized risk of sex trafficking content on its platform, and the use of fake accounts that might facilitate that content.”

This opinion persuasively rejects the standard list of plaintiffs’ demands for publishers to redesign their services (e.g., age authenticate; offer reporting tools). A very well done opinion from Judge Rita Lin.

Doe v. Webgroup Czech Republic, AS, 2025 WL 879562 (C.D. Cal. Feb. 21, 2025)

This is a FOSTA case over CSAM. Prior blog post on this case. This opinion is the court’s dismissal of the Second Amended Complaint.

The court says:

As it relates to Defendants’ platforms allegedly hosting voluminous CSAM and failing to moderate illegal conduct or verify individuals’ ages, these activities are inevitably tethered to Defendants’ statuses as publishers. Maintaining an online platform and reviewing or moderating online content are quintessential activities of Defendants’ publishing role….Accordingly, Section 230 immunizes Defendants from liability stemming from Plaintiff’s allegations that their platforms display CSAM and fail to review or monitor such content.

After that strong start, the court takes a weird turn, saying “Defendants’ encouragement or promotion of CSAM on their platforms, may, perhaps, fall outside the scope of Section 230 immunity because the role of a publisher does not necessarily entail actively encouraging creation of the content.” Say what? All UGC sites encourage their users to submit content. The Roommates.com case explicitly addressed this point: if “you don’t encourage [only] illegal content, or design your website to require users to input [only] illegal content, you will be immune.” I add the “only” qualifiers because that’s what the Ninth Circuit said elsewhere in Roommates.com and it’s how the case has been interpreted. So this court seems to have lost sight of the Roommates.com limitations. As the court later acknowledges (see below), the defendants did not exclusively encourage the uploading of CSAM.

This tangent turns out to be inconsequential:

Defendants’ online tools that Plaintiff challenges do not rise to the level of promotion and encouragement of CSAM. Precedent confirms this point. For example, Plaintiff’s argument—that Defendants’ use of advertisement materially contributes to the unlawful content—mirrors the argument raised in Gonzalez [v. Google]. There, the Ninth Circuit rejected the claim that Google, as the defendant, materially contributed to the unlawful content “by pairing it with selected advertising and other videos.” Nor does Plaintiff allege that Defendants particularly direct their advertising operation with a keen focus toward CSAM; their advertising operation neutrally applies to all online content…

Defendants’ revenue-sharing activities that Plaintiff challenges cannot be divorced from the users’ own conduct of filming and independently disseminating the infringing content. Based on this reasoning, Plaintiff’s claims are indeed directed at third-party’s conduct and do not independently challenge Defendants’ revenue-sharing activities.

The court distinguishes the Anderson v. TikTok precedent. The defendants’ search functions don’t overcome 230 because “Plaintiff only alleges that Defendants’ platforms, like many online service providers, use content-neutral algorithms that then list related searches or user-uploaded, related content.” [Recall that the Anderson v. TikTok opinion was structurally inconsistent on whether it was nuking 230 for all First Amendment-protected expression or treating search functions specially.] This takes the court to the conclusion that “Moody and subsequent circuit decisions like Anderson do not disturb the long-standing precedent that online service providers do not create content or lose Section 230 immunity simply by implementing content-neutral algorithms that generate related searches or user-uploaded content based on the users’ own viewing activity.” This is a pretty generous reading of Anderson given how messy that opinion was.

I trust you can hear me banging my head on my desk at the phrase “content-neutral algorithms,” which is a fabulous example of an oxymoron/null set. Algorithms are NEVER neutral.

The court doubles down on this, saying 230 protects services for “implementing standard and neutral website features.” What does this even mean??? What makes a feature “standard”? Is this like an industry standard? That would be screwed up, as it would discourage services from trying innovative tools that better protect users if doing so would deviate from some hypothetical industry standard. And how can a “feature” ever be “neutral”? The court is getting to the right place, but using vernacular that cannot bear the weight being placed on it. As a result, I read this really as the court outright rejecting the plaintiffs’ overreaching arguments, not as a legally precise delineation of 230’s contours.

The court makes it clear that it doesn’t appreciate the plaintiff’s overreaching arguments when it says:

Plaintiff’s SAC is devoid of any allegations demonstrating that Defendants employ their traditional website features in a manner that distinctly targets or proliferates the presence of CSAM, as compared to other users-uploaded content on their website platforms. Nor does Plaintiff allege that Defendants’ tools encourage or elicit users to post CSAM, compared to users who post other content.

In other words, the defendants run UGC sites. 230 protects UGC sites. That answers all relevant questions.

The FOSTA claim fails per Doe v. Reddit.

In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, 2025 WL 1182560 (N.D. Cal. Feb. 28, 2025)

I have not blogged all of the rulings in this case. First, they are voluminous. Second, they are coming out faster than I can keep up. Finally, the case will be heading up on appeal to the Ninth Circuit, so the district court’s decisions are just the warmup acts.

I’ll focus solely on the allegations regarding CSAM. The court describes the allegations:

each of five individual plaintiffs recounts a harrowing history of sexual abuse suffered while engaging with third parties on Meta’s and/or Snap’s platforms. None of these plaintiffs allege that either social media company encouraged the propagation of their CSAM, but instead that their platforms were used as the vehicle for communication with abusers and for transmission of CSAM. Despite receiving repeated notice of plaintiffs’ CSAM, the platforms allegedly failed to remove the CSAM, report individual users or groups on the platforms that were hosting the CSAM, or otherwise take appropriate corrective action to abate the continued dissemination of plaintiffs’ CSAM.

The court says these claims are barred by Section 230.

The court starts out: “Most courts have held Section 230 bars civil CSAM claims against social media platforms.” The court explains: “The only way for defendants to have avoided ‘possession’ of the CSAM material in the first place is to pre-screen content loaded onto its platforms by third parties, which is a traditional publishing activity. In other words, ‘possession’ and failure to pre-screen or remove are two sides of the same coin for an internet-content provider.”

The outlier plaintiff-friendly case is Doe 1 v. MG Freesites (N.D. Ala. 2022) [the most recent ruling in that case is discussed at the top of this post]. The court says that the Alabama ruling is inconsistent with Doe v. Twitter and other Ninth Circuit precedent. The court also says that the MG Freesites case involved allegations that the defendants encouraged and materially contributed to the CSAM, but in this case, the plaintiffs only argue that “the platforms had actual or constructive knowledge of instances of CSAM on the platforms and failed to properly report that CSAM.” The court concludes that the “difference is material under Section 230. The alleged conduct is subject to Section 230 immunity in a civil suit.”

BONUS: Doe v. Aylo Global Entertainment Inc., 2025 WL 1382196 (C.D. Cal. April 15, 2025). Not a Section 230 case, but this is also a FOSTA case and thus covers the same territory as the other cases in this post. This case relates to the “Girls Do Porn” service, and the court dismisses some of the claims against Aylo/Pornhub.

Some notes from the case:

  • “the FAC does not plausibly allege that the Aylo Defendants advertised Plaintiff knowing that force, threats, fraud, or coercion would be used to cause her to film additional sex trafficking videos with GDP”
  • “Plaintiff also fails to allege that the Aylo Defendants’ advertising activity amounted to an advertising of Plaintiff’s availability for future commercial sex acts.”
  • The statute of limitations barred the claims because the plaintiff waited nine years to file them, and Aylo benefited from the Single Publication Rule. “Plaintiff has alleged that the Aylo Defendants hosted, distributed, and advertised videos originally posted in 2014, which does not constitute a republication.”

* * *

Prior Blog Posts About Grindr