Section 230 Helps Discord Defeat “Defective Design” Claims Regarding Sexual Predation–Jane Doe v. Discord

This is another entry in the genre of “predator access” cases claiming that predators solicited minors for sex online, in this case on Discord. Many predator access cases have targeted Roblox, which has a pending MDL in California consolidating dozens of cases. Some of those plaintiffs have also named Discord. The plaintiffs tried to get this case moved to the Northern District of California so that it could proceed in parallel with the Roblox MDL, but the court refused that request. Instead, the court hands Discord a decisive win per Section 230.

The court starts off with this broad proposition:

Section 230 compels dismissal of claims seeking to hold platforms liable for activity amounting to sexual exploitation of one user by another when the factual predicate is that the two users engaged in messaging using the platform’s service. [cites to Doe v. Grindr, Doe v. MySpace, Doe v. Snap, Doe 1 v. Backpage, Doe II v. MySpace, In re Facebook, LW v. Snap, Doe v. Grindr (S.D. Fla.), Doe v. Kik]

Negligence

The plaintiff tried the standard set of arguments that Discord was defectively designed because it didn’t adhere to the plaintiff lawyers’ vision of how services should operate:

Plaintiff’s “Negligence” claims seek to impose liability on Discord for (i) designing its messaging service to facilitate harmful private communications; (ii) allowing “unsupervised” messaging between users; (iii) failing to require phone number verification or otherwise “screen users”; (iv) failing to “implement … parent controls” and “parental notifications” that would monitor and supervise messages; (v) failing to remove user profiles and block messages from adults who message teens; (vi) failing to set default safety settings that would block messages between unconnected users; (vii) offering an “open chat function” without sufficient moderation; and (viii) failing to “monitor for, report and prevent the use of [its] app[ ] by sexual predators.”

The court says all of those configuration choices are editorial choices protected by Section 230:

These claims each amount to Plaintiff seeking to impose a duty on Discord to monitor, screen, and block Plaintiff’s communications with other Discord users. All of these duties would require Discord to alter or amend how it publishes, monitors, screens, flags, blocks, or removes users’ messages and profiles, including how it offers to its users “neutral tools” that allow users to communicate in different chat forums and formats. [cite to Jones v. Dirty World (6th Circuit)]

Notice how this court implicitly veers away from the social media addiction rulings in California, and numerous other precedents, holding that design choices can be agnostic about the content they apply to and therefore are not “based on” third-party content.

Strict Liability

The court treats the products liability claim the same as the negligence claim. The plaintiff complained about the following practices:

The Complaint faults Discord for providing a service that “allow[s] children to come into contact with child predators,” and asserts that Discord should provide “[e]ffective parental controls” to stop harmful message exchanges; reconfigure features to “block[ ] direct messaging between child and adult users”; block content from “known abusers”; and offer a more restrictive “[c]ontrolled chat” option.

The court responds that these claims “would require Discord to more perfectly screen for and block harmful messages and alter the operation of the neutral tools it provides users to send messages,” which Section 230 does not permit.

Concealment/Failure to Warn

The court says the concealment/failure to warn claims also second-guess Discord’s editorial decisions. The court says:

Courts cannot accept attempts to repackage what are in actuality “publisher” actions as “torts of omission” to evade Section 230.

Thus, “these allegations appear to be simply a restatement of Plaintiff’s negligence claims and product liability claims already found to be barred by Section 230.” Put another way, the only way that Discord could address these aspects of its platform would be “to take certain moderation actions” that would eliminate the alleged discrepancy between Discord’s description of its moderation efforts and the “reality” of its moderation – again, “publishing” actions. [cite to Bride v. YOLO]

Failing to warn users that Discord is a “dangerous” app “is at root a claim based on ‘publication’ choices related to moderation efforts, which fall within the immunity provided by Section 230.” Cites to Bride v. YOLO, Doe v. Grindr, Wozniak v. YouTube.

The court also questions whether there was any actual omission: “Discord does disclose and issue transparency reports that – as is the case with any platform that handles an immensely high volume of messages each day – do show that its content moderation efforts are imperfect.”

Misrepresentation

Plaintiff’s claims seek to hold Discord liable for alleged “misrepresentations” by failing to conform its content moderation standards – based on what amounts to its general “aspirational” standards of seeking to provide a platform “safe for minors” – to a level defined by Plaintiff. [The Grindr court distinguished] claims based on actual specific and defined contractual promises [from] general aspirational goals regarding platform content moderation.

The litigation over “safe” content moderation is decades-old and completely confused.

Third-Party Content

Nowhere in Plaintiff’s Complaint does it accuse Discord of creating the offensive messaging, but rather the Complaint seeks to hold Discord liable for facilitating – or failing to moderate – sexually exploitative offensive messaging created by others. The fact that Discord may have provided the “tools” by which Plaintiff and her alleged abusers exchanged messages, to “carry out what may be unlawful or illicit” does not make Discord a “content provider,” but rather treats Discord as a “publisher” of (offensive) messaging created by third parties.

Implications

A reminder that sexual predation cases involve heartbreaking facts. Section 230 often arises in tragic circumstances.

The Section 230 jurisprudence is coming apart at the seams, as this ruling illustrates. I think this court got it right and the disagreeing courts got it wrong. However, there is now enough precedent on both sides of every issue to vex everyone. This opinion carefully prioritized appellate rulings, which have largely rejected the design-defect workarounds to Section 230. However, many more design-defect cases are heading to appellate courts across the country, and any appellate deviation in any one of them will tear Section 230 even further apart.

Case Citation: Jane Doe v. Discord Inc., 2026 WL 1067574 (N.D. Ohio April 20, 2026). The complaint.