Section 230 Preempts Predator Access Claims Against Apple, Snap, and Verizon–Joan Doe v. Snap

This blog post addresses a specific genre of Section 230 litigation that I call “predator access” cases. The cases involve a victim with an online profile who connects with a criminal who commits sexual abuse. The victim is often underage, and there is a long history of such cases, such as Doe v. MySpace from 2008. However, I think adult cases like Doe v. Internet Brands also fit within the genre.

By their nature, predator access cases involve tragic facts and sympathetic victims, but they are also primarily in the realm of criminal law. The predation violates criminal law, and the criminal justice system needs to find and punish the perpetrator. That context raises the next question: should anyone else be civilly liable for the criminality? It would compound the tragedy to extend the boundaries of liability to reach improper defendants.

Internet services are a but-for cause of the predation crime, but so are many other players. The more insightful question is: exactly how did the services facilitate the predation? Invariably, the answer is that the service allowed the victim and predator to converse with each other. But if that’s all the service did, it’s a textbook Section 230 case, because the victim is trying to hold the service responsible for third-party content.

To get around this clear legal barrier, victims invoke tenuous legal doctrines like “negligent design,” “failure to warn,” and contract breach/false advertising, all to position the case as being about the service’s first-party choices rather than the third-party content. For the most part, these plead-arounds have failed, including in the decision I’m blogging below. However, with the demise of Section 230 around the country, predator access civil cases against services don’t look as futile as they did in the Doe v. MySpace days.

* * *

A mom (Joan Doe) gave her 10-year-old daughter (Jane Doe) an Apple iPhone. The mom signed up for Verizon’s “Smart Family” app. At the daughter’s request, the mom approved installation of Snapchat on the daughter’s phone. Jane and Joan connected on Snapchat. The mom later met Omeire and shared her Snapchat contact information with him, which meant that Omeire could connect on Snapchat with the daughter. Omeire “befriended Jane online and eventually sexually assaulted her when they met in person.” Omeire has been sentenced to 60 years in prison.

Jane sued Snap, Verizon, and Apple, asserting negligence, products liability, and false advertising claims. In effect, Jane sued three layers of the stack: Snapchat as the social media service where contact was made, Verizon as the provider of allegedly insufficient filtering software, and Apple as the app store vending pernicious apps. The companies all successfully defended using Section 230.

Publisher/Speaker Claims. The court forthrightly narrates how plaintiffs routinely try to obfuscate the publisher/speaker inquiry:

this prong is sometimes easily muddled or confused: the determination is not whether the defendants did in fact publish or speak, but rather if the plaintiff’s claims in effect allege that they did. Courts have consistently looked beyond the plaintiff’s chosen labels––whether negligence, product liability, or otherwise––to ask whether the gravamen of the claim truly is the defendant’s handling of third-party speech. No doubt, Plaintiff valiantly attempts to cast her claims similarly (but not too similarly) to those others’ failed pleadings—saying it is not the content of Mr. Omeire’s communications that she seeks to hold Snap, Verizon, and Apple to account for but rather their failures in allowing Mr. Omeire access to Jane. But when, as here, the claim truly turns on decisions about whether the entity was required to monitor, review, and edit content, then a court must recognize that the defendant is being sued as a publisher, regardless of how the pleading is crafted

The court says this case is clearly about holding the defendants liable for content publication decisions:

the myriad claims asserted against the Defendants…all ultimately stem from the same factual premise: that Snap, Verizon, and Apple failed to prevent Mr. Omeire from gaining access to Jane Doe, and therefore owed her a duty to block or restrict his communications. However framed, the claims seek to impose liability for the Defendants’ alleged failure to “monitor, screen, or regulate” third-party interactions on their platforms or devices. But Section 230 forecloses precisely such theories of liability.

Against Snap, the Complaint identifies duties such as verifying user ages and identities, restricting strangers from connecting with minors, and preventing the creation of multiple accounts. But these are editorial functions—the very activities of publication that Section 230 immunizes. Even allegations about Snap Maps and Bitmojis describe features that allegedly facilitated the transmission or disguise of third-party information. Put another way, such an allegation flows from how Snap disseminated or allowed transmittal of Mr. Omeire’s user-generated content.

The same analysis applies to Verizon. Among other allegations that follow a similar thread, Ms. Doe contends that Verizon assumed a “non-delegable duty” through its Smart Family App to “Childproof the Internet.” But liability again turns on Verizon’s purported failure to block, filter, or restrict Mr. Omeire’s communications on Snapchat.

Apple is similarly situated. The Complaint asserts that Apple negligently distributed Snap’s and Verizon’s applications through its App Store, and that Apple misled users by advertising that all apps are reviewed for safety. But the core theory is that Apple should have excluded Snapchat or Verizon’s Smart Family App because they did not adequately screen or block harmful content. Again, that is an allegation about editorial decisions of a publisher or speaker––what third-party applications and communications to host, allow, or restrict.

Third-Party Content.

The court sets out its rule: “A platform that merely hosts or transmits content is protected; a platform that creates or meaningfully shapes the content is not.” I’m not sure what the term “meaningfully shapes” means here. It’s not in the statute or in most precedents, and it invites misunderstanding. Online content isn’t Play-Doh.

Citing Roommates.com, the court says:

Neither Snap, Verizon, nor Apple created tools that required users to enter or act on unlawful information. There is no allegation they engineered or encouraged Mr. Omeire’s harmful content, let alone compelled its creation.

Even the most plaintiff-generous read of Ms. Doe’s charges suggests the platforms here functioned solely as conduits. They provided neutral tools that allowed users to communicate and share information, but they neither authored nor materially shaped the content in dispute….Under the Complaint’s allegations, the content at issue was created entirely by Mr. Omeire. Ms. Doe’s claims rest on the theory that Snap, Verizon, or Apple allowed such material to circulate and make its way to Jane. That theory falls squarely within the editorial and structural functions Congress chose to immunize.

[Insert my standard objections that Section 230 doesn’t apply only to “conduits,” whatever that means, and the term “neutral tools” is an oxymoron.]

The court reinforces that Section 230 applies to all of the multitudinous and diverse claims asserted in this case:

On close inspection—whether styled as negligence, failure to warn, design defect, warranty breach, fraudulent misrepresentation, emotional distress, or the Delaware statutory claims—the quintessence of each claim that Plaintiff attempts to bring under Delaware law is that Defendants allowed Mr. Omeire’s third-party communications to reach Jane and failed to prevent or filter those communications. Courts have consistently held that imposing liability for such conduct treats the service provider as a publisher of user-generated content, a role that Section 230 immunizes.

The attempt here to “plead around” Section 230 by labeling the challenged conduct as product liability, failure to warn, or negligent provision of services must be rejected. The core of each of these claims is the same. At bottom, Ms. Doe seeks to hold Defendants Snap, Verizon, and Apple responsible for the consequences of third-party speech and connections. And Section 230 bars such claims.

* * *

Once again, Section 230 involves heartbreaking tragedies where victims look beyond the obvious wrongdoers to extend liability to defendants who shouldn’t be in the chain of responsibility. To me, it’s relevant that the criminal justice system already successfully processed the predator. The court addresses all of this expressly:

Jane Doe was the innocent child victim of an unspeakable crime. The (no-longer-merely-alleged) perpetrator of those horrific acts has been criminally prosecuted and is, and remains, a defendant in this civil suit. Yet, it is wholly understandable that Ms. Doe would like to take aim at any other person or entity she feels is the least bit responsible for her daughter’s harm. Any parent would.

But the execrable use of certain technology that in some circumstances lies along the path to such atrocities—as occurred in so many of the cases cited herein—cannot be remedied by misdirected claims such as Ms. Doe’s.

State court judges don’t regularly see Section 230 cases, especially complex technology litigation like this one. Because of that unfamiliarity, Section 230’s seemingly counterintuitive outcomes often baffle state court judges. And because the cases involve human tragedy and harmed victims, state court judges are often inclined to bend the rules to benefit plaintiffs. This opinion, in contrast, correctly adhered to the legal boundaries set by Congress despite the sympathetic context. The author of this difficult but thoughtful opinion is Judge Paul R. Wallace.

Case Citation: Joan Doe v. Snap, Inc., 2025 WL 2926161 (Del. Super. Ct. Oct. 15, 2025)