Snapchat Isn’t Liable for Offline Sexual Abuse–V.V. v. Meta

According to the complaint, a 12-year-old girl created a Snapchat account, connected with sexual predators on Snapchat, met them offline, and was sexually abused. She sued Snapchat for her harms. Snapchat successfully defended on Section 230 grounds.

The contested issue is whether the claims treat Snapchat as a publisher/speaker of the third-party content. The plaintiffs argued they were suing over Snapchat’s alleged design defects, including “(1) user recommendation technologies (specifically, design features that facilitated connecting adult sexual predators with vulnerable minors); (2) lack of identify [sic] and age verification; (3) recommendation algorithms that are designed to make children addicted; (4) “other design features” such as push notifications and (5) failure to warn users and/or parents of the addictive design defects in Snapchat.” [A reminder that court-ordered identity and age verification requirements likely violate the First Amendment; the other claims may do so as well.]

Agreeing with the majority opinion in Force v. Facebook, the court says:

the fact that an interactive computer service allegedly created user recommendation technologies and algorithms that operate to connect users together does not change the computer service’s status as a publisher. [cite to LW v. Snap]. Nor do allegations of an application’s lack of identify and age verification remove the “publisher” designation. [Cite to Doe v. Grindr]. Similarly, allegations of failure to warn of an application’s potential danger do not remove the “publisher” status. [Cite to Herrick v. Grindr].

As usual nowadays, the plaintiffs invoked Lemmon v. Snap. The court responds:

The result in Lemmon makes sense, of course, because the plaintiffs there did not attempt to hold the defendant liable for publication of third-party content. Rather, the case rested solely on an alleged defect in the Snapchat application that did not involve statements made by third parties when using Snapchat.

To bolster their Lemmon workaround, the plaintiffs purported to disclaim all liability based on third-party content. Unlike some other courts that have accepted this procedural gambit at face value, this court doesn’t fall for it:

On page forty-nine of their complaint, the plaintiffs allege that they “Expressly Disclaim Any and All Claims Seeking to Hold Defendants Liable as the Publisher or Speaker of Any Content Provided, Posted, or Created by Third Parties.” Despite this attempt to plead around any potential CDA immunity, the court will read the allegations of the complaint as a whole to determine if the plaintiffs are actually alleging that the defendants were acting as a “publisher or speaker.” In any event, such an allegation is a legal conclusion, and it is the court’s role to analyze the legal soundness of the allegations of a complaint, construed in a manner most favorable to the pleader, and not necessarily accept the plaintiffs’ characterization of their own allegations.

Alternatively, the court could have said that the plaintiffs have no claim without relying on third-party content. However, the causation piece was a core issue in the Neville v. Snap case, where the court said that 230 doesn’t apply just because third-party content was a but-for contributor to the harm. The Neville court’s analysis was simplistic and (I think) clearly wrong, but perhaps this judge didn’t want to tangle with it.

Instead, in a footnote, the court says the Neville judge “acknowledged the existence of Force v. Facebook, Inc., but it chose not to follow that case. Indeed, the Neville court quoted the Force concurring and dissenting opinion at length instead of the majority opinion. Therefore, Neville is of little persuasive value to this court.” Indeed, the Neville court repeatedly showed a blatant disregard for precedent, valorizing non-binding precedent over binding in-jurisdiction precedent. In contrast, this opinion is grounded in precedent.

The court distinguishes the Bolger and Erie Insurance cases because the plaintiffs didn’t allege that they bought any products from Snap.

In a footnote, the court distinguishes AM v. Omegle as likely inconsistent with Second Circuit precedent.

The court summarizes:

the allegations of this case fall squarely within the ambit of the immunity afforded to “an interactive computer service” that acts as a “publisher or speaker” of information provided by another “information content provider.” The plaintiffs clearly allege that the defendants failed to regulate content provided by third parties such as Sharp and Rodriguez when they were using the defendants’ service. Therefore, as recently held by a federal district court judge when analyzing similar claims against the dating application Grindr, “as [the plaintiffs’] claims in essence seek to impose liability on Grindr for failing to regulate third-party content, they require that the [c]ourt treat Grindr as a publisher or speaker.” As each of the plaintiffs’ causes of action arise out of this same factual background, the court is compelled to conclude that all of the plaintiffs’ claims are legally insufficient…

As previously noted by the First Circuit Court of Appeals when ruling on a case that raised similar issues: “This is a hard case—hard not in the sense that the legal issues defy resolution, but hard in the sense that the law requires that [the court] … deny relief to plaintiffs whose circumstances evoke outrage. The result [the court] must reach is rooted in positive law. Congress addressed the right to publish the speech of others in the Information Age when it enacted the [CDA] ….” [Cite to Doe v. Backpage]. It is the court’s function to apply the law in its current form. As stated by our Appellate Court when previously ruling on a case involving the CDA, matters such as this one “can make faithful interpretation of statutes difficult. Without further legislative action, however, there is little [this court can] do in [its] limited role but join with other courts and commentators in expressing [its] concern with the statute’s broad scope.” [Cite to Vazquez v. Buhl].

The facts of this case are extremely similar to the (uncited) 15-year-old Doe v. MySpace case, which reached the same outcome: Section 230 applies to offline sexual abuse that follows online messaging. In that respect, the law has been quite consistent for a long time. Thus, as the court notes, sexual abuse cases are tragic, but the lawsuits were brought against the wrong defendants. (Note: the abuser was convicted and jailed.) A well-drafted and no-nonsense opinion from Judge Barbara Bellis, who also had the displeasure of handling one of Alex Jones’ defamation cases.

I still haven’t blogged last year’s In re Social Media Addiction rulings from the California federal and state courts. (I have a partially complete 4k-word draft lurking in my drafts folder, but it will likely never see the light of day.) It’s interesting to see how subsequent courts have repeatedly disagreed with the conclusions in those cases; both cases are on appeal, where I hope their errors will be fixed. For now, the conflicting decisions spur many lawsuits that I think will fail (or already have) and that act as a form of revictimization.

Case citation: V.V. v. Meta Platforms, Inc., 2024 WL 678248 (Conn. Superior Ct. Feb. 16, 2024). Law360 article.