Snapchat Isn’t Liable for a Teacher’s Sexual Predation–Doe v. Snap
A high school teacher allegedly used Snapchat to groom a sophomore student for a sexual relationship. (Atypically, the teacher was female and the victim was male, but the genders are irrelevant to the legal analysis.) Among other defendants, the victim sued Snapchat for negligence. The court treats this as an easy Section 230 case:
ICS Provider. Undisputed, plus Lemmon v. Snap and Grossman v. Rockaway.
Publication of Third-Party Content.
Doe’s negligent undertaking claim alleges that Snapchat has failed to monitor content and messages sent between parties on its platform. Doe’s gross negligence claim alleges that Snap is indifferent to sexual predators’ use of the platform to message minors “with apparent impunity.” Each of these claims seek to “fault[ ] [Snap] for information provided by [a] third part[y]”—messages and photos sent by Guess-Mazock to Doe. Section 230 provides Snap with immunity from these claims.
Regarding the plaintiff’s negligent design theory, which got around Section 230 in Lemmon, the court says (with citations to the Texas Supreme Court ruling in In re Facebook):
The crux of Doe’s negligent design claim, like his negligent undertaking and gross negligent claims, is that Snapchat designed its product with features that allegedly created the opportunity for Guess-Mazock to send illicit messages to Doe. Doe’s negligent design claim similarly aims to hold Snap liable for communications exchanged between Doe and Guess-Mazock. This claim is also barred by Section 230.
Implications
The Lemmon Exception is Narrow. Although plaintiffs are excited about Lemmon v. Snap, this case reiterates that the Section 230 workaround is narrow. If the plaintiff seeks to hold the defendant liable for third-party content–definitely true in this case–a “negligent design” framing is still subject to Section 230, no matter how creatively or emphatically the plaintiff points to the service’s first-party design choices.
1:1 Messaging Services Can’t Be Liable for Users’ Content. In this case, Snapchat was apparently functioning as a private 1:1 messaging service–no different than text messages or emails. This matters a lot for the victim’s claims. First, Snapchat may not have had the legal right to snoop on the conversation under the Electronic Communications Privacy Act (ECPA) and other privacy laws, so Snapchat had no way to tell whether the conversation was about helping with homework or sexual grooming.
Second, if Snapchat can’t monitor the conversation’s substance and intervene when it turns illegal, the only way Snapchat can protect the minor victim in this case is by preventing the minor from having a Snapchat account at all. I know there are many people who support that outcome; indeed, that’s the inevitable effect of California’s proposed Age-Appropriate Design Code.
However, that outcome has several unwanted downsides. It would take away all of the pro-social uses of Snapchat (don’t laugh, there are many), even in circumstances where the minor users never faced a risk of victimization. Also, many teens feel that social media makes their lives better, so categorically booting them from all social media services would be a huge loss to them.
Furthermore, keeping kids off social media would require age, and likely identity, authentication for all users, not just minors. That disadvantages everyone–including adults–by exposing them to privacy and security risks and undermining their ability to speak anonymously or pseudonymously.
While the victim’s story in this case is tragic, the victim’s legal arguments to reconfigure minors’ access to social media would lead to a different, and also profound, tragedy.
Section 230 Was a Fast Lane to the Inevitable Result. Section 230 accelerated the inevitable outcome that Snapchat isn’t liable for the grooming, leading to an efficient, quick, and low-cost resolution. The court didn’t address Snapchat’s other defenses, such as the lack of proximate causation, but I think most private messaging services cannot be liable for tortious conversations when they can’t see the conversation contents. As a result, amending Section 230 to contemplate a case like this likely wouldn’t change the substantive outcome (i.e., the victim would lose a negligence claim either way); it would just mess up Section 230.
Meanwhile, the victim’s case is still ongoing against the teacher, as it should be.
Case Citation: Doe v. Snap, Inc., 2022 WL 2528615 (S.D. Tex. July 7, 2022). The complaint.
UPDATE: In November, the court dismissed the victim’s case against the school district: “the law is…clear that it takes considerably more than a teacher’s own predatory conduct to find the school district or school board that employed the teacher liable.” Doe v. Snap, Inc., 2022 WL 16635370 (S.D. Tex. Nov. 2, 2022).
UPDATE 2: Doe’s teacher, Bonnie Guess-Mazock, pleaded guilty to sexual assault. See Plea Acceptance, Texas v. Guess-Mazock, No. 22-05-06072 (359th Dist. Ct., Montgomery County, Tex. May 12, 2022).