Social Media Addiction Lawsuit Proceeds Against TikTok and Instagram–Nazario v. Bytedance
This case involves the deadly practice of “subway surfing.” The plaintiff claims her son died while subway surfing, an activity he tried only because TikTok and Instagram videos encouraged him to do so. The court largely rejects the social media defendants’ motion to dismiss.
Section 230
The plaintiff claimed that the social media defendants co-created the users’ uploaded videos because they provide various editing tools that uploaders can use to enhance or modify their uploads. The court disagrees:
the social media defendants make features available to users to personalize their content and make it more engaging…. It is the user, not the social media defendants, that selects what features to add to their posts, if any…. the social media defendants did not make any editorial decisions in the subway surfing content; the user, alone, personalizes their own posts. Therefore, the social media defendants have not “materially contributed” to the development of the content such that they may be considered co-creators.
In a footnote, the court distinguishes the Lemmon case because the plaintiff doesn’t allege that “the applications’ add-on features, in and of themselves, inspired Zackery to engage in the dangerous activity.” (In contrast to Snap’s speed filter, which allegedly motivated users to drive at excessive speeds).
The plaintiff nevertheless argued “but the algorithms…” Despite the extensive precedent rejecting this workaround, the plaintiff finds a sympathetic court here:
Plaintiff’s claims, therefore, are not based on the social media defendants’ mere display of popular or user-solicited third-party content, but on their alleged active choice to inundate Zackery with content he did not seek involving dangerous “challenges.” Plaintiff alleges that this content was purposefully fed to Zackery because of his age, as such content is popular with younger audiences and keeps them on the social media defendants’ applications for longer, and not because of any user inputs that indicated he was interested in seeing such content. Thus, based on the allegations in the complaint, which must be accepted as true on a motion to dismiss, it is plausible that the social media defendants’ role exceeded that of neutral assistance in promoting content, and constituted active identification of users who would be most impacted by the content.
The court cites to Wohl and Patterson, with a shoutout to the bonkers Anderson v. TikTok decision.
Sadly, the court has lost the jurisprudential plot here. So long as the content is third-party content, it doesn’t matter whether the service “passively” displayed it or “actively” highlighted it–either choice is an editorial decision fully protected by Section 230. Thus, the court’s purported distinction between “neutral assistance” and “active identification” is a false dichotomy. All content prioritization is, by design, intended to help content reach the audience that is most interested in it. That is the irreducible nature of editorial discretion, and no amount of synonym-substitution masks that fact.
To get around this, the court restyles the argument as being about product design and failure to warn: “plaintiff asserts that the social media defendants should not be permitted to actively target young users of its applications with dangerous ‘challenges’ before the user gives any indication that they are specifically interested in such content and without warning.” As always, I ask: what is the product, and warn about what? If the answer to both questions is “third-party content,” Section 230 should apply.
The court acknowledges that Section 230 might still apply, but it thinks it needs “discovery to illuminate how Zackery was directed to the subway surfing content.” Not only would this discovery invade publishers’ editorial prerogatives, but what information would it uncover that could change the answer? If “the algorithms” are “responsible,” then what? I am reminded of the Roommates.com discussion about death by ten thousand duck bites, and how subjecting defendants to discovery is precisely one of the litigation burdens Section 230 was designed to avoid.
First Amendment
The court also punts on the First Amendment defense until after discovery “into the operation of the algorithms used.”
Prima Facie Elements
The court says the claims for product liability, negligence, and wrongful death survive the motion to dismiss.
Implications
I don’t blog all of the lower court decisions regarding social media addiction because they are so noisy. The law will get less noisy after more appellate courts weigh in.
Still, this case illustrates how the Section 230 precedent is fading, as courts keep chipping away at its edges to reach counterintuitive results in cases that should be clearly covered by Section 230. If plaintiffs can survive motions to dismiss just by picking the right words, then Section 230 loses much of its value. These pleadaround techniques seem to work especially well in state trial courts, which are used to giving plaintiffs the benefit of discovery.
Case Citation: Nazario v. Bytedance Ltd., 2025 N.Y. Misc. LEXIS 5818 (N.Y. Sup. Ct. June 27, 2025)