Section 230 Protects TikTok for “Blackout Challenge” Death, Despite the Algorithms–Anderson v. TikTok
A tragic story: a 10-year-old girl saw the Blackout Challenge on TikTok, tried it herself, and died. The mom sued TikTok, bringing design defect and failure-to-warn claims under strict products liability and negligence theories.
The mom claimed she sought to “hold Defendants directly liable for their own acts and omissions as designers, manufacturers, and sellers of a defective product.” The court responds that, under Section 230, it must determine whether the claims treat TikTok as a publisher/speaker of third-party content–which, of course, is exactly what this lawsuit is trying to do.
To get around Section 230, the mom called out TikTok’s algorithms. She:
alleges that TikTok and its algorithm “recommend inappropriate, dangerous, and deadly videos to users”; are designed “to addict users and manipulate them into participating in dangerous and deadly challenges”; are “not equipped, programmed with, or developed with the necessary safeguards required to prevent circulation of dangerous and deadly videos”; and “[f]ail[] to warn users of the risks associated with dangerous and deadly videos and challenges.”
In effect, then, the mom is trying to hold TikTok liable for defective publication.
The court responds simply that TikTok’s algorithms are “not content in and of themselves,” citing Dyroff, Force v. Facebook, and Obado v. Magedson.
To further get around this, the mom cited Doe v. Internet Brands and Lemmon v. Snap. The court responds: “the duty Anderson invokes directly implicates the manner in which Defendants have chosen to publish third-party content. Anderson’s claims thus are plainly barred by Section 230 immunity.” The court continues (emphasis added):
Anderson insists that she is not attacking Defendants’ actions as publishers because her claims do not require Defendants to remove or alter the content generated by third parties. Publishing involves more than just these two actions, however. As I have discussed, it also involves decisions related to the monitoring, screening, arrangement, promotion, and distribution of that content—actions that Anderson’s claims all implicate. [cites to Force and Herrick v. Grindr]
From a legal standpoint, this inquiry into what it means to “publish” content is quite straightforward. Publishers do more than merely “host” users’ content for other users to discover on their own. As the court correctly notes, “promotion” and “distribution” of user content are quintessential publisher functions. This is exactly the question on appeal to the Supreme Court in Gonzalez v. Google, so the Supreme Court’s ruling will likely be the final word on this topic. We’ll soon find out if that decision will end the UGC ecosystem.
This court concludes:
because Anderson’s design defect and failure to warn claims are “inextricably linked” to the manner in which Defendants choose to publish third-party user content, Section 230 immunity applies….Nylah Anderson’s death was caused by her attempt to take up the “Blackout Challenge.” Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.
Trust me, Congress WILL take this up in 2023. A Republican-led House will be a steady source of poorly conceived messaging bills regarding “protecting” kids and punishing “Big Tech.” Plus, the Age-Appropriate Design Code, which also purports to protect kids online, will finish off the Internet if Congress doesn’t. In the interim, I’m hoping, without much optimism, that the Supreme Court will similarly view this issue as “something properly taken up with Congress, not the courts.” This instantiation of the Supreme Court believes in deferring to Congress, except when it doesn’t.
Finally, your perennial reminder: even if the mom had overcome Section 230 in this ruling, the case quite likely would have failed on other grounds (failure to establish the prima facie elements, the First Amendment, etc.). Blaming Section 230 exclusively for this lawsuit’s dismissal is probably wishful thinking.
Case citation: Anderson v. TikTok, Inc., 2022 WL 14742788 (E.D. Pa. Oct. 25, 2022)