Section 230 Still Works in the Fourth Circuit (For Now)–M.P. v. Meta
I’m going to classify this ruling as a “big deal,” with the crucial caveat that Section 230 is still doomed and this ruling doesn’t reverse that. Given how judges have turned against Section 230, at this point any appellate ruling that applies Section 230 without narrowing its scope is important. This case stands out for two additional reasons.
First, historically the Fourth Circuit has been a defense-side Section 230 bastion due to the classic Zeran ruling from 1997. But, in 2022, the Fourth Circuit seemed to turn its back on Zeran in its Henderson v. Public Data ruling. This week’s ruling is more Zeran than Henderson, suggesting that Henderson might be a blip or edge case for the circuit.
Second, the Third Circuit’s Anderson v. TikTok ruling said that because content moderation of third-party content was constitutionally protected expression per Moody, all moderated content becomes first-party content for Section 230 purposes. This twisted “logic” functionally repeals Section 230 in that circuit.
The Fourth Circuit ruling essentially ignores the Anderson ruling (the dissent mentions it twice) and issues a ruling flatly inconsistent with it. The Fourth Circuit says that algorithmic prioritization of content is a publisher function that qualifies for Section 230, in contrast with the Third Circuit’s position that algorithmic prioritization disqualified TikTok for Section 230. Plaintiffs will still opt into the Third Circuit, so the Fourth Circuit’s sanity-check won’t save many defendants. However, it’s another indicator that circuits outside the Third are likely to disagree with the Anderson ruling, virtually ensuring the conflict will reach the Supreme Court. (In a mild surprise, TikTok abandoned its Supreme Court appeal of the Anderson case, so that reconciliation will have to wait a bit longer.)
A single circuit’s ruling can’t fix all of the problems with the Section 230 jurisprudence. But given how rarely I’ll blog good news about Section 230 going forward, consider this opinion a temporary rally in the long-term decline. 📉
* * *
I previously summarized this case:
This case involves the murderous attack by Dylann Roof against the Emanuel AME Church in Charleston, S.C., killing nine African-Americans. A victim’s daughter sued Facebook, alleging that Facebook’s “design and architecture” radicalized Roof, and that should disqualify Facebook for Section 230.
The district court granted Facebook’s motion to dismiss on Section 230 grounds. The Fourth Circuit, in a divided opinion, affirms.
Section 230
The plaintiff argued that the plaintiff’s common law claims (strict products liability, negligence, and negligent infliction of emotional distress) didn’t treat Facebook as the publisher of third-party content. The majority says (emphasis added):
M.P. seeks to hold Facebook liable for disseminating “improper content” on its website. Crucially, M.P. cannot show that Facebook’s algorithm was designed in a manner that was unreasonably dangerous for viewers’ use without also demonstrating that the algorithm prioritizes the dissemination of one type of content over another. Indeed, without directing third-party content to users, Facebook would have little, if any, substantive content. Simply stated, M.P. takes issue with the fact that Facebook allows racist, harmful content to appear on its platform and directs that content to likely receptive users to maximize Facebook’s profits.
Check out that bolded language. When plaintiffs complain about “the algorithms,” the court says they are necessarily complaining about publication decisions because “the algorithms” editorially prioritize some content over other content. Once the plaintiff makes that concession, Section 230 should apply.
The majority continues discussing how algorithmic content prioritization = publishing for Section 230 purposes (emphasis added):
acts of arranging and sorting content are integral to the function of publishing. [cite to Force v. Facebook] For instance, newspaper editors choose what articles merit inclusion on their front page and what opinion pieces to place opposite the editorial page. These decisions, like Facebook’s decision to recommend certain third-party content to specific users, have as a goal increasing consumer engagement. But a newspaper company does not cease to be a publisher simply because it prioritizes engagement in sorting its content. And the fact that Facebook uses an algorithm to achieve the same result of engagement does not change the underlying nature of the act that it is performing. Decisions about whether and how to display certain information provided by third parties are traditional editorial functions of publishers, notwithstanding the various methods they use in performing that task
Again, this is an obvious point that the 230 opponents keep trying to obscure. Internet services may structure their editorial operations differently than traditional publishers, but they are performing the same editorial functions by gathering, organizing, and disseminating content. Quietly, the majority completely rejects a “but the algorithms” exceptionalism to defining the functions of publishers.
The majority says that if 230 is a policy problem, the “question whether, and to what extent, Section 230 should be modified is a question for Congress, not for judges.” We’re all waiting nervously to see when Congress will do that. The devastating consequences of Congressional review are why I have Section 230 on the extinction watchlist.
Judge Rushing, a TAFS[FN] judge, partially dissented on Section 230. She would not dismiss the negligence claim against Facebook “because recommending a group, person, or event is Facebook’s own speech, not that of a third party.” The majority rejects her argument in a footnote, saying that the plaintiff never claimed to be holding Facebook liable for recommending a group, person, or event, so “the dissent ultimately argues a case that M.P. does not make.”
[FN: TAFS = Trump-Appointed Federalist Society judge]
Lack of Causation
The majority says the plaintiff should lose, regardless of Section 230, because Facebook wasn’t the proximate cause of the harms. Once again, changing Section 230 wouldn’t change the outcome of this case.
[Note: the district court didn’t address causation, but the majority talks about it anyway because the parties briefed it. In other words, the lack of causation is so obvious that the appellate court felt comfortable resolving it in the first instance.]
The court says:
M.P. has not plausibly alleged that Facebook was the proximate cause of her injuries. Her specific allegations involving Roof’s use of Facebook are that (1) he viewed extremist content on Facebook; (2) he “joined extremist groups on Facebook;” and (3) shortly before June 2015, he changed his Facebook profile picture to one that included white supremacist symbols. Notably, M.P. does not allege how much time Roof spent on Facebook or how he became radicalized on the platform. Nor does M.P. provide any factual foundation causally linking Roof’s Facebook use to his crimes of murder. In short, M.P. does not offer a plausible argument, or otherwise point to supporting allegations, that Roof’s horrific acts were a natural and probable consequence of his Facebook use
Implications
The Material Support For Terrorists Cases. This case fits within the broader genre of plaintiffs suing social media for their alleged “material support for terrorists.” However, the gunman in this murder was apparently a lone-operator domestic terrorist radicalized by white supremacy, with no connection to or inspiration from a foreign terrorist organization like most other cases in the genre.
Oddly, the court largely ignores this genre. For example, in 2023, the Supreme Court issued two opinions on social media liability for supporting terrorists, the Gonzalez and Taamneh cases. The Fourth Circuit doesn’t cite either.
The court also doesn’t engage with the similar lawsuits related to the Buffalo mass-murder, another act of domestic terrorism by a radicalized white supremacist. The New York state court, in a poorly crafted and misguided opinion, denied Facebook’s Section 230 motion to dismiss. I can’t explain the differing results, except to say that this majority was right and the Buffalo judge was wrong.
TAFS Judges Will Gut 230 Eventually. Judge Rushing would have excluded the negligence claim from Section 230 because she sees a difference between Facebook recommending third-party content (which she says is publishing third-party speech) and Facebook recommending a third-party group, person, or event (which she says is publishing first-party speech, i.e., the recommendation itself). Putting aside the majority’s point about the factual predicate problem with Judge Rushing’s argument, Judge Rushing’s distinction makes no sense. With respect to Section 230, if Facebook recommends a group/person/event, that’s all third-party “stuff” to Facebook; and the recommendation leads Facebook users to consume third-party content from these third parties, so it’s third-party content all the way down.
In general, TAFS judges vigorously hate Section 230. (Most likely they are reflecting Trump’s longstanding anti-230 vitriol, but I think it’s more complicated than that). Judge Rushing’s opinion, though unsuccessful this time, shows the dangers to Section 230 of having TAFS judges in the appellate pool. They eventually will combine with each other or other techlashed/redpilled judges to blow a massive hole in Section 230’s scope. It’s a numbers game. The more Section 230 cases that reach appellate courts, and the more new TAFS judges are appointed in Trump 2.0, the greater the odds that TAFS judges will end Section 230 (if Congress doesn’t do it first).
Deep Cut for 230 Nerds. I roared with laughter when, in a footnote, the majority says: “We treat the terms ‘information’ and ‘content’ as synonymous in this opinion.” This semantic equation makes sense from a dictionary standpoint, but it poses a major dilemma for statutory interpretation of 230.
Section 230 uses the defined term “information content provider.” If the terms “information” and “content” are synonyms, then the majority is saying that the statutory term means “content content provider” or “information information provider.” 🤣 This is nonsensical, but no more so than the opposite interpretation of the statutory defined term, which implies that there could be “non-information content providers” or “information non-content providers.”
This semantic mess traces back to the AOL days. Someone there coined the “information content provider” phrase in the early 1990s for reasons now lost to history, and they probably have no idea how their odd word choice has vexed us for decades.
Case Citation: M.P. v. Meta Platforms, Inc., 2025 WL 377750 (4th Cir. Feb. 4, 2025)