Muslim Loses Case Against Facebook Over Discriminatory Content Moderation–Elansari v. Meta
Elansari is Muslim. This is not his first time as a plaintiff. In this lawsuit, he claims that Facebook blocks pro-Palestinian publishers and favors pro-Israeli publishers. Thus, he argues, Jewish readers are more likely to get the information they want from Facebook than Muslim readers are.
Title II Discrimination. “Facebook is not a public accommodation under Title II….Facebook is not a physical facility; it is a website. Plaintiff does not access any physical space when he uses Facebook. Lastly, Plaintiff was not denied access to or services of Facebook….Meta is a private company and a user of Facebook cannot dictate what will be published on its website.” [As to the latter point, the Fifth Circuit apparently feels otherwise.]
Section 1981. “while he may not like the content published, Facebook’s decision as a private company on what to publish on its website is not specifically targeted at Plaintiff. In short, these allegations do not show that Defendant engaged in a discriminatory refusal to allow Plaintiff to use Facebook based on race.” The court might have relied on the First Amendment here but instead only discussed the statutory elements.
- ICS Provider: “websites and social media are considered interactive computer service providers.” Cites to Green v. AOL, Sikhs for Justice v. Facebook, Shulman v. Facebook, Marfione v. KAI.
- Publisher/Speaker Claims. “Plaintiff alleges Defendant removed Palestinian Muslim news sources from its platform. Withdrawing content falls squarely in ‘a publisher’s traditional editorial functions.'”
- Third-Party Content. “Defendant removed news published by Palestinian and Muslim news sources. Defendant itself had no role in creating or developing the information provided through these news sources.”
This ruling is yet another example of Section 230’s application to discrimination claims. See, e.g., NFD v. Harvard (“The CDA exempts certain laws from its reach. Federal and state antidiscrimination statutes are not exempted.”). Divino and Newman also came to mind.
The top-line holding of this decision should surprise no one: unrelated third parties can’t impose liability on UGC publishers for their editorial decisions to reject or remove someone else’s content. Those decisions are protected by the First Amendment and Section 230, and courts reach that conclusion even without Section 230 (e.g., Rutenberg v. Twitter).
This ruling brought two additional cases to mind, neither cited in the opinion.
First, this case is similar to the Noah v. AOL case from two decades ago. In that case, the plaintiff claimed that AOL was discriminating against Muslims by doing anemic content moderation in some Muslim-themed chatrooms. Like this case, that case failed both because AOL wasn’t a public accommodation and because of Section 230. I have questioned whether I should keep the Noah opinion in my casebook given its age (and the fact that most of my current students were born after AOL’s heyday), but this ruling reinforces that its issues may be timeless.
Second, this case reminded me of Klayman v. Zuckerberg, where the plaintiff alleged that Facebook reacted too slowly in removing anti-Israeli content. So in that case, the plaintiff thought Facebook was biased against pro-Israeli content, and in this case, Elansari thinks Facebook is biased against pro-Palestinian content. These seemingly incongruous perceptions of bias are overwhelmingly common. The reality is that users routinely feel that Internet services are biased against their team, whatever team that is, and this constant perception of bias makes content moderation an unwinnable “game” because it leaves everyone questioning whether their team got fair treatment. And yet, many people think the solution is to use the law to mandate “better” content moderation. They won’t get that outcome under any circumstance, and pursuing it normalizes government control over speech the government doesn’t like.
Case citation: Amro Elansari v. Meta, Inc., 2022 U.S. Dist. LEXIS 178399 (E.D. Pa. Sept. 30, 2022)