Rounding Up the Supreme Court Briefs in NetChoice v. Paxton, the Challenge to Texas HB20’s Social Media Censorship Law

A quick recap: last summer, Texas passed HB20, a #MAGA messaging bill that poses existential threats to the Internet as we know it. NetChoice and CCIA led a lawsuit seeking to enjoin most of it. In December, the federal district court granted the requested injunction based on the First Amendment. Texas appealed to the Fifth Circuit. After oral arguments in the case, the panel summarily lifted the injunction in a one-sentence order that did not explain its reasons for changing the status quo or provide a window for phasing-in or appealing. NetChoice and CCIA appealed the one-sentence order to the Supreme Court’s shadow docket. The case was automatically routed to Justice Alito, who requested that Texas respond. Eight amicus briefs came in support of NetChoice/CCIA’s appeal, including mine. Two amicus briefs came in support of Texas. NetChoice/CCIA filed a reply. Now everyone is waiting.

This post will round up some highlights (or, with respect to Texas and its supporters, some lowlights) from the filings with the Supreme Court. As usual, a case library with links to all of the documents is at the bottom of the post.

[Procedural note: making filings to the Supreme Court on a rush basis isn’t easy. A lot of people burned the midnight oil to get their filings in, so we should pour one out for every person listed on the brief captions and the team behind them. The associated angst and personal sacrifices are largely due to the Fifth Circuit’s condemnable decision to disturb the status quo without explaining itself. This is a symptom of major dysfunction in the Fifth Circuit, which has a growing reputation for doing strange things (see also last week’s ruling striking down the SEC’s administrative law function).]

NetChoice/CCIA Emergency Application

“it will be impossible for these websites to comply with HB20’s key provisions without irreversibly transforming their worldwide online platforms to disseminate harmful, offensive, extremist, and disturbing content—all of which would tarnish their reputations for offering appropriate content and cause users and advertisers to leave….the cost of revamping the websites’ operations would undo years of work and billions of dollars spent on developing some platforms’ current systems. Even if platforms could revamp their entire communities, they would lose substantial revenue from boycotts by advertisers who do not want their ads to appear next to vile, objectionable expression.”

“much like a newspaper must decide what stories deserve the front page, how long stories should be, what stories should be next to other stories, and what advertisements should be next to what stories, social media platforms engage in the same kinds of editorial and curatorial judgments both for individual users and the platforms as a whole”

“the Fifth Circuit panel majority has in effect issued something akin to a nationwide (or even worldwide) injunction that disrupts the First Amendment rights of Applicants everywhere that the Internet exists—and without a word of reasoning”

Amicus Briefs in Support of NetChoice/CCIA

Cato. “HB20 may make using dominant social media platforms so distasteful they become virtually unusable for most of the public.”

CDT et al. “Allowing HB20 to go into effect will upend the status quo—under which platforms have long engaged in often extensive content moderation—and will harm users, and the public interest, in three ways. First, platforms will have to end or alter their viewpoint-based content moderation practices. Second, the risk of litigation posed by HB20 will discourage some platforms from engaging in any content moderation, even under ostensibly viewpoint-neutral policies. Third, other platforms may begin to remove even more speech in an effort to appear evenhanded and more consistent in the enforcement of their content policies.”

“Allowing HB20 to go into effect will also make it legally riskier for platforms to apply even ostensibly viewpoint-neutral content moderation policies. Because mistakes are inevitable when platforms engage in content moderation at scale, many users whose content is moderated will be able to point to inconsistencies in how a platform has moderated similar or even identical content. These users may use evidence of inconsistency to claim that the moderation decision in their case was viewpoint-based. And, even if the platform’s moderation action is not obviously viewpoint-based, a user could still claim it is based on the expression of his or her off-platform views.”

Chamber of Progress et al. Some interesting co-signers to this brief, including ADL and NAACP.

“Platforms could not moderate pro-Nazi speech—that is, unless they also moderated all content pertaining to political ideologies. They could not moderate speech denying the Holocaust—at least not without banning all content remembering or educating about the Holocaust. They could not remove speech glorifying terrorist attacks against the United States—unless they also remove speech decrying, memorializing, or educating about terrorist attacks against the United States. Platforms could not combat misinformation from foreign governments, like misleading propaganda regarding Uyghurs in China or other issues of public concern, unless they likewise banned truthful information provided by the U.S. State Department. They could not moderate speech promoting eating disorders, self-cutting, and suicide—that is, without also barring content providing resources for those suffering from eating disorders or contemplating self-harm.”

Chris Cox. “Congress enacted §230 for the express purpose of overturning a state court ruling that required platforms to be mere conduits to avoid liability for user posts.” [This has been said many times, yet apparently it can’t be repeated often enough.]

“Section 230 does not classify platforms as “publishers” or “not publishers.” It simply states that a platform will not be deemed a publisher in certain circumstances. Texas mistakenly insists that a platform must be classified for all purposes as either a publisher or a conduit. But §230 does not require classification of a platform as either one or the other for good reason. Most platforms share some features in common with traditional publishers. The two forms of media differ, however, in that platforms host millions or even billions of pieces of content each day that become available online in real time. Almost all platforms perform content moderation – they are not mere conduits – but their content moderation efforts cannot approach those of a newspaper whose editors can read and understand all of its contents before they are published. Section 230 is premised on this multi-faceted reality. The statute provides that a platform will be treated as a publisher when it is involved in the creation or development of particular content, but it will not be treated as a publisher otherwise.” [Again, this is super-obvious to anyone who isn’t reading Section 230 in bad faith].

“because content moderation is a form of editorial speech, the First Amendment fully protects it well beyond the specific safeguards enumerated in §230. Properly understood, §230 complements the First Amendment.” Yes.

“the protection of content moderation is the very raison d’etre of §230”

Copia Institute. “even if any platform moderation were to be driven by bias, the existence of expressive bias is not something for regulation to correct; it is something for regulation to protect. Bias is evidence of expressive freedom, that we could be at liberty to have preferences, which we can then express. This law targets that freedom by denying platform operators the ability to express those preferences.”

Goldman. I explained why HB20’s mandatory transparency requirements are unconstitutional. See this blog post for more detail and this article for even more detail.

RCFP et al. “Texas’s invitation to ask whether covered platforms are ‘like newspapers’ was always misguided. The Tornillo rule has been extended “well beyond the newspaper context” because it asks whether the government has seized control of an aspect of the speech process (deciding what to publish) rather than whether the regulation burdens a favored class (the press).”

“Texas maintained below that these [transparency] intrusions are less objectionable than the government’s direct exercise of editorial control, as if forcing the Miami Herald to disclose why it rejected Pat Tornillo’s submissions would have been a defensible compromise. Not so….government mandates requiring transparency raise their own distinctive First Amendment concerns—especially when, as here, they complement a viewpoint discriminatory scheme.”

TechFreedom. “this brief includes some truly distasteful, vile material (including some references to actual social media posts that were taken down)—because that is what the law ultimately enables.” As one example, “Social media platforms would be unable to take even minimal steps to help protect American children or teenagers from content promoting suicide and self-harm.”

“Under HB20, social media platforms would be prohibited from considering “authoritativeness” in making these recommendations, as any finding that a viewpoint is less authoritative would have the effect of de-boosting it….HB20 would require private companies to help others poison the minds of the populace with false, even if well-produced content.”

Texas’ Opposition

Many lowlights:

  • The brief refers to the social media platforms as “modern public squares” per Packingham, yet later it calls them common carriers. Uh, pick one?
  • The brief repeats a lie it made to the Fifth Circuit when it says: “applicants represented below that only Facebook, YouTube, and Twitter are likely affected by the Texas law at issue here.” This false claim deceives the justices about the law’s impact, and I provided the receipts to prove it.
  • The social media platforms assert “a First Amendment right to refuse service to their customers based on the viewpoints those customers profess. This Court has never recognized such a right, and it should not do so now to vacate a stay.” Huh? We’re not talking about serving breakfast specials at a diner; we’re talking about publishing content, and of course the Supreme Court has repeatedly endorsed the freedom of publishers to “refuse service” to those authors whose “viewpoints” they don’t wish to publish.
  • Overall, the brief claims to defend what it calls the “hosting” provision (which “prohibits the platforms from censoring a customer based on his viewpoint or location in Texas”) and the transparency provisions. This is a definitional sleight-of-hand because it masks exactly which parts of the law the state is actually defending and which parts it is abandoning. But the Fifth Circuit lifted the injunction for the whole law, not just the parts the state is actually defending. Thus, the Fifth Circuit has unleashed provisions that even the state has tacitly admitted are unconstitutional. That cannot be the right result.
  • The state again argues HB20 regulates conduct, not speech. A garbage argument.
  • “The platforms are the twenty-first century descendants of telegraph and telephone companies: that is, traditional common carriers.” The resemblance between Western Union and Facebook is uncanny. Like twins.
  • “The platforms have now partnered with federal officials to exclude or censor certain customers these officials deem undesirable.” Did you know that Biden secretly owns as much Facebook stock as Zuckerberg? Read my many blog posts on jawboning to see how other courts are assessing this joint action argument.
  • The “platforms can also ban foreign government speech without violating HB 20.” How is banning some foreign governments’ speech not discrimination against those governments compared to the propaganda of governments that remain unbanned?
  • “They likewise can ban spam.” This is deceptive. An unchallenged part of HB20 only permits the deployment of spam filters if the services provide notice and appeals to anyone filtered–an obviously impossible logistical requirement. I don’t believe email services are following that rule, and I don’t believe the state has enforced it either. However, omitting any reference to that provision misleads the court.
  • “platforms can ban content that incites violence, so the platforms are not required to host ‘ISIS propaganda claiming that extremism is warranted.'” Later, the state gives two other examples of putative incitement: “videos of an Egyptian cleric who had been ‘banned from the US over extremism’ and a preacher whose messages ‘were said to have inspired the murder of a politician.'” These examples obviously are not “incitement” under the constitutional definition of the term, which requires that the incited violence be *imminent*, a requirement none of these examples satisfies. It’s deeply troubling that the Texas AG’s office, which is responsible for enforcing the “incitement” crime, admitted to the Supreme Court that it doesn’t understand the term.
  • The so-called “operational” requirements “essentially require the platforms to maintain a customer-service department for processing complaints and reviewing user appeals.” Name any other publisher that has been legally compelled to provide a “customer service department” for editorial decisions. I’ll wait.
  • “Whatever harms the platforms suffer from interim compliance with HB 20, by contrast, are generally financial and comparatively less significant.” An interesting way to describe the consequences of government-imposed censorship.
  • The last 22 pages of the state’s filing are the so-called “expert” report from Adam Candeub that was filed with the district court. This is an extremely irregular attachment, and the state does nothing to explain why it’s attached. The district court simply ignored Candeub’s report, for good reason. It’s a legal brief, not an expert report, and it would have been more proper as an amicus brief, not as putative “evidence.”

Amicus Briefs in Support of Texas

Florida and Other #MAGA States. More lowlights:

  • “If the First Amendment prohibited mandatory hosting rules, then telephone companies could refuse to connect calls based on the viewpoint of the caller; internet providers could shut off access to people they dislike; and delivery drivers could decline to deliver content with which they disagree.” IAPs aren’t utilities and can turn off service if they want.
  • “HB 20 is a limited hosting rule.” What would a less limited hosting rule look like?
  • “social media platforms remain free and perfectly able to speak with their own voice on any issue both on their own platforms and outside them.” Well, they can say whatever they want, except that they cannot choose what content is fit for their audience. ¯\_(ツ)_/¯
  • “the government is not compelling speech; it is compelling hosting.” Again, using the murky term “hosting” to mask what’s being required.
  • “communication about what speech the platforms will allow is commercial speech because it (1) proposes the terms of a commercial transaction with the user, (2) refers to the specific product the platform offers, and (3) arises from the platform’s economic motivation to enlist or retain a user.” Based on this perspective, is there anything a company says publicly that isn’t commercial speech?
  • “if requiring a biannual transparency report and the like offends the First Amendment, then the SEC’s voluminous annual reporting, the FTC’s merger disclosure rules, or even many tax filing obligations might likewise be constitutionally suspect.” False.
  • They cite the Santa Clara Principles as evidence that services support giving appellate processes to users. A reminder of what the Santa Clara Principles say: “The Santa Clara Principles is not a template for regulation. The principles were not created for that purpose and should not be used as such. States should not transform the Santa Clara Principles directly into legal mandates.”
  • The complaints-and-appeals process “is an economic regulation that demands businesses be responsive to their users.” Try that argument out on traditional publishers and see how well it fits.

Hamburger et al. This brief is quite similar to Hamburger’s brief to the Fifth Circuit, so see my critiques of that brief.

NetChoice/CCIA Reply Brief

“The appellate process cannot function with integrity if panels can thwart this Court’s involvement by refusing to explain their dramatic departure from settled First Amendment law and the ordinary procedures Nken prescribes.” <== This, 1,000x.

Conclusion

As I’ve said before, the fate of the Internet hangs in the balance of the legal challenges to the Florida and Texas laws. If we want a future Internet that includes user-generated content–and I think everyone does, even the supporters of the law–we MUST defeat these laws, emphatically, and ensure that state legislatures don’t dabble in this genre again. Everyone expected these legal battles to end up at the Supreme Court eventually, but few expected to get there so soon. Unfortunately, the Supreme Court shadow docket is a poor way to resolve these complex issues, so I’m not optimistic we’ll get the essential immediate relief from the Texas law–in which case more high-stakes drama over the Internet’s future will surely follow. One more round of condemnation for the Fifth Circuit, which put us into this turmoil by upending the status quo without providing an explanation.

Case library (see also NetChoice’s library, the Court Listener page, and the Supreme Court page):