Catching Up on the Challenge to Texas’ Social Media Censorship Law–NetChoice v. Paxton

I’m continuing coverage of the legal challenge to Texas’ social media censorship law, now on appeal to the Fifth Circuit. I recently rounded up Texas’ opening brief and its supporting amici briefs. In this post, I’ll round up the plaintiffs’ appellee brief, the supporting amici briefs, and Texas’ reply brief. As usual, I’ve included a case library at the end of this post.

NetChoice/CCIA Appellee Brief

Some quotes that stood out:

The law “is a content-, viewpoint-, and speaker-based law that would eviscerate editorial discretion, impermissibly compel and chill speech, and impose onerous disclosures on select disfavored ‘social-media platforms.'”

The state’s “hosting” theory “would give government complete power over what and how various entities disseminate speech—like bookstores, book publishers, essay-compilation editors, theaters, art galleries, community bulletin-boards, and comedy clubs….’Big Tech’ is not a label that allows the State to conscript private companies into its preferred editorial program.”

Services’ “editorial choices are expressive, reflect platforms’ values, and convey a message about the platforms and the communities they hope to foster….Platforms also prioritize, arrange, and recommend content according to what content users would like to see, how users would like to see it, and what content reflects accurate information (rather than disinformation)….when platforms have failed to remove harmful content, their users and advertisers have sought to hold platforms accountable—including through boycotts.”

“HB20 creates constitutional problems when it infringes any platforms’ editorial discretion. The First Amendment’s protections do not dissipate once a platform goes from 49.9- to 50.0-million-monthly U.S. users—nor at any of the other arbitrary thresholds the Legislature considered.” For more on size distinctions, see this article.

“it is constitutionally irrelevant at what point in time platforms exercise editorial discretion. Removal of content ex post is just as much an editorial choice as refusing to publish content in the first place. And government cannot compel continued publication any more than it can compel initial dissemination. The choice of when and how to exercise editorial discretion is itself protected by the First Amendment.”

A point I’ve noted many times: “Defendant’s unsupported ‘hosting’ theory is akin to defending censorship as regulating the ‘conduct’ of writing. But whatever ‘conduct’ platforms engage in is intertwined with editorial discretion and the expression embodied in such editorial choices.”

Amici Briefs in Support of NetChoice/CCIA

[Note: many of these amici also filed in the NetChoice v. Moody 11th Circuit appeal. The changes: CDT joined the RCFP brief. IA is defunct. The First Amendment law professors did not file.]

[In my prior post, I noted the heavy gender skew of the counsel for Texas’ amici–21 counsel on the briefs, 2 of them women. For comparison, I did a similar analysis of the counsel for NetChoice/CCIA’s amici. Coincidentally, 21 counsel were listed on these briefs. By my count, 6 were women.]

Cato Institute

“Property rights are thus collateral damage in Texas’s effort to retaliate against Silicon Valley’s allegedly left-wing bias….By adopting the tactics of communications collectivists, Texas contravenes historically held conservative political values of limited government, constitutional fidelity, and strong property rights.” [Yes, #MAGA ended the Reagan Republican era].

Chamber of Progress et al

“H.B. 20 forces Providers to make a choice: either forego any viewpoint-based moderation in a content category or remove the entire category. Texas’s blithe acceptance of whole subjects of speech disappearing from social media suggests it has little true interest in preserving ‘an uninhibited marketplace of ideas.’ If Providers elect not to eliminate content categories, H.B. 20 will force Providers to disseminate viewpoints Texas favors, despite Providers’ belief such speech will harm their communities and their businesses.”

Chris Cox

“Congress enacted §230 for the express purpose of overturning a state court ruling that required a platform to be a mere conduit to avoid liability for user posts….§230 does not protect platforms only when they act as mere conduits….Being a mere conduit means not moderating content. It is the ‘anything goes’ model that §230 was designed to discourage….Platforms do exercise editorial discretion when they moderate content, and it is for that very reason that §230 does protect them.”

Copia Institute

Techdirt “could not disclose its moderation policy because its moderation system is primarily community-driven and subject to the community’s whims and values of the moment.”

EFF et al

The law’s “user-focused provisions, such as requiring annual notice to users on the use of algorithms, are part and parcel of the law’s restrictions on editorial discretion. They are likewise motivated by misconceptions about disproportionate partisan censorship and thus raise the specter of selective punitive enforcement.”

IP Justice

“If user-generated content becomes subject to HB 20, there will likely be a limitation of the number of creators that are sustainable over time and thus, a silencing of voices and an elimination of content.”

Knight First Amendment Institute

On the plus side, this brief is clearer and more supportive of NetChoice/CCIA than the pro-censorship mess they filed in the NetChoice v. Moody case. In particular, they dropped their misguided efforts to distinguish traditional newspapers from social media. I spent a chunk of my paper on editorial transparency challenging the arguments in that brief.

On the minus side, the brief argues that mandatory editorial transparency obligations can be constitutional under the Zauderer case, which greenlighted laws requiring disclosure of attorneys’ contingency fees. It’s an odd precedent to embrace: it’s a compelled speech disclosure case, but the compelled speech doctrine is filled with contrary decisions too. Zauderer also involves commercial speech, which isn’t analogous to the Texas law unless you take the very censorial position that every snippet of information disclosed by a company is automatically commercial speech.

To bypass the highly relevant Herbert v. Lando precedent, they argue “HB20’s disclosure requirements cannot plausibly be characterized as a ‘casual inquiry’ into social media platforms’ editorial judgment,” but that’s obviously wrong. The compelled disclosures are just-in-case, without any evidence of wrongdoing by the disclosers. The brief doubles down: “The mere fact that a compelled disclosure implicates editorial judgment does not render it unconstitutional. Rather, the question is whether it imposes an undue burden on speech.” As my paper explains, I believe any investigation into, or enforcement against, editorial judgments necessarily imposes that impermissible speech burden. The RCFP brief does a much better job on this point.

RCFP et al

“The government’s decision to displace an editor’s point of view in favor of its own—even a notionally neutral one—is always viewpoint based.”


“Social media websites—even large ones—are nothing like common carriers. Common carriage is about (1) carriage, i.e., pure transportation or transmission, (2) of uniform things, i.e., people, commodities, or parcels of private information, (3) in a manner that is common, i.e., indiscriminate…Social media, meanwhile, depart from them in all pertinent respects. Social media are (1) a diverse array of data-processing products (microblogs, videochats, photo streams, and so on), (2) typically shared as a public-facing expressive activity, (3) that are offered subject to the condition of a user’s compliance with extensive terms of service.”

Texas’ Reply Brief

This brief added nothing to the discourse, and I wish I could get back the time I spent reading it. A few lowlights:

“The Platforms do not exercise ‘editorial discretion’ over almost all content they host because they exercise no discretion ex ante over it.” First, many services manually prescreen content before it’s posted. I can’t give a precise list because I don’t know the set of services that the law regulates. Second, automated filtering before posting absolutely is an ex ante exercise of discretion. The state simply disregards automated prescreening because the services haven’t provided their algorithms in discovery (huh?).

It still blows my mind how much energy has been spent arguing over the badly fractured Turner case when Reno v. ACLU came after it and flatly said that there’s no basis for qualifying the level of First Amendment scrutiny applied to the Internet. So, like, Turner is 100% irrelevant to an Internet case…?

Per the state, “What [the services] cannot do is viewpoint discriminate, such as by treating [Ukraine] war boosters differently than war skeptics.” In other words, the state expressly concedes that the services must either stop all discussion of the Ukraine invasion or treat Russian government propaganda as credible. #MAGA rarely hides its Putin-envy but WOW.

Case library (see also NetChoice’s library and the Court Listener page):