Court Enjoins Texas’ Attempt to Censor Social Media, and the Opinion Is a Major Development in Internet Law–NetChoice v. Paxton

Earlier this year, the Texas legislature enacted HB 20, a blatant attempt to censor social media services. The Texas law emulated a similar Florida censorship law, which a Florida district court enjoined in June based on the First Amendment and Section 230. A Texas federal court has now similarly enjoined the Texas law in a decisive opinion relying solely on the First Amendment. The opinion emphatically slices through the FUD generated by pro-censorial forces questioning whether social media services exercise editorial discretion. That’s never been a close question, and this judge shreds the arguments.

Both this decision and the Florida decision are major developments in Internet Law. I think this opinion is even more significant than Florida’s because:

  • it is broader and more persuasive than the Florida opinion (to be fair, the Florida judge was extremely time-stressed), and its First Amendment analysis is clear and easy to follow.
  • the Texas law was more refined than the Florida law and had fewer sharp edges (for example, it didn’t have Florida’s stupid exception for theme park owners), so it was supposed to be more resistant to constitutional challenges.
  • this opinion expressly analyzes, and rejects, mandatory transparency and digital due process obligations. Given the broad-based regulatory energy being invested throughout the country to impose transparency and digital due process requirements, this is a major precedent that will pose a quandary for those regulators.
  • this opinion does not rely on Section 230 (though Section 230 would have been appropriate in this case), which reduces the risk of folks freaking out about 230.

Nevertheless, this ruling isn’t the final word. We will anxiously await the Fifth Circuit’s views on this case.

The Opinion

Social Media Services Aren’t “Common Carriers”

One of the most significant pro-censorship fads of the past 18 months has been claiming that social media services are “common carriers.” This judge clearly and emphatically rejects that claim.

The judge rightly starts by citing Halleck (a Supreme Court precedent that the pro-censorship crowd often ignores or sidesteps), saying “Social media platforms have a First Amendment right to moderate content disseminated on their platforms.” The court continues:

The Supreme Court’s holdings in Tornillo, Hurley, and PG&E, stand for the general proposition that private companies that use editorial judgment to choose whether to publish content—and, if they do publish content, use editorial judgment to choose what they want to publish—cannot be compelled by the government to publish other content.

The court then flatly rejects the “common carrier” appellation: “This Court starts from the premise that social media platforms are not common carriers.” [cite to USTA v. FCC] The court explains (emphasis added):

Unlike broadband providers and telephone companies, social media platforms “are not engaged in indiscriminate, neutral transmission of any and all users’ speech.” User-generated content on social media platforms is screened and sometimes moderated or curated. The State balks that the screening is done by an algorithm, not a person, but whatever the method, social media platforms are not mere conduits.

I love how the court decisively rejects the algorithmic exceptionalism argument. It’s never mattered whether editorial discretion is exercised by humans or machines. Either way, it’s still editorial discretion.

Because Texas rested its arguments on the common carrier theory, the court says it could have stopped with the determination that social media services aren’t common carriers. Nevertheless, it continues so it can “make a determination about whether social media platforms exercise editorial discretion or occupy a purgatory between common carrier and editor.” (Purgatory is exactly where I feel like I’ve been living for these past 18 months.) The court explains (emphasis added):

Social media platforms “routinely manage . . . content, allowing most, banning some, arranging content in ways intended to make it more useful or desirable for users, sometimes adding their own content.” [cite to NetChoice v. Moody] Making those decisions entails some level of editorial discretion, even if portions of those tasks are carried out by software code. While this Court acknowledges that a social media platform’s editorial discretion does not fit neatly with our 20th Century vision of a newspaper editor hand-selecting an article to publish, focusing on whether a human or AI makes those decisions is a distraction. It is indeed new and exciting—or frightening, depending on who you ask—that algorithms do some of the work that a newspaper publisher previously did, but the core question is still whether a private company exercises editorial discretion over the dissemination of content, not the exact process used….This Court is convinced that social media platforms, or at least those covered by HB 20, curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform’s content….Without editorial discretion, social media platforms could not skew their platforms ideologically, as the State accuses them of doing.

That bolded language…whoa! As I (and many others) have said thousands of times, Internet services’ house rules differ from service to service because their audiences have different needs, and the services are appealing to different audiences. The court unreservedly acknowledges this point, recognizing that the adoption and implementation of house rules are editorial decisions.

The Level of First Amendment Scrutiny

Having established that social media services exercise editorial discretion, the court can readily conclude that strict scrutiny applies because, after all, the law by design sought to impose censorship. The court says “HB 20 prohibits social media platforms from moderating content based on ‘viewpoint,’” and it notes some associated problems:

  • “HB 20 compels social media platforms to significantly alter and distort their products”
  • “HB 20 also impermissibly burdens social media platforms’ own speech”
  • “the threat of lawsuits for violating Section 7 of HB 20 chills the social media platforms’ speech rights”

The court summarizes: “HB 20’s prohibitions on ‘censorship’ and constraints on how social media platforms disseminate content violate the First Amendment.”

In a footnote, the court distinguishes Pruneyard because “the shopping mall did not engage in expression and ‘the [mall] owner did not even allege that he objected to the content of the [speech]; nor was the access right content based.'” (Please stop treating real estate cases as analogous to online publication decisions 🙏). The court says Rumsfeld v. FAIR “has no bearing on this Court’s holding because it did not involve government restrictions on editorial functions.”

The court then turns to the law’s mandatory transparency requirements and other digital due process obligations and sets a major new precedent by saying they too trigger strict scrutiny (emphasis added):

  • “Section 2’s disclosure and operational provisions are inordinately burdensome given the unfathomably large numbers of posts on these sites and apps [and by] ‘forc[ing] elements of civil society to speak when they otherwise would have refrained.’” [cite to Washington Post v. McManus].
  • “The consequences of noncompliance also chill the social media platforms’ speech and application of their content moderation policies and user agreements.”

In addition to the law’s viewpoint discrimination, the court says the law engages in speaker- and content-based discrimination. Citing the size-based distinction in the law (and the many censorial public statements by the bill sponsor and others in government), the court says “the Legislature intended to target large social media platforms perceived as being biased against conservative views and the State’s disagreement with the social media platforms’ editorial discretion over their platforms. The evidence thus suggests that the State discriminated between social media platforms (or speakers) for reasons that do not stand up to scrutiny.” This is a reminder that size-based statutory distinctions are a double-edged sword. Without such a distinction, a digital due process requirement can impose disproportionately large costs/obligations on small publishers that inhibit speech; with such distinctions, the classifications of who qualifies for harsher regulatory treatment may not be justifiable (not that the Texas legislature really tried to justify why it set its arbitrary cutoffs where it did). For more on size-based distinctions in Internet regulation, see this article.

Some of the statutory phrases were also impermissibly vague:

  • The requirement that social media services give “equal access or visibility to” content. “A social media platform is not a static snapshot in time like a hard copy newspaper. It strikes the Court as nearly impossible for a social media platform—that has at least 50 million users—to determine whether any single piece of content has ‘equal access or visibility’ versus another piece of content given the huge numbers of users and content” (emphasis added). This is a gut check on all statutory requirements of “consistent” content moderation.
  • In the law’s application to any service that “consists primarily of news, sports, entertainment, or other information or content that is not user generated but is preselected by the provider,” the court says the word “primarily” is vague.
  • The law authorizes the AG to enforce “potential violations” of the law. The court responds: “Subjecting social media platforms to suit for potential violations, without a qualification, reaches almost all content moderation decisions platforms might make, further chilling their First Amendment rights.”

However, some phrases are not vague. In particular, with respect to the transparency obligations, “these provisions may suffer from infirmities, [but] the Court cannot at this time find them unconstitutionally vague on their face.”

Intermediate Scrutiny

Although the court concludes that strict scrutiny applies, it says the law would not survive intermediate scrutiny either:

  • “the State provides no convincing support for recognizing a governmental interest in the free and unobstructed use of common carriers’ information conduits” [cite to Miami Herald v. Tornillo, and the court recognizes that the state’s argument is a word salad too]
  • “The State’s second interest—preventing ‘discrimination’ by social media platforms—has been rejected by the Supreme Court. Even given a state’s general interest in anti-discrimination laws, ‘forbidding acts of discrimination’ is ‘a decidedly fatal objective’ for the First Amendment’s ‘free speech commands.'” [cite to Hurley]
  • In a footnote, the court dismisses the applicability of the confusing Turner cable precedent because it ultimately sought to preserve the integrity of the broadcasting scheme. The court says: “The analysis applied to the regulation of broadcast television has no bearing on the analysis of Internet First Amendment protections.” [cite to Reno v. ACLU, another major Supreme Court precedent that the pro-censorship forces often ignore]
  • “HB 20 is not narrowly tailored. Sections 2 and 7 contain broad provisions with far-reaching, serious consequences…the [NetChoice v. Moody] court colorfully described [the Florida social media censorship law] as ‘an instance of burning the house to roast a pig.’ This Court could not do better in describing HB 20.” [As I said in my NetChoice v. Moody post, the “meme of porcine incineration and residential arson…runs throughout First Amendment jurisprudence.”]

If the law can’t survive intermediate scrutiny, then the state must find a way to qualify the law for rational basis scrutiny. There’s really no way it can. As a result, the court’s rejection on intermediate scrutiny grounds is another major precedent.

Other Issues

The law’s lengthy severability provisions (seriously, they ran for several pages) don’t help the state:

Both sections are replete with constitutional defects, including unconstitutional content- and speaker-based infringement on editorial discretion and onerously burdensome disclosure and operational requirements. Like the Florida statute, “[t]here is nothing that could be severed and survive.”

On the remaining elements of a preliminary injunction, the court adds (emphasis added): “HB 20 prohibits virtually all content moderation, the very tool that social media platforms employ to make their platforms safe, useful, and enjoyable for users….In addition, social media platforms would lose users and advertisers, resulting in irreparable injury….content moderation and curation will benefit users and the public by reducing harmful content and providing a safe, useful service.” The bolded language is crucial because it shows the judge understood the policy stakes of this case. House rules are the sine qua non of social media. Efforts to strip away house rules essentially eliminate the services’ raison d’être. For more on the policy considerations, see this article.

Implications

This is a major ruling. Even though it will eventually be supplanted by the Fifth Circuit’s decision, for now it casts a huge shadow over many of the pro-censorial regulatory efforts being pursued around the country. Reading this opinion, I felt a little like I did reading the initial district court opinion in ACLU v. Reno enjoining the CDA over 25 years ago. That too was a quickly issued but important ruling protecting the Internet, where a contrary result could have permanently harmed the Internet’s development. So too with this opinion.

The court explicitly grounded its decision only in the First Amendment and specifically sidestepped the Section 230 and dormant Commerce Clause challenges. The court’s First Amendment-only ruling has several beneficial effects:

  • It doesn’t give the appellate court any room to do something stupid about Section 230, such as question Section 230’s constitutionality or find new exceptions to Section 230.
  • It preserves the court’s option to strike down the law on other grounds if the appellate court reverses it.
  • It takes the wind out of the sails of critics who falsely claim that Section 230, and not the First Amendment, is the reason why state legislatures can’t regulate the crap out of social media. As many of us have been saying for a long time, Section 230 reform may not change the substantive outcomes with respect to laws like this because the First Amendment would provide a more cumbersome route to the same substantive result. Instead, the reform would just mess up Section 230 for other use cases.

When I blogged the Florida law, I invoked the meme of a barking dog that catches the car as an analogy for what happens when a state legislature passes a #MAGA messaging bill that is unmitigated censorship. When that happens, courts should easily strike it down. While we should celebrate the judicial branch as the strongest bastion of liberty left in our country, we should never stop condemning legislators who waste their time and their taxpayers’ money passing an obviously unconstitutional law. Worse, even if the law had survived, it would represent the worst kind of policy because it would accomplish none of its purported goals. As Jess Miers and I explain here, the law would actually be counterproductive to every goal it claims to advance. Texas residents, you deserve better governance.

Although this ruling strikes down much of HB 20’s payload, the plaintiffs didn’t challenge HB 20’s must-carry-spam provision–the worst possible policy that a legislature could pursue. As a result, AG Paxton can now enforce that provision at any time. I wonder if this decisive ruling will dissuade him from doing so. It’s possible the pro-spam provision will become one of those statutes that’s technically on the books but everyone ignores. I still think someone should preemptively challenge it rather than leave it hanging.

For more on this opinion, see the Techdirt coverage.

This is an amazingly well-crafted and savvy opinion produced on a very speedy timeline. For his remarkable combination of speed and accuracy, I’m awarding the Technology & Marketing Law Blog Judge-of-the-Day award to Judge Robert L. Pitman.

Case citation: NetChoice, Inc. v. Paxton, 2021 WL 5755120 (W.D. Tex. Dec. 1, 2021)

Case library (see also NetChoice’s library and the Court Listener page):