Florida and Its Amici Try to Justify Government Censorship in the 11th Circuit–NetChoice v. Moody

Earlier this year, Florida enacted a wide-ranging, complex, poorly drafted, and enthusiastically censorial law, SB7072. Among other problems, the law dictates how “social media platforms” can make their editorial decisions. Fortunately, a Florida federal judge blocked Florida’s social media censorship law as unconstitutional. As expected, Florida has appealed the ruling to the 11th Circuit. This blog post covers the appellate brief and some supporting amicus briefs.

Before reading this post, read my new article, Online Account Terminations/Content Removals and the Benefits of Internet Services Enforcing Their House Rules. It explains why must-carry laws, including part of the Florida law, would accelerate the end of user-generated content. This is why I get so angry about anyone supporting these laws. Not only are they actively campaigning for unconstitutional government censorship, but for reasons detailed in the article, the laws would decrease, not increase, “free speech” online. Don’t let the legal sophistry distract from the law’s counterproductive and terrible policy consequences.

The Florida Appellate Brief

Some of the state’s concessions:

  • the state conclusorily disagrees with the district court’s conclusion that the law applies to entities that don’t resemble social media, but it doesn’t press the issue. Who qualifies as a “social media platform” is crucial because no one is sure which entities the law regulates, but the law reaches far beyond social media archetypes like Facebook and Twitter. When the law’s supporters highlight examples involving Facebook and Twitter, they are implicitly trying to obscure the law’s expansive reach to dissimilar entities.
  • the state concedes the injunction against “post prioritization” provisions.
  • the state admits that the law, as applied, could conflict with Section 230.

More generally, the brief ignores many censorial corners of the law to tell a cleaner, but false, narrative about what the law says and does. I imagine the appellate court could treat those deceptive characterizations as admissions that the ignored details are, in fact, properly enjoined.

I’ll now deconstruct some of the most dubious parts of the state’s brief:

  • “Large media companies with the power to influence public debate are nothing new. But what makes social media platforms different is their ability to shape public discourse not by promoting their own messages but by silencing voices they deem to be harmful…. the bottleneck position that comes with control of these platforms brings with it a remarkable power to censor”

By virtue of making publication decisions, every publisher necessarily “silences” the voices of people it chooses not to publish. This is not unique to social media.

Also, I remind you: every time the state refers to publishers’ exercise of editorial discretion as “censorship,” it’s lying. Instead, the law is a government effort to restrict the editorial discretion of private publishers. The Florida law doesn’t ban censorship; it IS censorship. This is just another example of the dirty trick of accusing your opponents of doing what you are actually doing.

  • “social media platforms arbitrarily discriminate against disfavored speakers, including speakers in Florida. The record is replete with unrebutted examples of platforms suppressing user content for arbitrary reasons”

Unpublished authors routinely disagree with how publishers exercised editorial discretion over their content. That doesn’t mean the editorial discretion was exercised arbitrarily or discriminatorily (and it would be protected by the First Amendment even if it were).

  • “the consistency provision and the notice requirements do not regulate content moderation at all”

The consistency provision penalizes editorial decisions. How is that not regulation of content moderation?

  • “Deplatforming thus restricts who can access a platform; it does not necessarily “restrict access to or availability of material” on the platform”

“Deplatforming” restricts the material available on the service because deplatformed users can’t publish content on the service.

  • “when platforms engage in viewpoint- rather than content-based censorship, the censored speech is not “otherwise objectionable” within the meaning of (c)(2)”

Is restricting anti-vax content a content- or viewpoint-based restriction?

  • “newspapers are unlike social media platforms making decisions about which users to deplatform, censor, or shadow ban. Newspapers (1) do not publicly disclaim responsibility for the articles they publish, (2) are highly selective about the material they include, (3) have limited space due to the physical product they produce, and (4) curate articles to create a unified speech product that conveys a coherent message or offers perspectives on one or more overarching themes.”

Great, let’s talk about media exceptionalism. These four points of distinction don’t establish it. (1) Per the express terms of Section 230, Internet services still face “responsibility” for many types of third-party content, including federal criminal prosecutions, IP, ECPA, and FOSTA. Also, offline publishers routinely disclaim “responsibility” for third-party content, such as when they say that opinion pieces don’t represent their views. (2) Internet services are also highly selective about the content on their services. For example, Facebook has hundreds of content moderation rules. (3) Why does this matter? (4) This is supposed to be a magic phrase in constitutional parlance, but what does it even mean? Virtually every Internet service curates third-party content topically or thematically, such as a message board’s topical threading.

  • “the hosting regulations here do not meaningfully “interfere[] with any message” the platforms would otherwise communicate”

Well, services want to send a message that they don’t tolerate garbage content. The Florida law would thwart that. More importantly, the law’s definition of “censor” includes “post[ing] an addendum to any content or material posted by a user”…so, um, yeah, the law absolutely interferes with messages the services want to deliver.

As you can see, in this part of the brief, the state only tries to defend the law in one very specific context (“hosting”). Presumably, this silently concedes that other contexts are censorial. Furthermore, few services want to act solely as web hosts, with no curation added, and they won’t provide uncurated hosting services for free.

  • “Unlike traditional newspapers, social media platforms, because of the technology they use, have in essence an unlimited ability to respond with their own speech to counter any hosted user speech with which they disagree”

Internet services’ rebuttal powers are broader than newspapers’??? Newspaper authors can’t rebut without the newspaper’s permission.

  • “a reasonable user of a typical social media platform would not identify the views expressed on the platform as those of the platform itself”

As my Online Account Termination paper explains, garbage content gets a halo of credibility from being hosted on trusted services. But the services can’t retain consumer trust over time if they are forced to host garbage content.

  • “The Act places modest limits on social media platforms’ hosting function”

So, it’s OK to engage in censorship, so long as the censorship is “modest”?

  • “The social media platforms covered by the Act’s hosting regulations have significant market power within their domains, and they hold themselves out to the public when trafficking in important public goods—namely, the speech of others.”

Citations, please. The state should enumerate all of the entities regulated by the law with details about their supposed market power. If there are dozens of regulated entities, could they all have market power???

  • the consistency requirement “is a constitutional regulation of conduct”

Editorial discretion–a right protected by the Constitution–is the freedom to decide what to publish or not. A consistency requirement is the antithesis of editorial discretion because it, in fact, limits editorial discretion. Also, “consistent content moderation” is an oxymoron. It simply can’t be done. So imposing penalties for inconsistent content moderation guarantees that the ordinary exercise of editorial discretion would be punished. (More on the conduct/speech dichotomy below.)

  • “By posting addenda, platforms effectively distort or obstruct the journalist’s message, and if platforms could post endless addenda, they could black out the message altogether in a wall of contrary speech”

A “wall of contrary speech” sure sounds like “free speech” in the marketplace of ideas. And where the state says “the only thing they cannot do is drown out a journalistic enterprise’s own speech,” that mischaracterizes the law.

  • “The notice requirements in the Act operate much like the notice requirements in Zauderer. For one, like in Zauderer, the Act targets commercial speech—specifically, the speech that occurs when a platform solicits users to sign up for or keep using the platform”

First, the law simultaneously mandates the disclosure and regulates its contents. That’s a far cry from the Zauderer situation. Second, this argument treats everything that the Internet service publicly says about its operations as “commercial speech,” which has never been the law. Is every public statement by a traditional print publisher about its editorial standards also “commercial speech”?

Reminder: the commercial speech doctrine just lowers the constitutional review standard; it doesn’t eliminate it. The brief doesn’t actually make the case that the disclosure obligations survive intermediate scrutiny.

Note: there are 5 state lawyers and 5 outside counsel on Florida’s brief. Good use of taxpayer money.

Note 2: the brief favorably cites the following scholars: Adam Candeub, Genevieve Lakier, Tim Wu, Eugene Volokh, and Benjamin Zipursky. I personally would be horrified if Florida thought any of my arguments supported its position.

Amicus Briefs

10 Red State AGs. [Brnovich (AZ), Cameron (KY), Fitch (MS), Knudsen (MT), Marshall (AL), Paxton (TX), Rutledge (AR), Schmitt (MO), Taylor (AK), Wilson (SC)]

The brief claims Florida’s law only regulates conduct, not speech. If that were true, it would give states essentially unlimited opportunities to tell Internet publishers how to run their editorial operations, as many of the represented states are interested in doing. Fortunately, it’s not even close to true. This would be like saying the law doesn’t regulate newspapers’ speech, it just regulates the conduct of publishing speech. It’s intellectually disingenuous and really damages the credibility of anyone advancing it. Voters in the represented states, you should be livid that your state’s top lawyer took this position.

Babylon Bee. I assume this brief was meant as satire, like the Babylon Bee itself. Some quotes illustrating why I think that:

  • “social media platforms systematically target conservative users and messages for censorship, selectively invoking vague policies against “hate” and “misinformation” to stunt the free flow of information and silence conservative voices.”
  • “Twitter uses any arguable open texture in the term [“hate”] as cover for an open season against conservatives.”
  • the Florida law “protects social media platforms, too, from improper government pressure to censor….If and when government officials demand that platforms stifle debate on a particular issue, or that they censor messages the officials disfavor, the platforms can refuse and cite their obligations under Florida’s law….Florida’s law provides a needed prophylactic defense for a core First Amendment protection.”

On the last point, the brief seems to be saying that Florida’s censorship of Internet services protects those Internet services from government censorship? I sure hope that’s satire, because otherwise that’s Orwellian AF.

Institute for Free Speech. The brief is largely a love letter to the Florida law, but it claims that the law goes too far when it restricts fact-checks and gives special status to political candidates/journalistic enterprises/Walt Disney. Those provisions are indeed egregious. The rest of the law is too.

Case library (see also NetChoice’s library)