Colorado’s Mandatory Social Media “Warning Labels” Are Unconstitutional–NetChoice v. Weiser
[I’m so far behind in blogging the challenges to state Internet censorship laws. We’ll see if I can catch some of the ones I missed.]
State censorship laws come in a variety of forms. Today’s post focuses on one of Colorado’s approaches, which requires social media services to display a mandatory warning label to minor users. Disclosure mandates can receive more favorable Constitutional scrutiny if they qualify for Zauderer scrutiny. This particular mandate doesn’t. Instead, the court says the disclosure mandate is amenable to a facial challenge, that strict scrutiny applies, and the law fails strict scrutiny. Next stop: the Tenth Circuit.
The Law’s Requirements
Social media platforms covered by the Act are required to “establish a function” that provides minor users with certain information. The function must satisfy two criteria: First, it must “provide users who are under the age of eighteen with information about their engagement in social media that helps the user understand the impact of social media on the developing brain and the mental and physical health of youth users,” and second, the information must “be supported by data from peer-reviewed scholarly articles or the sources included in the mental health and technology resource bank established” by the Act.
This function must “[d]isplay a pop-up or full-screen notification to a user who attests to being under the age of eighteen when the user: (I) Has spent one cumulative hour on the social media platform during a twenty-four-hour period; or (II) Is on a social media platform between the hours of ten p.m. and six a.m.” The notification must redisplay every 30 minutes (a sure way to irritate users).
Facial Challenge
The court says this disclosure requirement is amenable to a facial challenge because “Section 4 requires every covered social media platform to perform the same task under the Act.” Even though the services are likely to disclose different information, the requirement to speak at all is the same for all of the affected social media platforms: “They must all convey to minor users Colorado’s belief that excessive use of social media may be risky to their health and well-being.”
Scrutiny Level
Zauderer scrutiny doesn’t apply. “Section 4’s compelled disclosures do not constitute commercial speech because they do far more than merely propose a commercial transaction….the disclosures compelled by the Act require social media companies to opine on the impacts of social media use on minors’ mental and physical health.” The court rejects the AG’s “suggestion that NetChoice’s members necessarily engage in commercial speech simply because third-party businesses advertise on their platforms.”
The court summarizes the commercial speech argument:
the primary difference between a social media platform’s curated feed and a newspaper’s editorial page is that the former operates in the electronic sphere, whereas the latter has traditionally operated in the physical. But that immaterial difference does not merit the adoption of a new, overly-capacious definition of commercial speech….
Section 4 targets the impacts of social media use generally—that is, it does not specifically target the commercial speech that allegedly occurs on those websites by third-party advertisers
The court distinguishes the atrocious Fifth Circuit ruling in FSC v. Paxton, which upheld warning disclosures for porn sites:
the landing pages on the pornographic websites are different than the feeds curated by the social media platforms at issue here. The Free Speech Coal. decision did not suggest that the pornographic websites moderated content or otherwise curated a bespoke feed based on the particular traits or viewing history of a given user. By contrast, NetChoice’s members allege—and substantiate via their unrebutted declarations—that they do just that. This difference is critical here because the Supreme Court has made clear that content moderation is expression
The court seems to be saying that algorithmically sorted personalized content gets MORE Constitutional protection than the traditional human editorial curation of one-size-fits-all publications. This can’t be right, but it’s the court’s way of dodging the Fifth Circuit’s terrible work.
Application of Strict Scrutiny
The court says the disclosures are not the least restrictive means available to the state:
instead of imposing the compelled speech requirement, Colorado could have incentivized social media companies to voluntarily provide these disclosures to their minor users, or it could have elected to provide minors with these disclosures itself….Colorado had other options at its disposal for advancing its goal of protecting the health and well-being of its children from the potential adverse effects of social media use.
* * *
NetChoice racks up another win in court, but as usual, it’s unclear if they can preserve the win on appeal.
Because the court enjoins the law, it’s tempting to overlook how bad Colorado’s law is. Don’t. In the name of “protecting kids online,” legislatures keep embracing terrible policy, including this law, and they should be condemned for it.
As the court summarizes, this law is intended to teach children that “excessive use of social media may be risky to their health and well-being.” It’s true that excessive use of anything is potentially risky, but an asymmetrical warning about the risks of social media is a form of miseducation. It doesn’t teach minors how to make smart decisions about their social media use; and worse, it might prompt minors to second-guess and curtail their beneficial social media usage. In other words, disclosing only about risk, without teaching minors how to evaluate cost-benefit, is unhelpful and pernicious.
And then…to push the message to minors every 30 minutes is annoying and likely counterproductive. Unwanted government-compelled disclosures breed reactance, especially among minors, and minors would surely develop blindness to the disclosures (much like the “banner blindness” consumers developed to disregard unwanted banner ads).
To be clear, the government plays a critical role in teaching minors about responsible social media use. However, this teaching objective should be subject to good pedagogical design. Scare-tactics spam is the opposite of that. For additional ideas about how governments can actually help children online, see my Segregate-and-Suppress article.
Case Citation: NetChoice, LLC v. Weiser, 2025 WL 3101019 (D. Colo. Nov. 6, 2025)
