Texas Judge Enjoins App Store Authentication Law–CCIA and SEAT v. Paxton

This blog post involves two near-identical cases challenging the Texas App Store Accountability Act, Senate Bill 2420 (“SB 2420” or “the Act”), Tex. Bus. & Com. Code § 121.001 et seq. SB 2420 is a robust multi-pronged segregate-and-suppress law that “imposes age verification, parental verification, parental consent, and compelled speech on app stores and app developers.” The district court preliminarily enjoined the law–which is probably only a temporary reprieve until the Fifth Circuit gets these cases and continues its relentless celebration of censorship.

[My coverage quotes and cites the SEAT ruling except where noted.]

* * *

The law establishes a four-tier app rating system (under 13, 13-15, 16-17, 18+) and requires every app to self-rate and explain the rating. App stores must age-authenticate every user before they can download any apps at all. Under-18s are allowed to use or purchase apps only with parental consent, informed by the app developers’ self-rating and description. To enable parental consent, app stores must authenticate parental status. If an app “materially changes,” all prior consents are revoked and the app developer must start all over.

The court says this is a content-based regulation triggering strict scrutiny. First, the Act exempts some favored nonprofits (e.g., “Plaintiffs would not face a barrier accessing an app from the College Board but would be unable to access an app from a newspaper”). Second, “SB 2420 specifically sought to shield minors from certain speech the State deems objectionable or harmful.” Paxton argued that the law regulated only commercial speech, at least in part, but the court responded that “the Act covers a wide variety of speech, including consuming news, social media, and entertainment.”

Unsurprisingly, the law doesn’t survive strict scrutiny:

it is far from clear that Texas has a compelling interest in preventing minors’ access to every single category of speech restricted by SB 2420. State interests in protecting minors exist; for example, the State has a compelling interest in preventing minors from accessing information that facilitates child pornography or sexual abuse. On the other hand, nothing suggests Texas’s interest in preventing minors from accessing a wide variety of apps that foster protected speech (such as the Associated Press, the Wall Street Journal, Substack, or Sports Illustrated) is compelling. While SB 2420 may have some compelling applications, the categories of speech it restricts are so exceedingly overbroad that Paxton likely cannot show a compelling state interest.

Paxton argued that the state can engage in censorship to address public health issues. The court sees a means-fit problem:

Paxton does not cite evidence to substantiate the assertion that downloading an app of any kind without parental permission poses a health hazard to minors. That argument gestures toward Texas’s interest in preventing social media addiction, but SB 2420’s coverage sweeps far wider—all apps are restricted, beyond social media, as described above. So too, SB 2420 does not limit its scope to apps that use addictive algorithms designed to encourage prolonged use, or apps that are responsible in particular for causing excessive screen time. As one example, SB 2420 restricts access to apps that seek to promote physical or mental health, such as mindfulness apps like Calm, fitness apps like Strava, or therapy providers like BetterHelp. Along those lines, Plaintiffs cite evidence that apps benefit young people and that minors’ access to app-based content is not linked to mental-health problems. Plaintiffs also attest that they use apps to advance their studies and civic engagement, as described above.

Later, the judge reinforces his view that a less restrictive alternative would have been to “narrowly target regulations toward apps that the State demonstrates have specific addictive qualities.” To be clear, I think regulations against “addictive algorithms” or “addictive apps” are also unconstitutional censorship. We’ll find out soon enough. More generally, permitting censors to justify censorship broadly on “public health” grounds would take society down a clearly dangerous and unsustainable path, as authoritarian regimes routinely demonstrate.

The court points out other means-fit problems:

  • “SB 2420 specifically cuts teenagers off from wide swaths of the critical ‘democratic forum[] of the Internet’ even though the same content offered via apps remains available to minors via pre-downloaded apps like Safari (or in stores)”
  • “Texas also prohibits minors from participating in the democratic exchange of views online by curtailing their access to all apps.”
  • “nothing in the record suggests, for instance, that teens suffer from mental health disorders due to using an app for their debate team preparation (as Plaintiff M.F. does), reading the news on CNBC (as Plaintiff Z.B. does) or the New York Times, or accessing e-books via Kindle, even though those apps are age-restricted and subject to parental override under the Act in the same way as social media.”
    • In the CCIA opinion, the court notes the same illogic with respect to dictionary and weather apps.

Thus,

Even accepting that Texas could pass legislation to counter harmful effects of social media use on minors’ mental or physical health, Paxton has not demonstrated that age-screening everyone in Texas and banning minors from accessing app store content without individualized parental consent is the least restrictive method to eliminate that harm

The court adds that if the standard were intermediate scrutiny, the law would still fail because “Texas has not offered any evidence connecting the Act’s goals to its methods.”

The court also says the “material change” triggers are unconstitutionally vague. “Paxton could seek to hold an app developer liable for violating this provision, for example, for each new song added to Apple Music or, as a less draconian example, when a new category of content is made available (e.g., when Spotify added podcasts alongside music).” This would support selective enforcement and over-censorship.

In the CCIA opinion, the court further notes that:

The Act holds app developers and app stores liable for knowingly misrepresenting an age rating but fails to provide meaningful guidance to developers and stores about what metrics should be used to determine the age rating of an app. SB 2420 is silent about what content is appropriate or inappropriate for each age category. Developers and stores could follow existing nongovernmental standards, yet following those standards may not be sufficient and could lead them to violate § 121.021(b). The portions of the Act that hold app developers and stores liable for getting age ratings wrong are impermissibly vague

Perhaps surprisingly, the court doesn’t discuss any problems with the parental authentication mandate, even though no one has a clue how to implement that requirement accurately and at scale.

Finally, the court says that a facial challenge is permitted here per Moody, because “only in the vast minority of applications would SB 2420 have a constitutional application to unprotected speech not addressed by other laws.”

* * *

The court summarizes: “The Act is akin to a law that would require every bookstore to verify the age of every customer at the door and, for minors, require parental consent before the child or teen could enter and again when they try to purchase a book.” A law like that sounds horrifying to me, but many people across the globe favor more heavily gated information ecosystems. That’s why the forces of censorship will win, and the Internet as we currently know it will not survive. 📉

Case Citations:

Students Engaged in Advancing Texas v. Paxton, 2025 WL 3731733 (W.D. Tex. Dec. 23, 2025)

Computer & Communications Industry Association v. Paxton, 1:25-cv-01660-RP (W.D. Tex. Dec. 23, 2025)

* * *

Blog Posts on Segregate-and-Suppress Obligations