Court Enjoins Another Arkansas Segregate-and-Suppress Law–NetChoice v. Griffin
[Note: I have other NetChoice rulings and segregate-and-suppress opinions stuck in my blog queue. I hope to cover them eventually. I’m fast-tracking this one because it rejects some noxious yet popular forms of Internet suppression.
Also, check out this line from the opinion: “Arkansas cannot sentence speech on the internet to death by a thousand cuts.” To be fair, most legislators would choose to sentence Internet speech to death in one swift, decisive blow if they could.]
* * *
This case involves NetChoice’s constitutional challenge to Arkansas Act 900 of 2025, one of many Internet censorship laws coming out of Arkansas. I previously blogged about the injunctions against Act 689, the so-called Social Media Safety Act, and Act 901. Does the Arkansas legislature do anything other than pass unconstitutional Internet censorship laws?
Act 900 tries to revive Act 689 by amending it after it was enjoined. Not surprisingly, the amendment doesn’t go well. The court preliminarily enjoins Act 900 too.
Who does the law “protect”?
“Act 900 has one particularly noteworthy problem: ‘users.’ Act 900 has three different definitions for relationships a person can have with a platform….The addictive practices provision and the default provisions therefore apply to all Arkansas minors, whether they have a social media account or are merely a website visitor. Worse, the dashboard provision applies only to minor ‘users,’ not account holders.”
This definitional problem is probably the result of a botched amendment, but it’s no less embarrassing. I guess you can’t make a censorship scramble without breaking a few eggs? 🥚
Addictive Practices
The addictive practices provision of Act 900 requires platforms to “ensure” that they “do[ ] not engage in practices to evoke any addiction or compulsive behaviors in an Arkansas user who is a minor, including without limitation through notifications, recommended content, artificial sense of accomplishment, or engagement with online bots that appear human.”
The judge says this restriction is void for vagueness. Two problems with this language compared to Act 901: (1) “Act 900 is not limited to addiction to the platform itself.” (2) “Act 900 imposes liability on a strict liability basis, while Act 901 imposes liability on a negligence basis….a platform is liable for a practice that evokes addiction in a single child even if it could not have known through the exercise of reasonable care that the practice would have such an effect.”
The anti-addiction provision was coupled with quarterly audit requirements that services double-check they aren’t doing anything addictive. “This requirement is even more expansive with respect to what platforms must audit for—not just full-fledged “addiction,” but “addiction-driven behavior” caused by the platform—again, whether that behavior is on- or off-platform.”
Default Provisions
Social media platforms must also “[e]nsure that, by default:” (1) “[n]otifications to an Arkansas user who is a minor, other than safety or privacy-related alerts, are ceased between the hours of 10:00 p.m. central standard time (CST) and 6:00 a.m. central standard time (CST) and allow a parent or guardian to modify this setting”; and (2) “[p]rivacy and safety settings for an Arkansas user who is a minor on a covered social media platform provides the most protective level of control for privacy and safety offered by the covered social media platform.” The Court assumes that the content-based exception to the notifications default for “safety or privacy-related alerts” is severable, so strict scrutiny does not apply to the remainder.
The court sees the notification provisions as time-place-manner restrictions. “The State has a significant interest in ensuring minors get enough sleep, and this interest is unrelated to the suppression of free expression.”
The court nevertheless gets stuck on the lack of tailoring. “The notifications default applies to “Arkansas users”— account holders and platform visitors alike. It seems to the Court that platforms would therefore have to silence notifications between 10 p.m. and 6:00 a.m. for everyone in Arkansas unless they have become an age-verified adult account holder.”
The court also questions the parent override because “parents are part of the problem. If parents wanted to prevent their children’s sleep from being disrupted by late-night notifications, they have a readily available, free, no-tech solution already at their disposal: taking devices away at night.” This leads to a zinger:
The State has provided no evidence that parents lack the tools to assert their authority in this domain, so it appears unlikely that the State’s deferential approach to restricting nighttime notifications will actually serve its stated interest in ensuring minors get enough sleep. This “is not how one addresses a serious social problem.”
Thus, the notifications restriction “burdens platforms’ speech by silencing them for a third of the day without any indication that the burden will reduce nighttime social media use or otherwise serve the State’s asserted interest at all.”
[I don’t love the court’s methodology here. The main problem isn’t the ineffectiveness of the notification time restrictions; it’s that notifications are an integral part of the services’ editorial expression, i.e., how they communicate with their audiences. Meanwhile, states should be able to empower parents to make choices for their children, even if many parents choose not to restrict their children. The true problems lie elsewhere. For example, in my Segregate-and-Suppress paper, I identify functionally unsolvable problems with giving parents the right to decide how their children use the Internet.]
As for the heightened privacy settings requirement, the court says it’s not a time-place-manner restriction. Instead, the required settings “all restrict platforms’ ability to disseminate minors’ speech and to disseminate speech to minors and therefore implicate the First Amendment.” The court says the provision has to satisfy intermediate scrutiny.
The court says “the privacy default says nothing about who can change these settings, leaving the Court to conclude that, because the Act imposes a mere “default,” anyone—parent or child—can opt for less restrictive settings.” As a result, it’s not a parental control mechanism. The children’s agency over their own settings makes the provision “wildly underinclusive.” (Again, giving children agency may be a better approach than the alternative).
The court also has problems with the provision’s sweep:
Act 900 has a broad definition of “social media platform” that sweeps in websites like Nextdoor and Pinterest which are unlikely to be the site of sexual exploitation, burdening minors’ ability to speak and be spoken to on those platforms and burdening platforms’ ability to disseminate minors’ speech.
Thus, the court concludes:
the law, in effect, allows children to decide whether they need protection from sexual exploitation online because they are free to depart from the protective default. As Defendants’ evidence shows, teenagers’ developing brains make them less likely than adults to appreciate the risks associated with, for example, making their profiles public. Like the notification default, while the burdens imposed by the privacy default may be slight, they do not appear likely to serve the State’s asserted interest at all. Imposing small burdens on vast quantities of speech for no appreciable benefit is not consistent with the First Amendment. Arkansas cannot sentence speech on the internet to death by a thousand cuts.
Dashboard Provision
Act 900 requires platforms to “[d]evelop an easily accessible online dashboard to allow a parent of a minor user to view and understand his or her child’s use habits.” This dashboard “shall also provide tools for a parent to restrict his or her minor child’s access to the covered social media platform, or logical portions of the covered social media platform.”
The court can’t decide if this is a Zauderer situation, but it doesn’t matter because “this provision is so unduly burdensome that it fails even” the Zauderer standard. A reminder that we desperately and urgently need a complete rethink of everything associated with Zauderer.
The court gets stuck on the fact that the dashboard only applies to unregistered users. (This appears to be a drafting mistake…? Who knows what the legislature was thinking. They cared more about censorship than making sense.). The court says providing this resource to unregistered users forces services to collect more personal information than they want:
This requirement would force platforms to compile scores of data about minor visitors to their websites, “somehow identify each minor’s parents” to provide dashboard access to them, and follow minors across devices to enforce parental restrictions. Such a requirement is unduly burdensome and seems likely to chill platforms’ dissemination of speech to or from anyone who is not an account holder.
[I would add that all of these problems are inherent in any parental control, supervision, or access provision, not just this particular situation where the legislature illogically extended these rights only to unregistered users.]
Implications
This is a quirky opinion with some logic twists that an appeals court may not agree with. Personally, I wish that courts would strike down laws at their conceptual layer, such as saying that age authentication mandates are always unconstitutional, or efforts to define social media will always be fatally under- and over-inclusive, or parental controls over their children’s online behavior are always mistailored because of the impossibility of authenticating parental status and the risks that parents will weaponize that control in opposition to their children’s interests. (I could go on with other structural problems). This opinion hints at some of these concerns but never reaches these more definitive positions.
Having said that, the court reaches the right place. Essentially, the court makes it impossible for legislatures to push segregate-and-suppress laws because they can never navigate the vagueness and tailoring problems sufficiently. The legislature can read this opinion and try to iterate the law yet again to address the judge’s concerns, and they will still fail. Of course, the response of every legislature seems to be: if censorship is on the line, CHALLENGE ACCEPTED.
Case Citation: NetChoice LLC v. Griffin, 5:25-cv-05140-TLB (W.D. Ark. April 20, 2026)
* * *
Blog Posts on Segregate-and-Suppress Obligations
- Too Many Courts Are Letting States Take Wrecking Balls to the Internet (Roundup)
- Texas Judge Enjoins App Store Authentication Law–CCIA and SEAT v. Paxton
- Courts Enjoin Internet Censorship Laws in Louisiana and Arkansas
- Challenge to Maryland’s “Kid Code” Survives Motion to Dismiss–NetChoice v. Brown
- My Testimony Against Mandatory Online Age Authentication
- Read the Published Version of My Paper Against Mandatory Online Age Authentication
- Prof. Goldman’s Statement on the Supreme Court’s Demolition of the Internet in Free Speech Coalition v. Paxton
- Court Permanently Enjoins Ohio’s Segregate-and-Suppress/Parental Consent Law–NetChoice v. Yost
- Arkansas’ Social Media Safety Act Permanently Enjoined—NetChoice v. Griffin
- Why I Emphatically Oppose Online Age Verification Mandates
- California’s Age-Appropriate Design Code (AADC) Is Completely Unconstitutional (Multiple Ways)–NetChoice v. Bonta
- Another Conflict Between Privacy Laws and Age Authentication–Murphy v. Confirm ID
- Recapping Three Social Media Addiction Opinions from Fall (Catch-Up Post)
- District Court Blocks More of Texas’ Segregate-and-Suppress Law (HB 18)–SEAT v. Paxton
- Comments on the Free Speech Coalition v. Paxton SCOTUS Oral Arguments on Mandatory Online Age “Verification”
- California’s “Protecting Our Kids from Social Media Addiction Act” Is Partially Unconstitutional…But Other Parts Are Green-Lighted–NetChoice v. Bonta
- Section 230 Defeats Underage User’s Lawsuit Against Grindr–Doll v. Pelphrey
- Five Decisions Illustrate How Section 230 Is Fading Fast
- Internet Law Professors Submit a SCOTUS Amicus Brief on Online Age Authentication–Free Speech Coalition v. Paxton
- Court Enjoins the Utah “Minor Protection in Social Media Act”–NetChoice v. Reyes
- Another Texas Online Censorship Law Partially Enjoined–CCIA v. Paxton
- When It Comes to Section 230, the Ninth Circuit is a Chaos Agent–Estate of Bride v. YOLO
- Court Dismisses School Districts’ Lawsuits Over Social Media “Addiction”–In re Social Media Cases
- Ninth Circuit Strikes Down Key Part of the CA Age-Appropriate Design Code (the Rest is TBD)–NetChoice v. Bonta
- Mississippi’s Age-Authentication Law Declared Unconstitutional–NetChoice v. Fitch
- Indiana’s Anti-Online Porn Law “Is Not Close” to Constitutional–Free Speech Coalition v. Rokita
- Fifth Circuit Once Again Disregards Supreme Court Precedent and Mangles Section 230–Free Speech Coalition v. Paxton
- Snapchat Isn’t Liable for Offline Sexual Abuse–VV v. Meta
- 2023 Quick Links: Censorship
- Court Enjoins Ohio’s Law Requiring Parental Approval for Children’s Social Media Accounts–NetChoice v. Yost
- Many Fifth Circuit Judges Hope to Eviscerate Section 230–Doe v. Snap
- Louisiana’s Age Authentication Mandate Avoids Constitutional Scrutiny Using a Legislative Drafting Trick–Free Speech Coalition v. LeBlanc
- Section 230 Once Again Applies to Claims Over Offline Sexual Abuse–Doe v. Grindr
- Comments on the Ruling Declaring California’s Age-Appropriate Design Code (AADC) Unconstitutional–NetChoice v. Bonta
- Two Separate Courts Reiterate That Online Age Authentication Mandates Are Unconstitutional
- Minnesota’s Attempt to Copy California’s Constitutionally Defective Age Appropriate Design Code is an Utter Fail (Guest Blog Post)
- Do Mandatory Age Verification Laws Conflict with Biometric Privacy Laws?–Kuklinski v. Binance
- Why I Think California’s Age-Appropriate Design Code (AADC) Is Unconstitutional
- An Interview Regarding AB 2273/the California Age-Appropriate Design Code (AADC)
- Op-Ed: The Plan to Blow Up the Internet, Ostensibly to Protect Kids Online (Regarding AB 2273)
- A Short Explainer of Why California’s Social Media Addiction Bill (AB 2408) Is Terrible
- A Short Explainer of How California’s Age-Appropriate Design Code Bill (AB2273) Would Break the Internet
- Is the California Legislature Addicted to Performative Election-Year Stunts That Threaten the Internet? (Comments on AB2408)
- Omegle Denied Section 230 Dismissal–AM v. Omegle
- Snapchat Isn’t Liable for a Teacher’s Sexual Predation–Doe v. Snap
- Will California Eliminate Anonymous Web Browsing? (Comments on CA AB 2273, The Age-Appropriate Design Code Act)
- Minnesota Wants to Ban Under-18s From User-Generated Content Services
- California’s Latest Effort To Keep Some Ads From Reaching Kids Is Misguided And Unconstitutional (Forbes Cross-Post)
- Backpage Gets Important 47 USC 230 Win Against Washington Law Trying to Combat Online Prostitution Ads (Forbes Cross-Post & More)
- Backpage Gets TRO Against Washington Law Attempting to Bypass Section 230–Backpage v. McKenna
- MySpace Wins Another 47 USC 230 Case Over Sexual Assaults of Users–Doe II v. MySpace
- MySpace Gets 230 Win in Fifth Circuit–Doe v. MySpace
- Website Isn’t Liable When Users Lie About Their Ages–Doe v. SexSearch
