There Is No Bottom When It Comes to Section 230 Reform Proposals (Comments on the Justice Against Malicious Algorithms Act)
I couldn’t wrap my head around these leaders investing their political capital in such a poorly drafted bill. Is it just a messaging exercise rather than a serious attempt to solve problems, or do they think it’s actually a serious, if wholly defective, proposal? Either way, it shows us how we, and the Internet, are inevitably screwed. When one of these Section 230 reform bills finds a coalition, we’re destined for an Internet that none of these politicians’ constituents actually want.
What the Bill Does
The bill eliminates Section 230 for making a personalized recommendation of information that materially contributes to a physical or severe emotional injury. A personalized recommendation is defined as “the material enhancement, using a personalized algorithm, of the prominence of such information with respect to other information.” A personalized algorithm is “an algorithm that relies on information specific to an individual.”
The new Section 230 exclusion doesn’t apply to (a) services with fewer than 5M unique monthly visitors, (b) recommendations made in response to a user-specified search, or (c) vendors such as web hosts, domain name registrars, CDNs, data storage providers, and cybersecurity services.
Why This Bill Is Bad
This bill embodies many of the worst ideas about how NOT to reform Section 230. If the drafters want better inspiration for Section 230 reform ideas, they should consult this 2019 statement.
Some of the structural problems with the bill:
- The bill misplaces its scienter standards. It removes Section 230 if the service recklessly made, or knew or should have known it was making, a personalized recommendation. But what’s the point of that? Any Internet service that makes a personalized recommendation necessarily knows that it’s doing so, so the scienter qualifications make no sense.
- The scienter standards would make slightly more sense if they modified the imposition of harm, but that’s not what the bill says. Moving the words around won’t make the bill better, however. Adding scienter preconditions to Section 230 functionally repeals the immunity by eliminating Section 230’s procedural benefits. (For more, see Cathy Gellis’ post on this point.) Plaintiffs can easily allege that a service had the requisite scienter and survive a motion to dismiss, which boosts defense costs even when the defense ultimately prevails, because the service now has to go through discovery on its scienter and can’t get resolution before summary judgment at the earliest. Anyone advocating for scienter conditions on Section 230 either doesn’t understand why Section 230 works today or wants to repeal it without admitting that goal.
- The harm standard is unmanageable. How can services predict which UGC items will “materially contribute to a physical or severe emotional injury” and which won’t? They can’t make this assessment on an automated basis, and even manual review won’t identify all of the potentially problematic items. Thus, this standard will force Internet services to assume that *all* UGC could “materially contribute” to the specified harms and to clamp down on all UGC accordingly.
- The “physical or severe emotional injury” standard is problematic in two other ways. First, not all behavior that causes severe emotional injury gives rise to a cause of action, so the standard is overinclusive. Second, the harm standard opens up a Section 230 bypass for any cause of action that recognizes physical or emotional injury, which is virtually all causes of action, including defamation. This too destroys Section 230’s procedural benefits, because plaintiffs can easily graft an allegation of emotional injury onto any complaint and defeat a Section 230 motion to dismiss.
- The definition of “personalized algorithm” (“an algorithm that relies on information specific to an individual”) is confusing and overinclusive. One obvious example is geo-location, which is specific to the geo-located individual. Imagine rolling back Section 230 for all geo-targeted content, or imagine services abandoning geo-location altogether. (Also, how specific does the geo-location need to be? If a service geo-locates an individual only at the national level, is that still “specific to the individual”?) For a toy illustration of how broadly the definition sweeps, see the sketch after this list.
- The exclusion applies equally to personalized content and personalized ad targeting, so this bill would potentially wreak havoc on the entire advertising ecosystem. After all, I get severe emotional distress when I get targeted ads for old or chunky men…and when I see Facebook’s ads urging Congress to reform Section 230…
- Like all other efforts to impose size-based distinctions in Section 230, the “small service” exception is miscalibrated. Jess Miers and I put together a much-needed guide for properly drafting size-based distinctions for Internet services. For example: the “unique monthly visitors” standard isn’t defined, and it’s unclear how it applies to services without registered users; there’s no phase-in period; and the 5M unique-monthly-visitors threshold is too low. I think the supporters would be surprised at how many services the bill picks up unintentionally!
- The bill, like so many in this genre, doesn’t thoughtfully anticipate the countermoves services might make beyond retrenching personalized content. For example, Internet services could turn all content presentations into searches, which would greatly annoy consumers (imagine, for example, every service asking readers “would you like to see articles meant for you? Enter your location here to search for them”); they could migrate toward “top content” lists, which reify existing power structures and disadvantage minority voices; they could aggressively downrank content to make the remaining content more prominent (the Section 230 exclusion only applies to enhancing the prominence of UGC, not to degrading it); or they could simply publish content in reverse chronological order and let the trolls and spammers take over. Which of these alternative outcomes do the bill’s supporters want?
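To see just how broadly “an algorithm that relies on information specific to an individual” sweeps, here’s a minimal, hypothetical Python sketch. Nothing in it comes from the bill; the feed functions, field names, and data are invented for illustration. A plain reverse-chronological feed uses no viewer-specific information, but the moment a service consults a user’s friend list or even coarse geo-location, it is arguably relying on “information specific to an individual.”

```python
from datetime import datetime

# Toy UGC corpus; all fields and values are invented for illustration.
posts = [
    {"id": 1, "author": "alice", "region": "US", "posted": datetime(2021, 10, 14, 9, 0)},
    {"id": 2, "author": "bob",   "region": "EU", "posted": datetime(2021, 10, 14, 10, 0)},
    {"id": 3, "author": "carol", "region": "US", "posted": datetime(2021, 10, 14, 11, 0)},
]

def reverse_chronological_feed(posts):
    # Uses no information about the viewer, so it presumably falls outside
    # the bill's "personalized algorithm" definition.
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

def friends_first_feed(posts, user_friends):
    # Relies on the viewer's friend list (information specific to an individual)
    # and materially enhances the prominence of friends' posts.
    return sorted(posts, key=lambda p: (p["author"] not in user_friends, p["posted"]))

def geo_targeted_feed(posts, user_region):
    # Even coarse geo-location is arguably information specific to the
    # individual being located, so this feed looks "personalized" too.
    local = [p for p in posts if p["region"] == user_region]
    other = [p for p in posts if p["region"] != user_region]
    return local + other

print([p["id"] for p in reverse_chronological_feed(posts)])     # [3, 2, 1]
print([p["id"] for p in friends_first_feed(posts, {"carol"})])  # [3, 1, 2]
print([p["id"] for p in geo_targeted_feed(posts, "US")])        # [1, 3, 2]
```

Under the bill’s definition, only the first function is clearly safe; the other two look like personalized algorithms even though neither is the sort of “malicious algorithm” the bill’s title invokes.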
To make sure you didn’t miss the point, this bill would functionally eliminate Section 230’s immunity for UGC in multiple independent ways, which would severely undermine or eliminate much of the current UGC ecosystem. That makes the bill impossible to salvage with some tweaks.
For more on the illogic of linking Section 230 to algorithmic amplification, see Daphne Keller, Amplification and Its Discontents.
What Would This Bill Do to Facebook/Instagram?
Facebook has been begging Congress for years to amend Section 230 because it naively (or hubristically) assumes that it can steer Congress toward Section 230 reforms that entrench a competitive moat around Facebook and Instagram and hurt their competitors. In reality, Congress is gunning for Facebook and Instagram, so each time Facebook begs Congress to reform Section 230, it’s digging its own grave (and the grave of all UGC generally…).
This bill obviously targets Facebook and Instagram, but unsurprisingly it overshoots its mark. For example, even if Facebook and Instagram eliminated their newsfeed ranking algorithms in response to this bill, they would still lose Section 230, because presenting content only from a user’s friends relies on information specific to an individual, i.e., the friendship. Instead, to retain Section 230, Facebook and Instagram would have to present information from friends and non-friends equally, which means that most newsfeeds would rarely, if ever, display news from friends. So the bill would blow apart Facebook’s and Instagram’s core value proposition of letting friends talk to each other. At the same time, it would devastate every other social media service that uses “friends” or “followers” as part of its content delivery mechanism. This would dial the entire Internet ecosystem back to something that looks a lot more like the 1990s.
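To put a rough number on that claim, here’s a hypothetical back-of-the-envelope sketch. Every figure below is an assumption chosen for illustration, not Facebook data; the point is only the scale mismatch between a user’s friend list and the full population of posters a strictly non-personalized feed would have to draw from.

```python
# Hypothetical numbers, chosen only to illustrate the scale mismatch.
friends = 500                 # assumed friend count for a typical user
daily_posters = 500_000_000   # assumed number of accounts posting on a given day
feed_slots = 200              # assumed posts a user actually scrolls past per day

# If the feed must draw uniformly from all posters (no user-specific info),
# the expected number of friends' posts a user sees per day is tiny.
expected_friend_posts = feed_slots * friends / daily_posters
print(f"{expected_friend_posts:.6f} friend posts per day")                   # 0.000200
print(f"about one friend post every {1 / expected_friend_posts:,.0f} days")  # ~5,000 days
```

Even if these assumptions are off by an order of magnitude in either direction, a feed that ignores the friendship graph effectively never shows users their friends’ posts.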
Final Thoughts
There’s a deep irony to this bill. Congressional staffers rely heavily on personalized algorithms (like lobbyist recommendations) for their own information flows, yet this bill seeks to put the kibosh on personalized algorithms for the rest of us. ¯\_(ツ)_/¯
At its core, this bill is really a privacy bill misplaced in Section 230. The supporters should instead invest their time and energy in working on a comprehensive federal privacy bill that preempts problematic state laws like the CCPA/CPRA. I’m not convinced the First Amendment permits regulating algorithms under a privacy rubric any more than under a Section 230 rubric, but at least that conversation would make more sense than this bill does.
Parting thought: The “JAMAA” acronym, combined with the bill’s problems, brings to mind Carl Carlton’s classic song, She’s a Bad Mama Jama (hat tip Alan Kyle).
Prior Blog Posts on the 117th Congress’ Efforts to Kill the Internet
- The SHOP SAFE Act Is a Terrible Bill That Will Eliminate Online Marketplaces
- Comments on the PROMISE Act
- Comments on the “SAFE TECH” Act
- Comments on the “Protecting Constitutional Rights from Online Platform Censorship Act”
- While Our Country Is Engulfed By Urgent Must-Solve Problems, Congress Is Working Hard to Burn Down Section 230