<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>
	Comments on: Comments on the Jury Verdict in the Los Angeles Social Media Addiction Bellwether Trial (Expanded/Updated)	</title>
	<atom:link href="https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm/feed" rel="self" type="application/rss+xml" />
	<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm</link>
	<description></description>
	<lastBuildDate>Thu, 02 Apr 2026 22:02:00 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>
		By: Charles Barton		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4582</link>

		<dc:creator><![CDATA[Charles Barton]]></dc:creator>
		<pubDate>Thu, 02 Apr 2026 22:02:00 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4582</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4559&quot;&gt;David S. Gingras&lt;/a&gt;.

Some online journals seem to target the content of an edition to the subscriber. The publisher is simply publishing multiple user-specific journals under the same name.

A social medium platform differs from an online journal because the social medium platform holds out transport of a message under standard terms for compensation. In contrast, &lt;i&gt;Scientific American&lt;/i&gt; only transports its content to a subscriber.

(I) By common carriage doctrine (especially in MA and in CA), the social medium platform provides common carriage and publishing services. When the social medium platform is a common carrier of messages (a modern telegraph), it is not liable for the transported message. &lt;i&gt;O’Brien v. W. Union Tel. Co.&lt;/i&gt;, 113 F.2d 539 (1st Cir. 1940).

(II) A social medium platform usually does not create its own content and is not liable for other people&#x27;s content, but moderation or curation is expressive action. The social medium platform seems potentially liable for the harmful effects of its expressive actions.

When the social medium platform tries to invoke the CDA for situation (II), it is dishonestly trying to hide its own expressive action to escape liability.

A social medium platform sometimes escapes liability via the CDA because the social medium platform mixes two logically distinct services.  The courts could force clarity by requiring a social medium platform to fulfill its common carriage obligations in a message conduit layer. Then the social medium platform would have the option -- if it wanted to be liable for its expressive action -- of creating a separate moderated or curated message stream with full First Amendment protection.
  https://uploads.disquscdn.com/images/5fb6cf5acc3aaa8c27a7e5ab520daae62203b2e3230a6a920bdbe43b522279b1.jpg]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4559">David S. Gingras</a>.</p>
<p>Some online journals seem to target the content of an edition to the subscriber. The publisher is simply publishing multiple user-specific journals under the same name.</p>
<p>A social medium platform differs from an online journal because the social medium platform holds out transport of a message under standard terms for compensation. In contrast, <i>Scientific American</i> only transports its content to a subscriber.</p>
<p>(I) By common carriage doctrine (especially in MA and in CA), the social medium platform provides common carriage and publishing services. When the social medium platform is a common carrier of messages (a modern telegraph), it is not liable for the transported message. <i>O’Brien v. W. Union Tel. Co.</i>, 113 F.2d 539 (1st Cir. 1940).</p>
<p>(II) A social medium platform usually does not create its own content and is not liable for other people&#x27;s content, but moderation or curation is expressive action. The social medium platform seems potentially liable for the harmful effects of its expressive actions.</p>
<p>When the social medium platform tries to invoke the CDA for situation (II), it is dishonestly trying to hide its own expressive action to escape liability.</p>
<p>A social medium platform sometimes escapes liability via the CDA because the social medium platform mixes two logically distinct services.  The courts could force clarity by requiring a social medium platform to fulfill its common carriage obligations in a message conduit layer. Then the social medium platform would have the option &#8212; if it wanted to be liable for its expressive action &#8212; of creating a separate moderated or curated message stream with full First Amendment protection.<br />
  <a href="https://uploads.disquscdn.com/images/5fb6cf5acc3aaa8c27a7e5ab520daae62203b2e3230a6a920bdbe43b522279b1.jpg" rel="nofollow ugc">https://uploads.disquscdn.com/images/5fb6cf5acc3aaa8c27a7e5ab520daae62203b2e3230a6a920bdbe43b522279b1.jpg</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Why the Verdict on Social Media Defective Design Harming Children Gets the Instinct Right But the Law Wrong - Michael Geist		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4581</link>

		<dc:creator><![CDATA[Why the Verdict on Social Media Defective Design Harming Children Gets the Instinct Right But the Law Wrong - Michael Geist]]></dc:creator>
		<pubDate>Thu, 02 Apr 2026 12:44:06 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4581</guid>

					<description><![CDATA[[&#8230;] Treating the core architecture of a content platform as a product defect exposes the verdict to reversal on appeal and sets a precedent that could reach any attention-competing technology, such as streaming [&#8230;]]]></description>
			<content:encoded><![CDATA[<p>[&#8230;] Treating the core architecture of a content platform as a product defect exposes the verdict to reversal on appeal and sets a precedent that could reach any attention-competing technology, such as streaming [&#8230;]</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Why the Social Media Verdict on Defective Design Gets the Instinct Right But the Law Wrong - Michael Geist		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4580</link>

		<dc:creator><![CDATA[Why the Social Media Verdict on Defective Design Gets the Instinct Right But the Law Wrong - Michael Geist]]></dc:creator>
		<pubDate>Thu, 02 Apr 2026 12:34:01 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4580</guid>

					<description><![CDATA[[&#8230;] Treating the core architecture of a content platform as a product defect exposes the verdict to reversal on appeal and sets a precedent that could reach any attention-competing technology, such as streaming [&#8230;]]]></description>
			<content:encoded><![CDATA[<p>[&#8230;] Treating the core architecture of a content platform as a product defect exposes the verdict to reversal on appeal and sets a precedent that could reach any attention-competing technology, such as streaming [&#8230;]</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Charles Barton		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4579</link>

		<dc:creator><![CDATA[Charles Barton]]></dc:creator>
		<pubDate>Thu, 02 Apr 2026 03:19:00 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4579</guid>

					<description><![CDATA[At least one case before a federal appeals court may result in a definitive ruling that holds a social medium platform like Meta or like YouTube to be a common carrier of messages.

If a social medium platform is a common carrier of messages and if the social medium platform wishes to express itself by moderation or by curation, the social medium platform need only provide two feeds: one for common carriage and one that provides moderation and curation. Nothing would stop the social medium platform from providing another feed for training or behavioral modification. The social medium platform could give the user the ability to choose his preferred feed.
 https://uploads.disquscdn.com/images/91021b2c529adfd59fe4567f864ec8b5ea7aaa7e973887cc7a77263167c854f9.png]]></description>
			<content:encoded><![CDATA[<p>At least one case before a federal appeals court may result in a definitive ruling that holds a social medium platform like Meta or like YouTube to be a common carrier of messages.</p>
<p>If a social medium platform is a common carrier of messages and if the social medium platform wishes to express itself by moderation or by curation, the social medium platform need only provide two feeds: one for common carriage and one that provides moderation and curation. Nothing would stop the social medium platform from providing another feed for training or behavioral modification. The social medium platform could give the user the ability to choose his preferred feed.<br />
 <a href="https://uploads.disquscdn.com/images/91021b2c529adfd59fe4567f864ec8b5ea7aaa7e973887cc7a77263167c854f9.png" rel="nofollow ugc">https://uploads.disquscdn.com/images/91021b2c529adfd59fe4567f864ec8b5ea7aaa7e973887cc7a77263167c854f9.png</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: David S. Gingras		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4559</link>

		<dc:creator><![CDATA[David S. Gingras]]></dc:creator>
		<pubDate>Thu, 26 Mar 2026 14:41:00 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4559</guid>

					<description><![CDATA[Eric - don&#x27;t shoot the messenger, but I am 100% OK with these verdicts. I don&#x27;t see any conflict or threat to the CDA, and I believe the verdicts will be affirmed on appeal....although the dollar amounts may be a bit silly.

Here&#x27;s why -- the CDA was passed in 1996. Obviously things were different back then. Message boards were largely passive. You posted a comment, saw comments from others, and life was peaceful. There were no algorithms shoving content in your face. The content you saw was mostly static.

Then Google/Facebook changed things. They started using algorithms that directly affected what content we see. Does that mean Google/Facebook created the content? Of course not. 

But if a website shows you content THEY picked for you, and which you would NOT have seen but-for the website operator&#x27;s active decision-making, that&#x27;s different. This isn&#x27;t about treating the website operator as a publisher or speaker (yes, I know publishers choose which content to show and what not to...of course). 

I still see this as something materially different than publishing. A publisher chooses which content to show to THE WORLD. A publisher does NOT decide to target individual readers with specific content tailored to THAT PERSON. No publisher would ever say: &#034;Hmm, here&#x27;s some really harmful stuff, and I am going to send this harmful stuff directly to one specific person - David Gingras.&#034;

That&#x27;s not publishing in the general or traditional sense. Do you really think the publisher of the New York Times sits at a desk and thinks: &#034;What stuff does David Gingras want to see in today&#x27;s paper?&#034; NO WAY.

It&#x27;s about treating website operators as guides who intentionally lead people to specific harmful content they wouldn&#x27;t otherwise see.

Think about this example -- imagine I place a large, blank bulletin board in a public park. People post a wide variety of messages on that board. One day, a sick person posts a message that says: &#034;All children are evil and should kill themselves.&#034; A normal, healthy child is passing by my message board, they see that message, read it, and they think: &#034;Hmm, that&#x27;s not very nice.&#034; They forget about it and walk away. At that point, my only role is providing the message board space. I had nothing else to do with any part of the process.

Here is what happens next -- I (as the message board operator) watched the child read that message. I then decide: &#034;I think that kid liked that message, so I am going to make sure they see nothing but similar messages moving forward.&#034; I then wave a magic wand so each time the child visits my message board, they see ONLY messages that encourage children to self-harm. Yes, the messages technically were all written by third parties. But this specific child would never have seen them but-for my decision to make the child see them.

I dunno....I have a fair bit of knowledge about the CDA, and to me, that doesn&#x27;t feel like the CDA should apply. I know it KIND OF feels like publishing, but also not.

Maybe this is a better example -- imagine you have a child with an eating disorder. I see some economic benefit in pushing your child to die, so I go out and find articles that encourage eating disorders...articles your child would never have seen on their own. Motivated by profit, I decide to email those articles directly to your child every single day. I know your child has a problem, and I decide to make money by sending harmful content to your child that I know will make the problem worse.

Yeah, I&#x27;m sorry. I LOVE the CDA, but behavior like this is something different. I personally don&#x27;t buy the &#034;internet addiction&#034; thing....that&#x27;s like suing McDonald&#x27;s for making me fat because their fries are so tasty. At some level, personal responsibility breaks the causal chain.

At the end of the day, I think social media companies that use algorithms to increase engagement (and thus, profit) bear some responsibility for the societal costs imposed by that choice. Maybe this is too subtle to leave in the hands of a jury of non-experts, but I am perfectly fine with creating new legislation that puts SOME small guardrails on this stuff.

I didn&#x27;t like Charlie Kirk (at all), but I&#x27;ll steal one of his lines -- &lt;i&gt;PROVE ME WRONG&lt;/i&gt;.]]></description>
			<content:encoded><![CDATA[<p>Eric &#8211; don&#x27;t shoot the messenger, but I am 100% OK with these verdicts. I don&#x27;t see any conflict or threat to the CDA, and I believe the verdicts will be affirmed on appeal&#8230;.although the dollar amounts may be a bit silly.</p>
<p>Here&#x27;s why &#8212; the CDA was passed in 1996. Obviously things were different back then. Message boards were largely passive. You posted a comment, saw comments from others, and life was peaceful. There were no algorithms shoving content in your face. The content you saw was mostly static.</p>
<p>Then Google/Facebook changed things. They started using algorithms that directly affected what content we see. Does that mean Google/Facebook created the content? Of course not. </p>
<p>But if a website shows you content THEY picked for you, and which you would NOT have seen but-for the website operator&#x27;s active decision-making, that&#x27;s different. This isn&#x27;t about treating the website operator as a publisher or speaker (yes, I know publishers choose which content to show and what not to&#8230;of course). </p>
<p>I still see this as something materially different than publishing. A publisher chooses which content to show to THE WORLD. A publisher does NOT decide to target individual readers with specific content tailored to THAT PERSON. No publisher would ever say: &quot;Hmm, here&#x27;s some really harmful stuff, and I am going to send this harmful stuff directly to one specific person &#8211; David Gingras.&quot;</p>
<p>That&#x27;s not publishing in the general or traditional sense. Do you really think the publisher of the New York Times sits at a desk and thinks: &quot;What stuff does David Gingras want to see in today&#x27;s paper?&quot; NO WAY.</p>
<p>It&#x27;s about treating website operators as guides who intentionally lead people to specific harmful content they wouldn&#x27;t otherwise see.</p>
<p>Think about this example &#8212; imagine I place a large, blank bulletin board in a public park. People post a wide variety of messages on that board. One day, a sick person posts a message that says: &quot;All children are evil and should kill themselves.&quot; A normal, healthy child is passing by my message board, they see that message, read it, and they think: &quot;Hmm, that&#x27;s not very nice.&quot; They forget about it and walk away. At that point, my only role is providing the message board space. I had nothing else to do with any part of the process.</p>
<p>Here is what happens next &#8212; I (as the message board operator) watched the child read that message. I then decide: &quot;I think that kid liked that message, so I am going to make sure they see nothing but similar messages moving forward.&quot; I then wave a magic wand so each time the child visits my message board, they see ONLY messages that encourage children to self-harm. Yes, the messages technically were all written by third parties. But this specific child would never have seen them but-for my decision to make the child see them.</p>
<p>I dunno&#8230;.I have a fair bit of knowledge about the CDA, and to me, that doesn&#x27;t feel like the CDA should apply. I know it KIND OF feels like publishing, but also not.</p>
<p>Maybe this is a better example &#8212; imagine you have a child with an eating disorder. I see some economic benefit in pushing your child to die, so I go out and find articles that encourage eating disorders&#8230;articles your child would never have seen on their own. Motivated by profit, I decide to email those articles directly to your child every single day. I know your child has a problem, and I decide to make money by sending harmful content to your child that I know will make the problem worse.</p>
<p>Yeah, I&#x27;m sorry. I LOVE the CDA, but behavior like this is something different. I personally don&#x27;t buy the &quot;internet addiction&quot; thing&#8230;.that&#x27;s like suing McDonald&#x27;s for making me fat because their fries are so tasty. At some level, personal responsibility breaks the causal chain.</p>
<p>At the end of the day, I think social media companies that use algorithms to increase engagement (and thus, profit) bear some responsibility for the societal costs imposed by that choice. Maybe this is too subtle to leave in the hands of a jury of non-experts, but I am perfectly fine with creating new legislation that puts SOME small guardrails on this stuff.</p>
<p>I didn&#x27;t like Charlie Kirk (at all), but I&#x27;ll steal one of his lines &#8212; <i>PROVE ME WRONG</i>.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Addictive Social Media Litigation: Cutting off Innovation to Spite Big Tech		</title>
		<link>https://blog.ericgoldman.org/archives/2026/03/comments-on-the-jury-verdict-in-the-los-angeles-social-media-addiction-bellwether-trial.htm#comment-4557</link>

		<dc:creator><![CDATA[Addictive Social Media Litigation: Cutting off Innovation to Spite Big Tech]]></dc:creator>
		<pubDate>Thu, 26 Mar 2026 00:27:06 +0000</pubDate>
		<guid isPermaLink="false">https://blog.ericgoldman.org/?p=28737#comment-4557</guid>

					<description><![CDATA[[&#8230;] Los Angeles and New Mexico found Meta (Facebook) liable for harming the mental health of children. The legal side of the verdicts has been covered comprehensively. The impact on innovation deserves its own [&#8230;]]]></description>
			<content:encoded><![CDATA[<p>[&#8230;] Los Angeles and New Mexico found Meta (Facebook) liable for harming the mental health of children. The legal side of the verdicts has been covered comprehensively. The impact on innovation deserves its own [&#8230;]</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
