Section 230’s Application to Account Terminations, CSAM, and More

Weiss v. Google LLC, 2026 WL 733788 (Cal. App. Ct. March 16, 2026)
Weiss’s business started running financial services ads on Google in 2015. Google suspended the ads multiple times before issuing a final suspension in 2024. The court says Section 230 protects Google’s suspension decisions.
The court starts with standard context-setting: “California’s appellate courts and federal courts have also generally interpreted section 230 to confer broad immunity on interactive computer services.”
The court continues:
Weiss seeks to adjudicate Google’s characterization of his business and its decision to suspend its ads. However, this conduct, i.e., Google’s “refusal to allow certain content on its platform,” is “typical publisher conduct protected by section 230” regardless of the reason for that refusal….
even if Google’s characterization of Weiss’s advertisements does not align with Weiss’s characterization, section 230 still affords Google immunity from liability for its decision to suspend his content…
all the content Weiss claims Google wrongfully suspended was admittedly created by Weiss, not Google…
Google’s determination that Weiss’s ads violated its general policies is not equivalent to contributing to the ads’ content.
In a footnote, the court adds: “Weiss seeks to hold Google liable for its enforcement of its own general policies, rather than a breach of a specific promise.”
When the dust settles, this becomes just another failed lawsuit over account terminations and content removals.
A reminder of the content moderation dilemma Google faces here. A few courts have said that Facebook doesn’t qualify for Section 230 protection for running scammy ads (e.g., Forrest v. Facebook). As a result, Google has good reason to suspend Weiss’s ads to manage its own liability exposure. At the same time, if Weiss had succeeded with his claims here, Google would have been potentially liable for removing ads based on its fears that they were scammy. This would force Google to deploy a Goldilocks version of content moderation: Google would have to get its ad removal policy “just right,” with potential liability for mistakes in either direction. An impossible challenge.
Thompson v. The Meet Group, 2026 WL 730134 (E.D. Pa. March 16, 2026)
Thompson said Tagged deactivated his livestreaming account and stole $10k from him.
For reasons that aren’t obvious to me, Tagged defended on Section 230(c)(2)(A) grounds instead of 230(c)(1). Maybe this has something to do with trying to navigate around the abysmal Anderson v. TikTok case? EDPa courts are bound by that decision.
The court says Tagged can’t establish the 230(c)(2)(A) defense elements on a motion to dismiss: “application of CDA immunity in this case requires assessment of facts that are not in the pleadings—such as the reason why Thompson’s account was disabled and the content of Thompson’s posts.” Also, Thompson’s allegations of theft might defeat 230(c)(2)(A)’s good faith prerequisite. Cites to Smith v. TRUSTe and e-ventures v. Google.
No matter, the case fails anyway. (Another example of Section 230 not being the only reason why lawsuits lose). The court says the plaintiff had no property interest in his social media account that could be converted (cite to Eagle v. Morgan). The plaintiff’s TOS breach claim fails multiple ways, including the TOS’s reservation of termination rights and damages waiver.
So this becomes yet another failed lawsuit over account terminations, just not due to Section 230. You already know this, but if you’re a defendant in these cases, you should be focusing on 230(c)(1), not 230(c)(2)(A).
Gehringer v. Ancestry.com Operations Inc., 2026 WL 734526 (N.D. Cal. March 16, 2026)
Plaintiffs are individuals who have not subscribed to the Ancestry.com service and have not consented to the use of their name or photograph. They allege Ancestry not only includes their yearbook information on a searchable database, but also utilizes their likenesses as part of advertisements for Ancestry.com services…
Plaintiffs contend Ancestry used their likeness in three forms of “advertising”: 1) publication of the yearbook information on a database that contains a paywall for certain features; 2) dissemination of emails to potential Ancestry.com subscribers, noting Ancestry Hints® can expand their family tree, and using the names and images of Plaintiffs as examples; and 3) an Ancestry free trial program that allows potential subscribers to access Plaintiffs’ yearbook information for a limited time.
The court nixes claims over category #1 and #3 ads due to copyright preemption.
As for the category #2 ads:
Plaintiffs allege Ancestry crafted email advertisements that included their likenesses to encourage potential customers to subscribe to Ancestry’s service. The email advertisements were not created by a third-party user of Ancestry.com—Ancestry authored the content, and as such, it is “responsible, in whole or in part, for the creation” of that offending content. To avoid this conclusion, Ancestry attempts to recast the allegations in the First Amended Complaint, asserting Ancestry merely “republish[es] yearbook photos taken and first published by Esperanza High School.” But as the screenshots in the Complaint confirm, the emails sent by Ancestry to prospective users include far more than republished images of Plaintiffs; they incorporate those images into an advertisement for the Ancestry Hints® functionality and Ancestry’s subscription service. Drawing all inferences in Plaintiffs’ favor, Section 230 does not immunize Ancestry against liability for the content of the alleged email advertisements
Notice that Ancestry’s ad creation practices go further than Facebook’s sponsored stories, which also didn’t qualify for Section 230 protection.
State v. Sharak, 2026 WI 4 (Wis. Supreme Ct. Feb. 24, 2026)
Google scanned Sharak’s Google Photos uploads, identified what it thought was CSAM, and submitted a CyberTip. Sharak argued that Google was conducting the search on the state’s behalf. The court disagrees and upholds Sharak’s conviction.
That isn’t unusual. What’s more unusual is the court’s discussion of Section 230. “Rauch Sharak argues that [Section 230(c)(2)’s safe harbor] encourages ESPs to scan for CSAM by granting immunity to ESPs that moderate content and creating civil and criminal liability if ESPs do not scan for CSAM.”
The court responds:
Though § 230(c) may grant immunity to ESPs that choose to scan for CSAM, it does not require, reward, or incentivize scanning for CSAM in the first place. Moreover, § 230(c)(2)(A) grants immunity for “any action voluntarily taken in good faith to restrict access to” obscene material, which sweeps far more broadly than would be required to induce Google’s CSAM scan at issue here….
Even if the statutes encourage Google to scan for CSAM or provide a law-enforcement purpose, Rauch Sharak has not shown that they are enough to turn Google into an instrument or agent of the government.
Alice Rosenblum v. Passes Inc., 2026 WL 711837 (C.D. Cal. Feb. 3, 2026)
[The fact allegations are based on the court’s summary of the complaint.] Passes is a competitor to OnlyFans. Unlike its rivals, Passes allows 15-17 year olds to create accounts with parental consent. Guo is the CEO, and Celestin is a content acquisition specialist. At Guo’s direction, Celestin personally reached out to 17-year-old Alice Rosenblum to create a Passes account. Celestin did a photoshoot of Rosenblum and (with Guo’s help) created a Passes account for her without requiring parental consent.
“Over the next month, while Plaintiff was still 17 years old, Celestin and Ginoza [another Passes employee] allegedly directed Plaintiff to create sexually explicit images and videos of herself….the FAC provides over 14 examples of child sexual abuse material (“CSAM”) involving Plaintiff, being marketed on the Passes platform for $69 to $4,000. Furthermore, Passes agents posing as Plaintiff allegedly communicated via direct message to “big spenders” to continue to market and sell CSAM involving Plaintiff.”
The court rejects Passes’ and Guo’s Section 230 defense:
Section 230 immunity plainly does not apply to Plaintiff’s claims. To be sure, Plaintiff does largely seek to hold Passes Defendants liable as providers of an interactive computer service, and several allegations treat Passes as a publisher, as they involve Passes’ distribution of CSAM involving Plaintiff…Plaintiff alleges that Passes and its agents were directly responsible for the creation and portrayal of the CSAM on the Passes platform: Plaintiff alleges that Celestin, acting as an agent of Passes, personally took at least one photo of Plaintiff which was uploaded to Passes, and further instructed her to create specific photographs and videos and upload them to Passes, which he later marketed under specific captions and sold. Plaintiff further alleges that Passes itself hosted a banner featuring a sexually explicit photo of Plaintiff, which marketed CSAM involving Plaintiff. Plaintiff therefore seeks to hold Passes liable for harm allegedly arising out of its own creation of harmful content.
Passes claimed that Celestin and Ginoza were third parties, but “As alleged, Celestin was not merely another third-party user of Passes; rather, he acted as an agent and employee of Passes.” Cite to Quinteros.
The court summarizes:
Section 230 immunity does not apply to Passes, a platform which has allegedly, through its agents, deliberately created, marketed, and sold illegal content, acting as an “information content provider” that uses its own “interactive computer service.”
In a footnote, the court adds regarding Guo: “Plaintiff’s allegation that Guo encouraged Plaintiff over the phone to post content, which supports Plaintiff’s claims for IIED and California Civil Code § 52.5, does not hold Guo accountable for Passes’ publishing activity.”
Doe v. X Corp., 4:25-cv-01282-O (N.D. Tex. Feb. 25, 2026)
“A third party copied commercial pornographic content from Plaintiff’s OnlyFans and studio-based productions and uploaded it to X without his consent, violating the OnlyFans terms and conditions and the studios’ licensing agreements.” He sued pursuant to 15 U.S.C. § 6851(b)(1)(A), which provides a private right of action over the nonconsensual disclosure of intimate visual depictions. Doe produced the imagery consensually, but he claimed the statute’s restrictions extend to its nonconsensual distribution.
The court says X qualifies for Section 230. Doe responded that he owned the IP in the works, so the IP exception applies. The court says:
The [IP] exception applies only when the claims arise from a law directly implicating intellectual property rights, not merely when intellectual property is involved in the claim. And the statute under which Plaintiff sues—§ 6851—is not an intellectual property law. Rather, it is concerned with “whether the depicted individual consented to a specific disclosure of an intimate visual depiction—regardless who holds the copyright to the image.” Thus, § 6851 creates a privacy-based tort right of action, not an intellectual-property based one.
The boundary between privacy and IP laws remains amorphous, and increasingly so with all of the concerns about “deepfakes,” “virtual replicas,” and other AI-related regulations that use privacy framing to create what look like sui generis IP rights. This could be a good student paper topic.
For more discussion of the IP exception to Section 230, see this article.
Teague v. Google, 2026 WL 746996 (D.S.D. March 17, 2026)
Plaintiff claims Google committed defamation based upon the fact that “people think I raped [redacted]. This case (sic) been dismissed in 2021 but it still show (sic) on Google and caused me to (sic) threaten and attacked a few times.” Plaintiff further claims his image is on Google and it is difficult to get a job because the rape charges still appear on Google….
Google is not a “publisher or speaker” under the CDA and therefore “cannot be liable under any state-law theory to the persons harmed by the allegedly defamatory material.”
Google is immune from suit for defamation claims arising out of other content providers’ posts on the internet.