Google Isn’t Liable for Allegedly Problematic Search Results–Diez v. Google
Diez sued Google (pro se/in pro per) on the theory that Google had said it screens out CSAM, so he assumed that all images he found via Google were not CSAM. A representative allegation from the complaint: “Defendant knew or should have known that the images displayed in the ‘Results’ would be deemed ‘child pornography’ since Defendant had expansive access to ‘hash value’ records for child pornographic images.” (Note: if Google had possessed the image’s hash value, its algorithm would have automatically screened the image out, which implies Google didn’t have it.) The complaint also says Google didn’t warn Diez that images in the search results might be illegal. His complaint demands $70M+ and various other relief.
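To make that parenthetical concrete, here is a minimal sketch of how hash-based screening works in principle. Everything in it is illustrative: the names KNOWN_BAD_HASHES, is_known_csam, and filter_results are hypothetical, and real systems match perceptual hashes (e.g., PhotoDNA) against hash lists shared by organizations like NCMEC rather than plain SHA-256 digests in a local set.

```python
import hashlib

# Hypothetical database of known-image fingerprints. In practice, providers
# match against shared hash lists (e.g., from NCMEC) and typically use
# perceptual hashes like PhotoDNA, not cryptographic hashes.
KNOWN_BAD_HASHES: set[str] = set()  # placeholder; would be loaded from a shared list

def is_known_csam(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

def filter_results(images: list[bytes]) -> list[bytes]:
    """Drop any image whose hash matches the database; unmatched images pass through."""
    return [img for img in images if not is_known_csam(img)]
```

The sketch also shows the limitation the post relies on: a hash filter can only catch images whose fingerprints are already in the database. An unlisted image passes straight through, which is why the absence of a hash match matters to the knowledge question discussed below.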
The court interprets Diez’s complaint as alleging two causes of action: false advertising and violations of federal CSAM laws. In a per curiam non-precedential memorandum opinion, the Fifth Circuit affirms the dismissal of the complaint.
False Advertising. The court says Diez didn’t purchase or lease any goods/services, so he’s not a protected “consumer.” The court also says that the complaint didn’t sufficiently allege that Google ever made any false claims.
18 USC 2252A. This statute provides a civil remedy for CSAM victims. Diez claims it protects him because, by its terms, it covers “any person aggrieved by reason of the conduct prohibited by the statute,” and he says he’s aggrieved because of his arrest and prosecution for CSAM offenses. I don’t think that’s what Congress meant. Nevertheless, the court rejects the claim on Section 230 grounds because Diez is complaining about third-party content in Google’s image search index. The court says: “Google is merely an interactive computer service provider as opposed to an information content provider. Further, Diez’s complaint is without adequately supported allegations that Google created the disputed content.”
It’s logical that Section 230 would immunize 2252A civil claims based on third-party content, because Section 230 applies equally to all federal civil claims. Still, this is a potentially politically sensitive issue, because the EARN IT Act proposes to categorically eliminate Section 230 protection for any claims related to CSAM. Before politicians and Section 230 critics misinterpret this decision, note that Google never actually invoked Section 230. Because the complaint was filed in pro per, the court screened and dismissed it without ever authorizing service on Google, and Google never appeared in the case. Furthermore, if you think this is a Section 230 problem, then presumably you can explain what, exactly, you want an image search function like Google’s to do differently here. Be specific–unless you want image search engines to shut down entirely. The EARN IT Act proposed a multi-year commission to examine what steps might be taken, though it also proposed to repeal Section 230 without waiting to benefit from the commission’s efforts.
Finally, it hasn’t been confirmed that the Pinterest-posted image was actually CSAM. If the image isn’t CSAM, the First Amendment protects it; consider how that might change your feelings about Google’s operations here. And if the image is CSAM, the First Amendment almost certainly prohibits imposing strict liability on “passive” content distributors like image search engines. For example, if Google didn’t have the image’s hash value in its database, then Google didn’t “know” the image was CSAM. Also see the uncited Doe v. Peterson ruling from a decade ago.
Case citation: Diez v. Google Inc., 2020 WL 7496420 (5th Cir. Dec. 17, 2020). The complaint.
Note: when I first posted the complaint to Twitter, some folks commented on the fact that it was handwritten. A reminder that prisoners may have limited or no access to computers or typewriters.
UPDATE: A similar ruling: Weimer v. Google, Inc., 2020 WL 6708206 (9th Cir. Nov. 16, 2020), holding that Section 230 protects a search engine from claims by a plaintiff who used it to find CSAM.