Will Biometric Privacy Laws Undermine the Fight Against CSAM?–Martell v. X

This lawsuit involves the widely used PhotoDNA database, a cornerstone of the fight against online child sexual abuse material (CSAM). PhotoDNA computes hash values of identified CSAM items and then enables services to block any image with a matching hash value. In other words, once one PhotoDNA participant identifies a CSAM item, all participating services can prevent further copies of that image from spreading. PhotoDNA isn't a complete tool in the fight against CSAM. For example, it doesn't detect edited versions of known images, which generate different hash values, and it can't block a CSAM item that has never been identified before. Nevertheless, I shudder to think about how CSAM would proliferate without PhotoDNA.
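To make the mechanic concrete, here is a minimal sketch of hash-blocklist matching. It is not PhotoDNA's actual algorithm, which is a proprietary perceptual hash; the SHA-256 call below is only a stand-in, and every name in the snippet is hypothetical:

```python
import hashlib

# Hypothetical blocklist of hash values for previously identified CSAM.
# In reality the hashes would come from the shared PhotoDNA database;
# this set, and every name below, is illustrative only.
KNOWN_BAD_HASHES: set[str] = set()

def image_hash(image_bytes: bytes) -> str:
    """Hash a photo's raw bytes.

    PhotoDNA's real algorithm is a proprietary perceptual hash; SHA-256
    stands in here to show the matching mechanic, not the algorithm.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_upload(image_bytes: bytes) -> bool:
    """Block an upload only if its hash matches a previously flagged item."""
    return image_hash(image_bytes) in KNOWN_BAD_HASHES
```

The sketch also illustrates the limitations just mentioned: an edited copy produces different bytes and therefore a different hash, so it slips through, and an image that has never been identified can't appear in the blocklist in the first place.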

Given the scourge of CSAM and the importance of deploying a robust set of tools to combat it, it's absolutely shocking to see a lawsuit that, if successful, would make it impossible for services to use PhotoDNA in their fight against online CSAM. I cannot wrap my head around the plaintiffs' lawyers' decision to bring this lawsuit.

The plaintiffs claim that making hash values of photos violates the Illinois Biometric Information Privacy Act (BIPA), which restricts collecting biometric identifiers such as scans of face geometry. The theory: services using PhotoDNA (like X/Twitter) must hash all photos in their databases–including images showing people's faces–to check whether those photos match a hash value in the PhotoDNA database, and hashing a photo, the plaintiffs argue, necessarily calculates the photo subjects' face geometries. That doesn't make any sense technologically, and fortunately it doesn't make any sense legally either (for now).
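To see why, consider what a hashing step of this kind actually consumes. In the sketch below (same assumptions as above: hypothetical names, SHA-256 standing in for PhotoDNA's proprietary perceptual hash), a portrait and a face-free landscape travel through the identical code path; nothing detects a face, let alone measures its geometry:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Same stand-in hash as the earlier sketch: it consumes the photo as
    # an opaque byte stream, with no face detection step anywhere.
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in byte strings; real inputs would be JPEG/PNG data.
portrait = b"bytes of a photo showing someone's face"
landscape = b"bytes of a photo with no face in it at all"

print(image_hash(portrait))   # 64 hex chars derived from every byte equally
print(image_hash(landscape))  # same format, same code path, no biometric data
```

In this stand-in, the digest changes just as much if you alter a pixel of sky as a pixel of someone's chin; no facial measurement is computed or recorded anywhere.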

The court indicates that it’s not a BIPA violation simply because services scan photos that contain faces and create hash values (“Plaintiff must allege that the PhotoDNA scanned individuals’ face geometry and not just that it scanned a photo.”). The plaintiff pointed to the hash values as evidence of face scanning, to which the court basically says “what?” (“The fact that PhotoDNA creates a unique hash for each photo does not necessarily imply that it is scanning for an individual’s facial geometry when creating the hash…If the scan merely compares the image to see if it is the same as other images, that does not imply the use of facial geometry”).

The court also says that the PhotoDNA hash values don’t uniquely identify individuals (“Without allegations that PhotoDNA uses facial geometry to identify individuals, Plaintiff failed to allege that the hashes are biometric identifiers….Plaintiff does not allege that any details of an individual’s face are measured or recorded during the PhotoDNA scan or that those records were used to identify individuals.”).

Twitter also invoked the Section 230(c)(2)(A) defense, which says that services aren't liable for their good-faith filtering decisions. The court sidesteps this defense on Twitter's motion to dismiss because 230(c)(2)(A)'s good faith requirement raises factual questions that typically can't be resolved at the pleading stage. This highlights how Section 230(c)(2)(A) is often useless on motions to dismiss, which is why it has relatively low prominence in Section 230 jurisprudence.

The court dismisses the complaint with leave to amend.

* * *

I know most readers are familiar with BIPA, but if this is your first introduction to it–yes, many BIPA lawsuits are this tendentious and myopic. BIPA took a good policy idea (giving consumers opt-in power over their biometric information) and turned it into a favorite tool of the plaintiffs' bar. That dynamic produces shocking and venal lawsuits like this one, or the one claiming that BIPA blocks standard age authentication technology (even when such deployments are required by law). To me, BIPA is a prime example of a law passed too early in the technology development cycle, before the full range of biometric use cases became clear. (It doesn't help that BIPA is drafted overbroadly and overrelies on private enforcement.) As a result, BIPA now potentially clashes with important and socially beneficial new technologies. Lawsuits like this one certainly do little to enhance BIPA's reputation or the reputation of those who enforce it.

The stakes of this lawsuit are extraordinarily high. Congress and other regulators already assert that Internet services don't care enough about fighting CSAM and that, as a result, CSAM is prevalent on their services. I question the factual predicates of those claims. Nevertheless, if PhotoDNA becomes unavailable, it will disrupt anti-CSAM countermeasures that services widely rely upon. In that circumstance, CSAM likely will become more prevalent, and that will add more fuel to the regulators' fire. Regulators won't care why the ecosystem changed (i.e., that an overbroad privacy law undercut anti-CSAM tools). Instead, in response to their heightened concerns about CSAM, regulators will adopt draconian measures guaranteed to reconfigure the Internet in profound and structural ways. That is another good reason to hope this lawsuit fails.

While I have repeatedly bashed Twitter’s legal positions in the Musk era, it still fights and wins some important cases with industry-wide stakes, like this one. 🙏

Case Citation: Martell v. X Corp., 2024 U.S. Dist. LEXIS 105610 (N.D. Ill. June 13, 2024). The complaint.
