Augmented Reality Filters May Violate Privacy Law–Hartman v. Meta

This case involves augmented reality (AR) effects/“filters” that people can use to doctor up images and videos on social media, such as the ability to add virtual bunny ears, flower crowns, or cat whiskers to people in the image or video. The tsunami of anti-AI laws (and other laws against synthetic content) puts all of those AR filters in jeopardy–after all, they produce fake algorithmically-generated images. But before we get there, AR filters are in jeopardy due to an old-school privacy law, the Illinois Biometric Information Privacy Act (BIPA).

BIPA protects consumers’ biometrics, including information derived from face scans such as “face geometry.” In order to add AR effects and filters to images and video, the software must compute the depicted individuals’ face geometry (i.e., the software needs to know where to put the bunny ears or cat whiskers). To some privacy lawyers, this is a prima facie violation of BIPA.

A court rejected Facebook’s motion to dismiss the BIPA claim for the AR filters used in Facebook Messenger and Facebook Messenger Kids.

Uniquely Identifiable Data

The complaint “sufficiently alleges that Meta scanned users’ face geometries and that these scans are capable of identifying the people from whom they were taken”:

the point of this process is to allow users to superimpose filters and effects like bunny ears or cat whiskers on their face. To do so effectively, the bunny ears or cat whiskers would have to appear in a location that creates a plausible appearance. If the filters and effects were applied based on a generic face template that included an oval shape to convey a facial structure, and general outlines of ears, nose, and mouth, the filters and effects could, and often would, create an odd appearance. Bunny ears could appear on the user’s forehead or be superimposed in a location that is not connected to the face at all. The technology would have little entertainment or commercial value if it applied these effects in such a non-personalized manner….

Scanning a person’s face to identify the locations of its constituent parts, including eyes, nose, mouth, and ears, creates a geometric representation that is unique to that person. Thus, an “estimation of the location of parts of users’ faces” based on a scan of their face is intrinsically unique and could plausibly be used to identify them.
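The court's point is that landmark-based placement is inherently face-specific: the overlay must be anchored to the measured geometry of each depicted face, not a generic template. A minimal sketch of that idea (the landmark names, coordinates, and function are hypothetical illustrations, not Meta's actual pipeline):

```python
# Hypothetical sketch: placing a "bunny ears" overlay using per-face
# landmark coordinates produced by a face-geometry scan, rather than a
# one-size-fits-all template. All names and values are illustrative.

def place_bunny_ears(landmarks):
    """Return an (x, y) anchor point for a bunny-ears overlay.

    `landmarks` maps feature names to (x, y) pixel coordinates
    measured from a specific face in the image.
    """
    left_eye = landmarks["left_eye"]
    right_eye = landmarks["right_eye"]
    chin = landmarks["chin"]

    # Midpoint of the eye line for this particular face.
    mid_x = (left_eye[0] + right_eye[0]) / 2
    mid_y = (left_eye[1] + right_eye[1]) / 2

    # Estimate face height (eye line to chin), then offset the overlay
    # that same distance above the eyes so it sits atop the head.
    face_height = chin[1] - mid_y
    return (mid_x, mid_y - face_height)

anchor = place_bunny_ears(
    {"left_eye": (100, 120), "right_eye": (140, 120), "chin": (120, 200)}
)
print(anchor)  # (120.0, 40.0)
```

Because the output depends entirely on the measured landmark positions of the specific face, this is the kind of per-person "face geometry" computation the complaint alleges falls within BIPA.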

COPPA Preemption

COPPA lacks a private right of action, while BIPA has one. The court concludes that COPPA’s preemption provision nevertheless doesn’t apply to BIPA:

BIPA’s subject matter is almost entirely distinct from that of COPPA. BIPA, as noted, regulates “biometric identifiers” and “biometric information.”… COPPA, on the other hand, regulates “personal information,” in the form of data-based identifiers….The thematic difference between biology-based and data-based identifying information should be apparent. And considering these distinct regulatory targets, there is no basis to conclude that BIPA’s requirements are “inconsistent” with COPPA’s

Implications

Facebook may still win this case based on BIPA’s various technicalities. However, I’m mostly flummoxed by the weaponization of BIPA against augmented reality, especially tools like AR filters and effects that bring a great deal of joy to people’s lives. If the BIPA challenge against augmented reality succeeds, it will be yet another reason to suspect that BIPA represents a regulatory intervention that occurred too early in the technology development cycle, before its applications were clear. A successful suit here will also be more evidence that BIPA is hindering the development of socially beneficial tools and services. For example, I remain irritated about the BIPA lawsuits trying to prevent Internet services from using anti-CSAM blocklists and face scans for age authentication purposes.

Case Citation: Hartman v. Meta Platforms, Inc., 2024 U.S. Dist. LEXIS 167696 (S.D. Ill. Sept. 17, 2024). The complaint.
