Google Photos Defeats Privacy Lawsuit Over Face Scans–Rivera v. Google

This case provides a glimpse into the legacy of the Supreme Court’s Spokeo decision on the injury-in-fact requirements for Article III standing in federal court. When it was issued, I called Spokeo a “jurisprudential clusterfuck.” Indeed, the subsequent caselaw has been messy–including this case!

Google Photos automatically creates “face templates” from uploaded photos and then lets the uploading user sort the photos by face. The court assumes, without deciding, that the face templates and associated data constitute biometric information pursuant to the Illinois Biometric Information Privacy Act (BIPA). Users can opt out of this feature, but the court assumes, without deciding, that users (and anyone else depicted in the photos) did not consent to the feature. Google claims it only uses the face templates for the sorting feature. There is no evidence Google has disclosed the face templates, voluntarily or involuntarily, to any third parties; that Google uses the data for data mining or ad targeting purposes; or that Google links the data with other data in its possession. (Nevertheless, it’s easy to be skeptical of Google’s claim that it only uses this data for a single purpose given Google’s manifold data usage practices in virtually every other aspect of its business.) The plaintiffs “testified that they did not suffer any financial, physical, or emotional injury apart from feeling offended by the unauthorized collection.” Despite that, the plaintiffs sued Google for BIPA and other claims. The court dismisses the BIPA claim for lack of Article III standing.
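(For readers curious about the underlying mechanics: Google hasn’t disclosed how its face templates work, but in general a face “template” is a compact numeric embedding of face geometry. Here is a minimal sketch of how a photo service could build templates and group photos by face, using the open-source face_recognition library as a stand-in for Google’s proprietary model; the file names, function names, and 0.6 distance threshold are illustrative assumptions, not anything from the opinion or from Google.)

```python
# Illustrative sketch only -- Google's pipeline is proprietary. This uses
# the open-source face_recognition library (pip install face_recognition)
# as a stand-in model.
import face_recognition
import numpy as np

def build_templates(photo_paths):
    """Compute a numeric "face template" (a 128-dimension embedding) for
    every face found in each uploaded photo."""
    templates = []  # list of (photo_path, embedding) pairs
    for path in photo_paths:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            templates.append((path, encoding))
    return templates

def group_by_face(templates, tolerance=0.6):
    """Greedily cluster templates so photos of the same person sort
    together. The 0.6 Euclidean-distance tolerance is the library's
    conventional default, not a known Google parameter."""
    groups = []  # each group: {"centroid": embedding, "photos": [paths]}
    for path, encoding in templates:
        for group in groups:
            if np.linalg.norm(group["centroid"] - encoding) <= tolerance:
                group["photos"].append(path)
                break
        else:  # no existing group matched; start a new one
            groups.append({"centroid": encoding, "photos": [path]})
    return groups

if __name__ == "__main__":
    # Hypothetical uploads, for illustration.
    templates = build_templates(["photo1.jpg", "photo2.jpg", "photo3.jpg"])
    for i, group in enumerate(group_by_face(templates)):
        print(f"Face group {i}: {group['photos']}")
```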

Data Retention. The plaintiffs allege that Google didn’t follow BIPA’s requirements for storing and retaining the face templates. Unfortunately for them, in Gubala v. Time Warner Cable, the “Seventh Circuit has definitively held that retention of an individual’s private information, on its own, is not a concrete injury sufficient to satisfy Article III.” That precedent pretty much ends the discussion. The plaintiffs argued that Google had been hacked repeatedly and had deliberately not publicized those hacks, but the court ignores those incidents because they didn’t relate to the face templates or associated data. However, the court does caution Google that “if Google is aware of any bug or data breach to any Google Photos API or Google Photos itself, it should have already reported them to Plaintiffs (as supplemental discovery) and to the Court (in a supplemental filing), and must do so immediately if a Google Photos breach occurred.”

Face Template Collection. Google’s generation of the face templates isn’t governed by the Gubala precedent. Instead, this case resembles Patel v. Facebook, a BIPA case against Facebook. In February, a Northern District of California ruling found that the Patel plaintiffs had Article III standing. This opinion painstakingly distinguishes the Patel precedent to justify what appears to be a conflicting outcome.

The court summarizes the Patel ruling before picking it apart:

Patel’s holding stands on two pillars: the risk of identity theft arising from the permanency of biometric information, as described by the Illinois legislature, and the absence of in-advance consent to Facebook’s collection of the information.

First, the court says there’s not a substantial risk of identity theft due to Google’s face templates even though people can’t change their faces. The court says the immutability “does not justify an across-the-board conclusion that all cases involving any private entity that collects or retains individuals’ biometric data present a sufficient risk of disclosure that concrete injury has been satisfied in every case.”

Second, the Illinois legislature specified that the harms of unconsented face template collection include identity theft and some other generic concerns. The court says this isn’t good enough given that faces are “public” information:

Most people expose their faces to the general public every day, so one’s face is even more widely public than non-biometric information like a social security number. Indeed, we expose our faces to the public such that no additional intrusion into our privacy is required to obtain a likeness of it, unlike the physical placement of a finger on a scanner or other object, or the exposure of a sub-surface part of the body like a retina. There is nothing in the Act’s legislative findings that would explain why the injury suffered by Plaintiffs here—the unconsented creation of face templates—is concrete enough for Article III purposes.

In a footnote, the court says the legislature could specify other harms, and technological evolution might mean that “private entities are able to use the technology to pinpoint where people have been over extended time periods.”

The court also discusses the similarities between the common law privacy torts and the harms in this case (a technique suggested by Spokeo). The court says that false light and public disclosure of private facts don’t apply, so it only considers the other two torts:

  • Intrusion into seclusion. Google Photos doesn’t commit an intrusion. Users voluntarily upload the photos to Google. For non-users, faces–as distinguished from facial biometrics–are publicly disclosed and do not get a reasonable expectation of privacy per Fourth Amendment jurisprudence. Also, creating face templates isn’t “highly offensive” because “the templates are based on something that is visible to the ordinary eye, that is, Plaintiffs’ faces.” Furthermore, Google didn’t commercially exploit the faces. In a footnote, the court adds: “Plaintiffs’ argument that the creation of face templates is similar to ‘restaurants [] dust[ing] their customers’ glasses for fingerprints and stockpil[ing] those identifiers,’ is misplaced. Fingerprints are not held out to the public like faces, which are visible to the ordinary eye. Applying a template to a face on a voluntarily uploaded photograph is very different from collecting the tiny physical remnants left by ridges on a person’s fingers.”
  • “Appropriation” (a/k/a publicity rights). Google doesn’t currently commercialize the face templates. The plaintiffs argued it could do so in the future, but the court says that’s too speculative.

In a footnote, the court summarily rejects the analogy to a Northern District of Illinois ruling in a similar case, Monroy v. Shutterfly.

Accordingly, the court says that the injuries from Google’s (assumed-to-be-unconsented) face template collection are not sufficiently similar to those recognized by the common law to satisfy Spokeo’s standards.

Thus, the plaintiffs lack any injury-in-fact because they suffered neither the harms specified by the legislature nor harms sufficiently analogous to the common law privacy torts, so the court dismisses the BIPA claim for lack of Article III standing.

In a footnote, the judge suggests that BIPA, passed over a decade ago and written in broad strokes, may have overreached: “The difficulty in predicting technological advances and their legal effects is one reason why legislative pronouncements with minimum statutory damages and fee-shifting might reasonably be considered a too-blunt instrument for dealing with technology. Of course, there might be policy considerations that weigh in favor of taking the broader approach.”

Comments

What Went Right for Google? Google made limited use of its face templates–they only helped users privately sort their own photos. This contrasts with, say, Facebook, which “tags” people based on their faces, thus disclosing information to third parties. Perhaps this case sufficiently diverges from the precedents on its facts, or perhaps it suggests that BIPA has an implied “private use” exception (at least in federal court).

Having said that, Google got a lucky draw with this judge (Edmond E. Chang). Most judges would have followed the quite similar rulings involving Facebook and Shutterfly, plus clearly this judge was more skeptical of BIPA than other judges have been. Don’t get me wrong–I think this opinion reaches an appropriate outcome where the plaintiffs’ only harm is that they feel offended. However, it takes a bold judge to forge an independent path like this.

Are Faces “Public”? Efforts to distinguish “public” from “private” information are always imperfect, and the discussion about faces being “public” feels particularly awkward. The opinion sidestepped some deep and messy philosophical questions lurking in that categorization.

First, faces sometimes are public and sometimes not; it depends on the context. A private sex tape that depicts the participant’s face is unquestionably protected by privacy law, while the same tape that blurs out or omits the face may not be (if there aren’t other unique identifiers). A face is neither public nor private; it depends on how the face is connected to the surrounding information. This public/private distinction also ignores different cultural practices, such as women who wear niqabs or burqas in public but take them off in private settings.

Second, the case involves the face template, not the face, and that may be a significant difference. The human eye is pretty good at detecting face geometry, but machines can quantify that information much more precisely and in a more portable format. I could understand why a legislature would restrict the automated quantification of face geometry even if the faces themselves are “public” information. (Enforceability of such a restriction is a separate issue).
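To make that point concrete: once a face is reduced to a template, deciding whether two photos show the same person becomes a single arithmetic operation on a compact vector that can be stored, copied, and compared at scale. A hedged sketch (the 128-dimension size and 0.6 threshold follow the open-source face_recognition library’s conventions, not any known Google internals):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in 128-dimension embeddings; real templates come from a trained
# face-recognition model, but the comparison logic is the same.
person_a = rng.normal(size=128)
person_a_again = person_a + rng.normal(scale=0.01, size=128)  # same face, new photo
person_b = rng.normal(size=128)  # a different face

def same_person(t1, t2, tolerance=0.6):
    # Matching reduces to one distance computation -- precise, repeatable,
    # and portable in a way that eyeballing a face is not.
    return np.linalg.norm(t1 - t2) <= tolerance

print(same_person(person_a, person_a_again))  # True (distance is ~0.1)
print(same_person(person_a, person_b))        # False (distance is ~16)
```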

All told, it’s a little glib to say that we publicly disclose our faces routinely and therefore they don’t receive the same privacy protection as fingerprints or retinas. I see why the judge made this distinction, but it creates some rough edges that other courts might find hard to accept.

Case citation: Rivera v. Google, Inc., 2018 WL 6830332 (N.D. Ill. Dec. 29, 2018).

Related Posts:

Illinois Users’ Face-Scanning Privacy Lawsuit Against Facebook Headed to Trial
Privacy Plaintiffs Lack Standing Against NBA 2K15’s Face-Scanning Technology
Face Scanning Lawsuit Against Shutterfly Survives Motion to Dismiss
Facebook Gets Bad Ruling In Face-Scanning Privacy Case–In re Facebook Biometric Information Privacy Litigation
Shutterfly Can’t Shake Face-Scanning Privacy Lawsuit