Shutterfly Can’t Shake Face-Scanning Privacy Lawsuit

A putative class action accuses Shutterfly of violating the Illinois Biometric Information Privacy Act. The plaintiff, who is not a Shutterfly user, alleges that, when an identified Shutterfly user uploaded pictures of the plaintiff, Shutterfly created a “face scan” from the photo and suggested that the user “tag” later-uploaded photos depicting the plaintiff. The complaint, which pieces together Shutterfly’s face-scan program from information disseminated by the company (e.g., presentations, patents, and patent applications), alleges:

Defendants have created, collected and stored millions of “face templates” (or “face prints”) – highly detailed geometric maps of the face – from millions of individuals, many thousands of whom are non Shutterfly users residing in the State of Illinois.
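
For readers unfamiliar with the underlying technology, below is a minimal, hypothetical sketch of how a face-template tag-suggestion pipeline of this general sort can work. It is not drawn from Shutterfly’s filings or patents; the embed_face stub, the 128-dimension template size, and the match threshold are all illustrative assumptions standing in for a real trained embedding model.

```python
# Hypothetical sketch of a face-template ("faceprint") tag-suggestion flow.
# embed_face() stands in for a real face-embedding model (e.g., a CNN that
# maps a cropped face to a fixed-length vector); it is NOT Shutterfly's code.
from typing import Dict, Optional

import numpy as np

EMBEDDING_DIM = 128      # assumed template size; real systems vary
MATCH_THRESHOLD = 0.9    # assumed cosine similarity needed to suggest a tag


def embed_face(face_pixels: np.ndarray) -> np.ndarray:
    """Placeholder embedding: deterministic within one process.

    A production system would run a trained model here to produce the
    "highly detailed geometric map of the face" the complaint describes.
    """
    seed = abs(hash(face_pixels.tobytes())) % (2 ** 32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(EMBEDDING_DIM)
    return v / np.linalg.norm(v)  # unit-length "face template"


def suggest_tag(new_face: np.ndarray,
                templates: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the stored identity whose template best matches, if any."""
    query = embed_face(new_face)
    best_name, best_sim = None, MATCH_THRESHOLD
    for name, template in templates.items():
        sim = float(query @ template)  # cosine similarity of unit vectors
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name


# Usage: one uploaded photo creates a stored template for a non-user...
photo = np.zeros((64, 64), dtype=np.uint8)   # stand-in for face pixels
templates = {"plaintiff": embed_face(photo)}
# ...and every later upload is compared against it.
print(suggest_tag(photo, templates))         # -> "plaintiff"
```

The step the complaint emphasizes is the one in the middle: once a template is extracted and stored, every later upload can be matched against it, whether or not the depicted person is a user.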

Shutterfly argued, fairly straightforwardly, that the statute expressly excludes photos, as well as information derived from excluded items or processes. It also argued that the statute was meant to address risks different in kind from those presented by a database like the one Shutterfly created.

The court rejects this argument. It notes that there is a dearth of cases construing the statute. The court recaps the statute’s definitions, including its exclusions. (It does not mention the provision that excludes information “derived from items or procedures excluded under the definition of biometric identifiers [which include photos]”.) The court says a claim is plausible:

Here, Plaintiff alleges that Defendants are using his personal face pattern to recognize and identify Plaintiff in photographs posted to Websites. Plaintiff avers that he is not now nor has he ever been a user of Websites, and that he was not presented with a written biometrics policy nor has he consented to have his biometric identifiers used by Defendants. As a result, the Court finds that Plaintiff has plausibly stated a claim for relief under the BIPA.

__

The case has some interesting parallels with the Redbox VPPA failure-to-purge cases (in that plaintiff targets the aggregation of information regardless of any improper use) and the Gmail scanning cases (a non-user asserts a privacy-based claim).

After reading the statute, it appears that the court simply got it wrong. Photographs are clearly excluded, as is information “derived” from photographs. Maybe the court thought something different was going on here, but plaintiff’s claims seem clearly excluded from the reach of the statute. Perhaps the court was just being overly cautious. Plaintiff’s argument is that while photographs are excluded from the definition of biometric identifiers, “face geometry” is not. But it would be awkward drafting at best for the legislature to have intended to cover information derived from photos while indicating that intent only through an ambiguous omission from the definition of biometric identifier. (Sidenote: the statute appears to prohibit collection of this information in any circumstance, even, for example, in a public setting. I wonder whether this or the broad transfer restrictions pose a First Amendment issue at the fringe.)

It’s also noteworthy that the plaintiff did not allege any harm in the complaint (beyond the statutory violation). This is not exactly surprising for a case based on a privacy statute, but it’s still worth flagging.

A similar case against Facebook was transferred to the Northern District of California, where it remains pending.

Eric’s comments:

1) As Venkat notes, there are some parallels between this case and the Gmail/Yahoo email-scanning-for-ads cases. In all of them, the service provider had contractual privity with its users and arguably obtained (or at least had the capacity to obtain) sufficient consent for the scanning. However, user consent cannot pick up third parties who aren’t users, and there is really no way for the service provider to obtain this missing consent. It’s a good reminder that if your solution to a perceived problem is to get sufficient consent, make sure you’ve mapped out a good privity argument. Otherwise, under the plaintiff’s legal theory, there is absolutely no way to do facial recognition on a corpus of photographs, because the corpus will always depict third-party non-users.

2) I understand the concerns about facial recognition. Once a faceprint is captured, it is permanent and virtually immutable. Still, the Illinois law reads like the kind of retrogressive, paranoid legislative overreaction that exasperates Law and Technology scholars, who know that regulating the technology itself (rather than how it’s used) at early stages never really works. As I’ve written elsewhere, the real privacy issue isn’t the collection of faceprints; it’s the potential misuse. Even more fundamentally, the law seems to assume that companies will use biometrics stupidly (i.e., use weak security protocols for important authentications). While there will always be dumb implementations, this paranoia reflects the lack of experience with biometrics at the time of the statute’s passage. (For example, two-factor authentication would obviate some of the statute’s concerns, as sketched below.)
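
To illustrate the two-factor point, here is a generic sketch of my own (not anything prescribed by the statute or used by any particular company): even if an attacker obtains a victim’s immutable faceprint, authentication that also requires a rotating one-time code still fails. The TOTP routine below follows RFC 6238; the faceprint_matches flag is a placeholder for a real biometric comparison.

```python
# Generic two-factor sketch: a leaked faceprint alone cannot authenticate
# when a rotating TOTP code (RFC 6238) is also required. Illustrative only.
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, digits: int = 6, step: int = 30) -> str:
    """Compute the current RFC 6238 time-based one-time password."""
    counter = int(time.time() // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0]
            & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)


def authenticate(faceprint_matches: bool, submitted_code: str,
                 secret: bytes) -> bool:
    """Both factors must pass; a stolen faceprint fails without the code."""
    return faceprint_matches and hmac.compare_digest(submitted_code,
                                                     totp(secret))


secret = b"per-user-shared-secret"
# Attacker has the (immutable) faceprint but not the rotating code:
print(authenticate(True, "000000", secret))        # almost surely False
# Legitimate user supplies both factors:
print(authenticate(True, totp(secret), secret))    # True
```

Because the code rotates every 30 seconds and is derived from a secret the attacker lacks, the leaked biometric does not by itself defeat authentication.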

3) With respect to regulating bad technology vs. regulating bad uses of technology, how is Shutterfly’s facial recognition a “misuse” of the information? As Venkat notes, there are no allegations of harm suffered by the plaintiff, nor do I see how (given Shutterfly’s innocuous activity) any legally cognizable harm could be alleged. So in my mind, this is yet another “privacy” lawsuit that’s anti-technology and not about redressing any consumer harm. Yay?

4) We’ve been sorting through thousands of photos in my mom’s archives, and there are many people depicted in the old photos that we don’t recognize–and in many cases the people who could recognize them are also now dead, so in all likelihood these folks will remain forever unidentified. What a gift it would be to have effective facial recognition for these photos. At minimum, we could share these photos with the appropriate friends or family so they could have a more complete archive of their families. Sadly, without facial recognition, those matches won’t be made.

Case citation: Norberg v. Shutterfly, 15-cv-05351 (N.D. Ill. Dec. 29, 2015). The complaint.

Related posts:

Court Says Plaintiff Lacks Standing to Pursue Failure-to-Purge Claim Under the VPPA – Sterk v. Best Buy

Seventh Circuit: No Private Cause of Action Under the Video Privacy Protection Act for Failure to Purge Information–Sterk v. Redbox

Redbox Can be Liable Under the Video Privacy Protection Act for Failure to Purge Video Rental Records — Sterk v. Redbox

Disney Not Liable For Disclosing Device IDs And Viewing Habits

App Users Aren’t “Subscribers” Under the VPPA–Ellis v. Cartoon Network

Ninth Circuit Rejects Video Privacy Protection Act Claims Against Sony

AARP Defeats Lawsuit for Sharing Information With Facebook and Adobe

9th Circuit Rejects VPPA Claims Against Netflix For Intra-Household Disclosures

Lawsuit Fails Over Ridesharing Service’s Disclosures To Its Analytics Service–Garcia v. Zimride

Minors’ Privacy Claims Against Viacom and Google Over Disclosure of Video Viewing Habits Dismissed

Is Sacramento The World’s Capital of Internet Privacy Regulation? (Forbes Cross-Post)

Hulu Unable to Shake Video Privacy Protection Act Claims

California Assembly Hearing, “Balancing Privacy and Opportunity in the Internet Age,” SCU, Dec. 12

It’s Illegal For Offline Retailers To Collect Email Addresses–Capp v. Nordstrom

California Supreme Court: Retail Privacy Statute Doesn’t Apply to Download Transactions – Apple v Superior Court (Krescent)

CA Court Confirms that Pineda v Williams-Sonoma (the Zip-Code-as-PII Case) Applies Retrospectively — Dardarian v. OfficeMax

California Supreme Court Rules That a ZIP Code is Personal Identification Information — Pineda v. Williams-Sonoma