Court Sends Google Assistant Privacy Lawsuit Back for a Redo
Wiretap Act: The court first rejects Google’s argument that any wiretapping is not intentional, saying:
interceptions may be considered intentional where a defendant is aware of the defect causing interception and takes no remedial action.
At the same time, the court says some de minimis error is to be expected and cannot transmute the false accepts into intentional interceptions. In addition, the court notes that plaintiffs are not only complaining about the recording itself, but also about subsequent use (for optimization) of those recordings.
The court next looks to whether the communications at issue are private. Plaintiffs pointed to a news report that false accepts captured sensitive conversations, such as those occurring in the bedroom and those with children. However, plaintiffs never alleged that Google captured any such conversations of their own. The court does not require plaintiffs to jump over a high bar:
the Court believes it would be enough for Plaintiffs to show that they frequently have oral communications near their respective [devices] under circumstances giving rise to a reasonable expectation of privacy. That coupled with the allegations that false accepts routinely occur, would support an inference that Plaintiffs had private conversations intercepted.
The court says this may be difficult for plaintiffs who allege that they use their devices in public spaces or on their mobile devices. However, even those who allege they interact with their devices at home did not make any allegations regarding the nature of the conversations in question.
Google then argued that the interceptions occurred, if at all, by a “device” being used in the ordinary course of business. This exception applies where the interception facilitates the transmission of the communication or is incidental to its transmission. The court says the interceptions certainly do not facilitate the transmission, but there is a factual question as to whether they are incidental.
Finally, plaintiffs also alleged that Google improperly “used” or “disclosed” improperly obtained communications. For liability to attach, the defendant must “know or have reason to know” that the communications were improperly obtained. The court says plaintiffs’ allegations suffice because they contend that Google uses a recording even after it realizes that “it has wrongly recorded a conversation.”
Stored Communications Act: The parties argued as to whether Google improperly accessed a “facility” and what that facility is (the devices, Google servers, etc.). The court agrees with Google that a plaintiff’s own personal devices are not covered (it must be some sort of third-party facility operated by the provider of a communications service). Google’s own servers also do not qualify as Google does not need permission to access these. The court dismisses this claim and expresses skepticism that the plaintiffs could adequately allege this element.
Plaintiffs also brought an unlawful disclosure claim under the SCA. They focused on Google’s use of the recordings for analytics and optimization purposes. The court says that Google’s own internal use is not actionable, but any disclosure to third parties—which plaintiffs allege occurs—requires a closer look. To the extent this occurs, Google says plaintiffs consented to it. Google pointed to a provision of the privacy policy that stated Google
provide[s] personal information to [its] affiliates and other trusted businesses or persons to process it for [Google] based on [a user’s] instructions and in compliance with [Google’s] privacy policy . . . .
The court says this is not sufficient to infer consent at the motion to dismiss stage. Google pointed to other language in its privacy policy to bolster its argument, but the court easily picks it apart as not being sufficiently clear about the disclosure.
State Privacy Statutes: Plaintiffs asserted claims under California Penal Code sections 631 (for wiretapping) and 632 (for eavesdropping). The court says that plaintiffs’ theories appear “incompatible” with a wiretapping claim. Plaintiffs allege that Google improperly listened in on plaintiffs’ household conversations, but these conversations occurred in person. With respect to the eavesdropping claim, the court rejects Google’s argument that any eavesdropping that occurred was not intentional, but it nevertheless says the claim is not adequately pled because the conversations in question are not alleged to be “confidential.”
Common Law and Constitutional Privacy: The court finds that plaintiffs fail to allege that they had an expectation of privacy in the conversations that were captured. The court also expresses skepticism that the allegations show the capturing was “highly offensive” or “serious.” However, the court stops short of saying that the conduct Google is accused of is not highly offensive as a matter of law:
although it is a close call, the Court believes that a reasonable person could find Defendants’ alleged conduct to be highly offensive.
The court says this is ultimately a fact-dependent inquiry, which cannot be performed at the motion to dismiss stage.
Breach of Contract: The breach of contract claim was premised on a statement in Google’s terms of service that Google would only share plaintiffs’ “personal information” with their consent. Plaintiffs also sought to marshal portions of statements on various Google Assistant-related websites (“Google Nest Help Center” and “Google Safety Center”), but the court says these are not sufficiently incorporated by the terms to constitute contractual promises.
The court picks apart plaintiffs’ breach argument. The specific term says that Google would share personal information in four different circumstances, one of which is with the user’s consent. The court says that plaintiffs have not sufficiently alleged their private conversations were recorded, so they fall short of alleging that whatever information was shared was plaintiffs’ “personal information.” Additionally, plaintiffs focused only on lack of consent and did not plead the absence of the other three circumstances.
The court also addresses plaintiffs’ damages theories. It rejects the benefit of the bargain theory, noting that Google Assistant is free to use on enabled devices. It says the second theory of damages—harm to their privacy interests—is “more promising.” Again, the court says the complaint suffers from the defect that it does not allege that any of plaintiffs’ private conversations were actually captured. The court finally says that the disgorgement theory—based on profits gleaned from exploitation of plaintiffs’ private information—is too vague. While certain cases have approved this theory in privacy cases, the allegations supporting such damages theories have to be much more specific.
Warranty Claims: The court says that plaintiffs fail to point to a specific guarantee made by Google. Plaintiffs relied on privacy assurances, but the court says these don’t specifically relate to the devices (the products at issue) but rather to the software. There is one statement that could be construed as a specific warranty, but the court says that even this is not unequivocal in stating that Google would not utilize “false accepts” (the warranty said Google would record when hotwords were detected, and impliedly not record when hotwords were not detected). Google also relied on disclaimers in certain warranty documents, but the court says that the effect of the disclaimers is unclear and the disclaimers are not properly in the record as undisputed contract terms.
Plaintiffs also brought implied warranty claims, but these are doomed by the disclaimers in the terms of service. Unlike the warranty documents, there was no dispute that plaintiffs were bound by the terms, and thus the disclaimer of warranties neutralizes any claim for implied warranties. Plaintiffs made a passing argument that the disclaimers were unconscionable, but the court rejects this argument as cursory.
UCL Claims: Plaintiffs had mixed success on standing for their UCL claims. They generally relied on an “overpayment” theory—i.e., that they would not have purchased the devices (or would have paid less) had they known of the alleged conduct. This theory worked for those plaintiffs who purchased Google devices, but several plaintiffs either did not purchase Google devices or (in the case of minors) were not interacting with a device that they purchased. Plaintiffs also relied on a restitution theory under which Google would ostensibly be required to disgorge any profits gleaned from monetizing plaintiffs’ personal information. The court rejects this argument as too speculative. (As mentioned in our blog post here, the Ninth Circuit recently took a broad view of unjust enrichment in a lawsuit against Facebook alleging improper tracking.)
Plaintiffs relied on the “unlawful” prong of the UCL, and the court says the UCL claim rises and falls with the underlying claims. In addition, plaintiffs relied on Family Code section 6701, the minor disaffirmance provision in California law. The court says plaintiffs failed to adequately articulate how section 6701 is implicated and dismisses this claim.
As to the “fraudulent” prong, the court finds that plaintiffs failed to state a fraud claim with sufficient specificity and dismisses the claim based on this prong.
Finally, the court examines the “unfair” prong. The court says that the cost/benefit of whether Google should have implemented additional measures to ensure that “false accepts” did not occur was not amenable to summary adjudication. However, the court nevertheless finds the claims deficient because plaintiffs failed to plead their own conversations were intercepted and were also subject to a reasonable expectation of privacy. The court says this means that plaintiffs have not pled sufficient harm for purposes of this prong of the UCL claim.
___
This is an interesting ruling that illustrates the range of privacy issues lurking behind voice-activated “assistants”. We blogged about the Alexa lawsuit where plaintiffs sued Amazon for capturing conversations involving minors without consent. That lawsuit is ongoing; most recently, the Article III judge adopted the magistrate’s recommendation rejecting Amazon’s request to send the claims to arbitration. Amazon has appealed that ruling and is trying to keep the lawsuit on hold while the appeal is resolved. This lawsuit involves conversations that were captured but that users did not intend to be captured. As Eric mentioned in the post about the Alexa ruling, voice-activated assistants have to deal with some inherent legal risk, both because they can capture the recordings of third parties (and minors) and because they will record when they are not directed to.
We should add state eavesdropping statutes to the list of privacy statutes that have the potential to be effective weapons in the hands of class counsel.
Did Google design its product in the most privacy-friendly way? That’s tough to glean from the court’s order. Presumably it should be able to use the mistakenly activated recordings to learn how to be more accurate in knowing when to record. On the other hand, if Google used such data for advertising or targeting (or any other purpose), that could be a cause for complaint. In any event, this is something users should be given control over and apprised of.
I was struck how there was no obvious answer from Google’s privacy policy on the question of whether this recording was authorized. Nor was there clear consent from the consumer. I’m probably stating the obvious, but the terms of use and privacy policies never seem to provide a clear answer in the edge privacy cases. (On a vaguely related note, I wondered why Google didn’t try to move this lawsuit into arbitration.)
As mentioned at the outset, while Judge Freeman sends this lawsuit back to the drawing board, she signals that at least several of the claims will likely survive.
Case citation: In re Google Assistant Privacy Litigation, No. 19-cv-0428-BLF (N.D. Cal. May 6, 2020) [pdf].
Related posts:
Amazon Can’t Force Arbitration of Minors’ Privacy Claims Based on Alexa Recordings–BF v. Amazon