Amazon Can’t Force Arbitration of Minors’ Privacy Claims Based on Alexa Recordings–BF v. Amazon

This lawsuit alleges that Alexa improperly stores the voiceprints of minor users. The trial court declines to order arbitration. (It’s a magistrate judge’s recommendation, so it will go to the district judge for adoption or modification of the order.)

There is no dispute that the parents accepted Amazon’s terms of use, which apparently contain disclosures regarding the collection of personal information, including children’s information. Amazon implemented a consent process specifically for the collection of children’s information, which the parents have to accept. The children did not themselves agree to the terms of use.

The terms contain an arbitration clause and a class action waiver. The question is whether Amazon can enforce these against the minor plaintiffs. The court says no. Parties who have successfully compelled a non-signatory to arbitrate a claim have done so under varying versions of the “equitable estoppel” argument (under which the party resisting arbitration is estopped from relying on non-party status).

The first version of equitable estoppel arises where non-signatories to an arbitration agreement compel a signatory to arbitrate on the basis that the underlying arbitrable claims are closely intertwined with the claims asserted against the non-signatory. Amazon asked the court to extend this principle to the attempt by a signatory (Amazon) to enforce an arbitration clause against a non-signatory (the minor), but the court declines.

Under the second theory of equitable estoppel, a plaintiff who knowingly exploits an agreement containing an arbitration clause can be forced to arbitrate claims arising under it. Here, Amazon argued that the minors had enjoyed the benefits of Alexa, so they should be bound by the terms of the underlying agreement. However, the court says that this exception applies only where a non-signatory knowingly exploits an agreement and brings claims under the agreement. Where a non-signatory plaintiff receives the benefits of a product or service but brings non-contract (statutory or tort) claims against the signatory, the plaintiff cannot be forced to arbitrate. In this case, the claims are purely statutory, so the exception doesn’t apply. The court is troubled by the possible implications of Amazon’s argument:

The Court concludes that neither principles of equitable estoppel nor “ordinary contract and agency principles” supports the logic and policy advanced by Amazon in this case, i.e., that any individual can be bound to arbitrate simply because he or she directly benefited in some way from using the product or service obtained by another via the contract. As federal courts have noted, compelling any non-primary user of a particular service to arbitration under the theory of equitable estoppel would lead to absurd results, as even a casual visitor to a residence could be bound by an agreement without notice because any use of those services could constitute receipt of a direct benefit. . . . That is neither the law of Washington, nor the law of this Circuit.

The court also rejects Amazon’s proposed “account-sharing” theory of equitable estoppel. In Nicosia, the court held that a plaintiff who used his spouse’s “Amazon Mom” account and then asserted claims could be ordered to arbitrate. Because the plaintiff made a misrepresentation that allowed him to access the account and enjoy the contractual benefits, the court in that case held he should be bound by the arbitration clause. The court here finds that case distinguishable. Unlike in Nicosia and other cases where the plaintiff made a misrepresentation, Alexa can be freely used by anyone in close proximity to it. The plaintiffs were not “asked to verify their identity, provide a password, or take any other steps to impersonate a registered accountholder to use the Alexa-enabled devices.”

__

We’ve blogged about whether a terms of service agreement can be enforced against minors on several occasions. Those cases most often involve a minor’s right under state law to disaffirm a contract. Here, Amazon is dealing with whether a minor can be bound by a contract term, but the legal issue is more basic: how to enforce contracts relating to household devices when not all users are required to indicate assent to the agreement. While in this case the issue comes up in the context of claims asserted by minors, the same principles apply to possible claims asserted by neighbors or visitors. This case is a good illustration that courts readily rely on the fiction that internet users read and agree to terms of service agreements before using websites and apps, but when it comes to third-party users, legal doctrine can pose more of a hurdle to enforcing those terms.

Interestingly, the plaintiffs did not appear to focus on a minor’s ability to disaffirm, which is apparently a right under Washington law. As Eric notes, the court calls out Amazon for not including a term stating that parents agree to the terms on behalf of their children.

The court hints at the fact that Alexa does not take any measures to authenticate users or request credentials from them. Perhaps this is not technically easy to accomplish, or the cost outweighs the benefits, but one wonders whether it was a conscious decision on Amazon’s part to avoid adding a hurdle to using the product.

We blogged about the appeals court ruling in Nicosia (“Anarchy Has Ensued In Courts’ Handling of Online Contract Formation (Round Up Post)”), and it’s interesting to note that three years later, Amazon was able to convince the trial court to require arbitration. The latest Nicosia ruling somehow escaped our attention, but judging from this blog post touting its utility for e-commerce providers, it looks to be consequential.

For what it’s worth, the case is a long way from reaching the merits, but Washington’s privacy statute is fairly restrictive. See this post on Dillon v. Seattle Deposition Reporters for more on that.

NB: This is one of several lawsuits against Amazon alleging that it improperly recorded and stored voiceprints.

__

Eric’s Comments

This is a bad ruling. The legal question in this case isn’t whether all potential Alexa users in a person’s household are bound by the homeowner’s contract with Amazon; the question is whether dependent minors, who couldn’t have agreed to Amazon’s contract without their parents’ consent anyway, are bound. Amazon correctly argued that it would be weird to block the parents’ lawsuits per the contract but allow the parents to advance the exact same claims using their children as proxies. The court embraces the formality that the kids aren’t signatories, even though the parents would have been the signatories to any contract with the kids.

Fortunately (?), this ruling appears to be fixable. The court says “Amazon has not cited any authority holding that the parents could not, on behalf of their children, have expressly agreed by contract to submit their children’s claims to arbitration.” In other words, a few extra words in Amazon’s TOU, saying that it binds the signatory and any of their dependents (minors and anyone else in the household without the capacity to enter into enforceable contracts), would have been enough. Sadly, that fix will be too late for Amazon to negate this case, but it’s never too late for *you* to add such a provision to your TOS.

The ruling does highlight a deep underlying problem with automatic recording technology, which has become ubiquitous. According to this ruling, Amazon has no way of implementing Alexa without picking up recordings of people who aren’t the buyers or their kids. So those non-signatories will always have the capacity to sue Amazon for the nonconsensual recording, and the cumulative legal exposure of those claims essentially negates the possibility of Alexa existing at all. Perhaps that’s the law, but it’s deeply anti-technology and anti-innovation. It reminds me a little of the ways courts have been interpreting Illinois’ BIPA. That law makes it impossible to do automated mass face-scanning of photos without the consent of everyone depicted, which de facto means that all face-scanning is illegal even if many folks would consent.

__

Case citation: B.F. v. Amazon, No. 19-910-RAJ-MLP (W.D. Wash. Oct. 21, 2019)

Related posts:

Anarchy Has Ensued In Courts’ Handling of Online Contract Formation (Round Up Post)

Minors’ Suit Over Facebook Credits Survives in Part – I.B. v. Facebook

Facebook’s “Browsewrap” Enforced Against Kids–EKD v. Facebook

Parents’ Lawsuit Against Apple for In-App Purchases by Minor Children Moves Forward — In re Apple In-App Purchase Litigation

Clickthrough Agreement Binding Against Minors–A.V. v. iParadigms