Does Yelp Have The ‘Most Trusted Reviews’? A Court Wants To Know More (Forbes Cross-Post)

Few online algorithms generate as much criticism as Yelp’s algorithm for filtering its users’ reviews, but Yelp has so far successfully avoided a serious legal challenge to its filter. Recently, a California appellate court green-lighted a lawsuit over how Yelp publicly describes its user reviews. This means a judge may look more closely into the prevalence of fake reviews on Yelp and the operation of Yelp’s filtering algorithm.

What Happened

James Demetriades owns the restaurants Jimmy’s Taverna, Rafters, and Red Lantern in Mammoth Lakes, California. He didn’t like the way Yelp’s review filter handled users’ reviews of his restaurants. For example, he alleged that the Yelp filter screened out nearly half of the reviews of one of his restaurants, yet it didn’t filter a review that allegedly contained false statements.

Based on a law Congress enacted in 1996, 47 USC 230 (Section 230), Yelp isn’t legally liable for its users’ reviews, even reviews that Yelp knows are fake. Section 230 also makes clear that Yelp isn’t liable for filtering reviews it deems objectionable, even if its filter makes mistakes. To avoid Section 230, Demetriades instead sued over Yelp’s marketing descriptions of its filter algorithm, including the following statements:

1. “Yelp uses a filter to give consumers the most trusted reviews”;
2. “All reviews that live on people’s profile pages go through a remarkable filtering process that takes the reviews that are the most trustworthy and from the most established sources and displays them on the business page. This keeps the less trustworthy reviews out so that when it comes time to make a decision you can make that [decision] using information and insights that are actually helpful”;
3. “Rest assured that our engineers are working to make sure that whatever is up there is the most unbiased and accurate information you will be able to find about local businesses”;
4. “Yelp is always working to do as good a job as possible on a very complicated task—only showing the most trustworthy and useful content out there”;
5. “Yelp has an automated filter that suppresses a small portion of reviews—it targets those suspicious ones you see on other sites.”


To me, this language sounds like typical marketing hype, or puffery, which isn’t legally actionable. The appellate court disagreed, concluding that Yelp made “specific and detailed statements intended to induce reliance” and spoke “with the authority of a website that intends to attract users with the accuracy of its filter.”

The lower court rejected Demetriades’ lawsuit as a “SLAPP,” i.e., a lawsuit designed to suppress socially beneficial speech. California has a strong anti-SLAPP law that weeds out bad lawsuits early; it acts like a fast lane for ejecting cases we don’t want in the court system. The appellate court held Demetriades’ lawsuit wasn’t a SLAPP because Yelp’s statements about its review filter fit into a statutory exception for “commercial speech,” such as advertising. The court said:

Yelp’s statements about its review filter—as opposed to the content of the reviews themselves—are commercial speech about the quality of its product (the reliability of its review filter) intended to reach third parties to induce them to engage in a commercial transaction (patronizing Yelp’s website, which patronage induces businesses on Yelp to purchase advertising).

The appellate ruling doesn’t mean that Yelp will be liable for its descriptions of its review filter. The appellate court simply revived the case and sent it back to the lower court to begin a normal judicial process. Still, this is a dispiriting loss for Yelp. It gives more fuel to Yelp’s algorithm-haters and revives a lawsuit that poses some risk to Yelp.

Yelp sent me the following statement:

It’s important to note that the court did not classify Yelp’s use of its recommendation software as “commercial speech”–indeed, the court reaffirmed that Yelp’s actions to protect consumers through the use of automated recommendation software were not impacted by its decision. Yelp’s use of its recommendation software is protected under federal law, as has been repeatedly affirmed by federal and state courts nationwide. This opinion does nothing to impact those decisions.

Rather, the court analyzed five statements that Yelp made about its software, as part of a longer video previously posted on Yelp. While the court did not determine that any of these statements was a misrepresentation, it did find that the statements fell within the commercial speech exception to the anti-SLAPP statute. The trial court had found each of these statements to be mere statements of opinion that do not fall within the commercial speech exception.

This action was part of a larger effort by the plaintiff, a self-described multimillionaire and developer, to target critics of his restaurant and to suppress negative reviews on Yelp. We believe the Court did not reach the right result, and we are evaluating our next steps.

Implications

Proxy battle. Demetriades’ real target appears to be Yelp users’ reviews of his restaurants, but his lawsuit claims that Yelp falsely advertised its own services. I call lawsuits like this “proxy battles,” where false advertising is pursued as a proxy for some other underlying concern. Demetriades can’t directly address his problem with the reviews due to Section 230, so false advertising is the best available alternative (even if it’s a poor substitute). Courts often don’t handle proxy battles well because false advertising laws are being asked to address a problem they weren’t designed to address. As a result, proxy battles are unpredictable and doctrinally messy.

Section 230. Section 230 provides ample protection against Yelp’s liability for its user reviews or the operation of its filter. The appellate court said Section 230 doesn’t apply to Demetriades’ false advertising claims because:

Nowhere does plaintiff seek to enjoin or hold Yelp liable for the statements of third parties (i.e., reviewers) on its website. Rather, plaintiff seeks to hold Yelp liable for its own statements regarding the accuracy of its filter.

The court’s distinction may sound logical, but it’s problematic in practice. First, some courts have applied Section 230 to protect against advertising claims when those claims are rendered false due to users’ activities. For example, in 2010, a Texas court held that Section 230 protected a website that claimed its site content was “accurate” even though user-supplied content was allegedly false. Second, we’ve seen numerous cases where plaintiffs have sought an “end-run” around Section 230 by suing over the site’s marketing language. This ruling gives plaintiffs further incentive to pick apart review websites’ marketing and explanatory disclosures for anything that might possibly be inaccurate, and the cumulative effect of those lawsuits threatens to undermine Section 230.

Implications for federal anti-SLAPP law. Yelp has been lobbying for a federal anti-SLAPP law (something I favor as well). I imagine Yelp will study this ruling carefully to ensure that any proposed federal legislation provides it with adequate coverage.

Can review websites ever promise trustworthy reviews? In 2012, the UK Advertising Standards Authority “told TripAdvisor not to claim or imply that all the reviews that appeared on the website were from real travellers, or were honest, real or trusted.” Similar to Yelp, TripAdvisor had made claims such as “Reviews you can trust”, “… read reviews from real travellers”, “TripAdvisor offers trusted advice from real travellers” and “More than 50 million honest travel reviews and opinions from real travellers around the world.” Nevertheless, TripAdvisor (like every other review website) had fake reviews on its site. UK law differs from US law, but the net effect may be the same. Unless a review website can ensure that all of its user reviews are legit–which no review website can do–it faces some risk when it promotes the veracity of its users’ reviews.

Will Yelp have to disclose its filter algorithm? Many plaintiffs would love to inspect Yelp’s filtering algorithm closely. The opinion says that Demetriades doesn’t seek to “obtain information on the mechanics of the filter.” However, the filter’s efficacy is being questioned, which inherently raises questions about exactly what the filter does and doesn’t do. If Yelp’s filtering algorithm is potentially discoverable in the case, the case’s stakes will go up a lot. If Demetriades can get a court to order Yelp to disclose the algorithm to him, expect Yelp to settle, even if on unfavorable terms.

Case citation: Demetriades v. Yelp, Inc., 2014 WL 3661491 (Cal. App. Ct. July 24, 2014)