Rehearing Briefs in Enigma Software v. Malwarebytes


A good PUP. Photo by Anik Shrestha.

In September, in Enigma v. Malwarebytes, the Ninth Circuit issued a troubling Section 230(c)(2)(B) ruling that allowed the plaintiff’s allegations of anti-competitive animus to override the safe harbor for anti-threat software vendors. It was a 2-1 ruling on a key topic, so it’s the kind of case that could support further proceedings in the Ninth Circuit.

Perhaps not surprisingly, the defendant Malwarebytes has requested en banc or panel review. Its petition for rehearing.

Four amicus briefs were filed in support of Malwarebytes’ petition:

Cybersecurity Law professors’ amicus brief

Venkat and I drafted this brief (with the help of Jess Miers) and filed it on behalf of seven other professors. Our introduction:

The panel or the Court en banc should rehear this case so that it can reevaluate the ruling’s consequences for cybersecurity. Though anti-competitive animus could be a troubling reason for one software program to block another, the Court’s decision overcorrects for this concern. The panel decision will foster spurious legal accusations of anti-competitive blocking of software programs that are, in fact, dangerous to businesses and consumers. These legal threats will hinder the ability of anti-threat software vendors to properly classify threats to businesses and consumers, which will make the Internet less safe for everyone.

Internet Association amicus brief

Three aspects of the majority’s decision especially concern IA and its members. First, the panel improperly imported a motive-based good-faith limitation into Section 230(c)(2)(B). As explained in Appellee’s rehearing petition, that defies fundamental rules of statutory interpretation and collapses an important distinction between subsection (c)(2)(A), which includes an express “good faith” requirement, and subsection (c)(2)(B), which conspicuously omits one.

Second, by uncritically accepting what appears from the opinion to be Appellant’s bare allegations of anticompetitive animus, the panel’s decision threatens to make it all too easy for plaintiffs to plead around Section 230(c)(2)(B). That result is squarely at odds with this Court’s decisions in Fair Hous. Council v., LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) (en banc), and Kimzey v. Yelp!, Inc., 836 F.3d 1263 (9th Cir. 2016). Those cases make clear that because Section 230 protects service providers against protracted legal battles (not just ultimate liability), the immunity cannot be defeated at the pleading stage with conclusory assertions. The panel’s contrary approach puts the content-moderation decisions of online providers and users at risk of “death by ten thousand duck-bites,”, 521 F.3d at 1174, opening the door to costly litigation for any plaintiff willing to make even threadbare allegations of improper motive. That subverts Congress’s goal of encouraging and removing disincentives for the development and use of filtering technologies.

Third, the majority’s dictum that the “criteria for blocking online material must be based on the characteristics of the online material, i.e., its content, and not on the identity of the entity that produced it,” is particularly troubling. While perhaps unintended by the panel, this stray statement could be applied in ways that would further undermine the very practices that Section 230 was intended to protect. Online service providers and their users routinely make moderation decisions that apply to entities or individuals, rather than just isolated pieces of content. That happens, for example, when a provider terminates a user’s account or when users deploy tools like Twitter’s Block feature to filter content from certain other users. These measures are a vital part of online self-regulation and are covered by any coherent reading of Section 230(c)(2). The panel’s ambiguous language threatens to arbitrarily limit the ability of platforms and users to protect themselves against abusive, offensive, or problematic accounts or users. At a minimum, therefore, the Court should grant rehearing to correct (or strike) the panel’s errant dicta.

ESET amicus brief

The majority opinion in this case undermines internet security and harms consumer choice in at least two critical ways.

First, the opinion creates a major roadblock to effective computer security software. The decision undercuts statutory immunity for filtering technology whenever there are allegations of anticompetitive animus, even though a purveyor of objectionable material can easily position itself as a competitor and make a facially plausible claim of such animus. This undermines Congress’s goals in enacting the Communications Decency Act, 47 U.S.C. § 230 (1996) (CDA), and harms the procompetitive interests the majority opinion purports to protect.

Second, the decision substitutes litigation for the user choice that has created a thriving marketplace of protections available to consumers. Such choice now exists at two levels: when the user decides what security software to deploy, and when the user chooses to filter out an objectionable program with the aid of that software. The majority opinion would substitute litigation in which the user has no role for both of these choices.

EFF/CAUCE amicus brief

Amici represent the interests of Internet users and support Malwarebytes’ petition because the Enigma panel’s ruling will discourage the development of effective tools that allow users to customize their experiences online. Reading Section 230(c)(2)(B) (47 U.S.C. § 230(c)(2)(B)) to provide unequivocal protection to the providers of filtering tools, which the Enigma panel failed to do, is consistent with the plain meaning of the statute and congressional policy goals, and ultimately best empowers Internet users by incentivizing the development of robust and diverse filtering tools.

Filtering tools give Internet users choices. People use filtering tools to directly protect themselves and to craft the online experiences that comport with their values, by screening out spyware, adware, or other forms of malware, spam, or content they deem inappropriate or offensive. Platforms use filtering tools for the same reasons, enabling them to create diverse places for people online.

Amicus EFF also supports rehearing because it directly benefits from a plain reading of Section 230(c)(2)(B), as its public interest technologists have developed a free tool, called Privacy Badger, that stops advertisers and other third-party trackers from secretly tracking users as they browse the web. EFF’s ability to continue providing free privacy-enhancing tools to Internet users will be seriously threatened if the panel’s incorrect interpretation of Section 230(c)(2)(B) stands.

Finally, amicus EFF supports rehearing because ensuring that Section 230(c)(2)(B) unequivocally protects filtering tool providers encourages those providers to block harmful software that is used to perpetrate domestic violence and harassment. EFF is working to eradicate this so-called “stalkerware,” and that goal is more likely to be achieved when filtering tool providers have the unqualified Section 230(c)(2)(B) immunity that Congress intended.

Case library

Malwarebytes’ petition for rehearing. Supporting amicus briefs from cybersecurity law professors, EFF/CAUCE, ESET, and Internet Association.

Ninth Circuit ruling. Blog post on that ruling.

District court opinion. Blog post on that ruling.

Related decision in Enigma Software v. Bleeping Computer. Blog post on that ruling.