FOSTA Claim Can Proceed Against Twitter–Doe v. Twitter

The court summarizes the allegations:

Plaintiffs John Doe #1 and John Doe #2 allege that when they were thirteen years old they were solicited and recruited for sex trafficking and manipulated into providing to a third-party sex trafficker pornographic videos (“the Videos”) of themselves through the social media platform Snapchat. A few years later, when Plaintiffs were still in high school, links to the Videos were posted on Twitter. Plaintiffs allege that when they learned of the posts, they informed law enforcement and urgently requested that Twitter remove them but Twitter initially refused to do so, allowing the posts to remain on Twitter, where they accrued more than 167,000 views and 2,223 retweets. According to Plaintiffs, it wasn’t until the mother of one of the boys contacted an agent of the Department of Homeland Security, who initiated contact with Twitter and requested the removal of the material, that Twitter finally took down the posts, nine days later.

This is a confounding case because it’s hard to believe that Twitter would knowingly leave up CSAM. There has to be more to this story. However, on a motion to dismiss, we don’t get to hear Twitter’s side of the story, and the court must accept the plaintiffs’ allegations. Despite that, the court dismisses most of the plaintiffs’ claims due to Section 230 or other limits. Nevertheless, the allegation that Twitter facilitated sex trafficking–a bizarre way of characterizing these facts–survives due to FOSTA. The court’s FOSTA analysis will break your mind, and it’s a reminder that FOSTA likely reaches far beyond the paradigmatic Backpage scenario to potentially govern more traditional Internet content moderation decisions.

TVPRA Sex Trafficking Claim. The court says the beneficiary liability claim survives Twitter’s motion to dismiss because:

where a plaintiff seeks to impose civil liability under Section 1595 based on a violation of Section 1591(a)(2), not only does the “known or should have known” language of Section 1595 apply (rather than the actual knowledge standard of Section 1591(a)) but such a claim also does not require that a plaintiff demonstrate an overt act that furthered the sex trafficking aspect of the venture in order to satisfy the “participation in a venture” requirement.

To reach this conclusion, the court expressly disagrees with the ruling in Doe v. Kik:

There is no question that FOSTA is a remedial statute in that it carves out exceptions to CDA § 230 immunity, thereby affording remedies to victims of sex trafficking that otherwise would not have been available. Moreover, the broader statutory framework suggests that the Kik court’s reading of FOSTA improperly adopted the most restrictive possible reading of that provision when there is an equally (or more) plausible reading of the plain language of FOSTA…

The implication of [the Kik] reading is that a sex trafficking victim who seeks to impose civil liability on an ICS provider on the basis of beneficiary liability faces a higher burden than a victim of sex trafficking who seeks to impose such liability on other types of defendants. Had Congress intended such a limitation on Section 1595 liability as applied to interactive computer services, it could have clearly stated as much, but it did not do so….

the Court concludes that Plaintiffs’ Section 1595 claim against Twitter based on alleged violation of Section 1591(a)(2) is not subject to the more stringent requirements that apply to criminal violations of that provision.

Confusion among judges is hardly surprising. FOSTA was a poorly drafted mashup of two poorly drafted rival bills. None of FOSTA makes sense. No one can agree what FOSTA means. So the judge’s claim that Congress could have “clearly stated as much” ignores FOSTA’s flawed process and incomprehensible drafting. The only thing “clear” about FOSTA is that Congress had no idea what it was doing or why.

The court says that the plaintiff successfully alleged that Twitter participated in the sex trafficking venture because Twitter allegedly didn’t remove the videos in response to takedown notices:

these allegations are sufficient to allege an ongoing pattern of conduct amounting to a tacit agreement with the perpetrators in this case to allow them to post videos and photographs it knew or should have known were related to sex trafficking without blocking their accounts or the Videos

Read literally, the court is saying that every web host can be in a “tacit agreement” with sex traffickers simply by not responding to takedown notices. Given the risks of a FOSTA claim, this legal standard would create a potent notice-and-takedown scheme where web hosts will remove any alleged sex trafficking content without much scrutiny, and that hair-trigger response can be easily weaponized to remove legitimate content. When you see a judge use a phrase like “tacit agreement,” there’s probably some analytical corner-cutting going on.

The court also says that videos of past commercial sex create the risk of future sex trafficking, no matter how old the videos are: “regardless of when the Videos were created, the allegations that the Videos were being retweeted on a massive scale while they remained on the Twitter platform raise a plausible inference that Twitter’s failure to remove the Videos would result in future commercial sex trafficking.” What? The judge’s statement can’t be true without some further unstated qualifiers, and the court never spells out the assumptions behind it.

As for Twitter’s receipt of a benefit from the sex trafficking venture, the court says these allegations are good enough:

the FAC contains detailed allegations about how Twitter monetizes content, including CSAM, through advertising, sale of access to its API, and data collection. It further alleges that searching for hashtags that are known to relate to CSAM brings up promoted links and advertisements, offering a screenshot of advertising that appeared in connection with one such hashtag. Plaintiffs also specifically allege that the Videos of Plaintiffs were “monetized by Twitter and it receive financial benefit from [their] distribution on its platform.” While Twitter dismisses this allegation as conclusory, it is supported by allegation that the Videos were “viewed at least 167,000 times and retweeted 2,220 times for additional views,” and that “[t]he videos remained live approximately another seven days, resulting in substantially more views and retweets.”

So the court accepts the generic allegation that Twitter profited from the videos-at-issue. This is the kind of pleading that should require some supporting facts.

The plaintiffs adequately alleged, under the reduced scienter standard, that Twitter knew or should have known the videos depicted victims of sex trafficking, in light of the takedown notices:

they alerted Twitter that the Videos were created under threat when Plaintiffs were children and provided evidence of John Doe #1’s age in response to Twitter’s request for further information. Plaintiffs also allege that other Twitter users used the word “twinks” to describe the children in the Videos, which was “another indication that Plaintiffs were minors, and that this fact was evident from their appearance in the Videos.”

It’s hard to draw a broader lesson from this ruling other than the obvious one: Internet services need to pay close attention to takedown notices alleging sex trafficking. Then again, I’m skeptical that Twitter ignored those notices, so let’s see the full set of facts.

The court’s analysis of the FOSTA/sex trafficking crime/230 interplay is hardly the last word on the subject. We’ll be getting appellate court interpretations soon enough.

Other Claims

  • 18 U.S.C. §§ 2258A and 2258B. These provisions don’t have a private right of action.
  • 18 U.S.C. §§ 2252A and 2255. Section 230 preempts these claims. Cites to Doe v. Bates and MA v. Village Voice.
  • Defective design (along the lines of Lemmon v. Snap). Section 230 preempts this claim because “the nature of the alleged design flaw in this case – and the harm that is alleged to flow from that flaw – is directly related to the posting of third-party content on Twitter…[plaintiffs] seek to impose liability on Twitter based on how well Twitter has designed its platform to prevent the posting of third-party content containing child pornography and to remove that content after it is posted.”
  • Negligence. The claims are based on Twitter’s alleged failure to remove the third-party videos, which is preempted by Section 230 (including a cite to In re Facebook, discussed below).
  • Cal. Civ. Code section 1708.85(a) (non-consensual pornography dissemination). The statute doesn’t apply to already-disseminated videos, which the videos-at-issue are.
  • Intrusion into Seclusion. Twitter’s role in the intrusion is based on republishing the third-party videos, which Section 230 preempts. Cite to Caraccioli v. Facebook.
  • California Constitution Invasion of Privacy. Same analysis as intrusion into seclusion.
  • UCL. Same analysis as negligence.

By resolving the ancillary claims, the court narrows this case down to the FOSTA claim. That will help both sides focus their energy for the next round in this litigation. I find it impossible to believe that Twitter facilitated sex trafficking sufficiently to trigger FOSTA, but the poor Congressional drafting shows just how far FOSTA might reach. If FOSTA permits a notice-and-takedown scheme for any material allegedly associated with sex trafficking, we’ve barely scratched the surface of its implications.

Case citation: Doe v. Twitter, Inc., 2021 WL 3675207 (N.D. Cal. August 19, 2021). The complaint.

* * *

BONUS: I’ve been sitting on the Texas Supreme Court ruling from June, which used FOSTA to let a state sex trafficking claim proceed against Facebook. Overall, the decision is a mix of good and bad news for defendants. Like the Doe v. Twitter court, the Texas Supreme Court dismisses all of the non-FOSTA claims due to Section 230. However, the court says that FOSTA enables a state sex trafficking claim, meaning that both Twitter and Facebook–mass-market social media services having almost nothing in common with Backpage–are the real parties-in-interest from the FOSTA amendments. And, as always, I must remind everyone that Facebook paved the way for FOSTA by directing the Internet Association to endorse the SESTA amendments, and then Sheryl Sandberg endorsed the “worst of both worlds” FOSTA before the House vote. Seeing FOSTA come back to expose Facebook to greater liability is Alanis-level “ironic.”

(In another irony, the Doe v. Twitter judge only cites this opinion once, and only on the question of Section 230’s applicability to negligence, not on the tough FOSTA analysis where the Texas court struggled the most.)

Section 230. The authoring judge writes fannishly of Justice Thomas’ atrocious statement in the Enigma v. Malwarebytes cert denial, yet the opinion still endorses Section 230. This passage is illustrative of the judge’s schizophrenia:

We agree that Justice Thomas’s recent writing lays out a plausible reading of section 230’s text…[but] every existing judicial decision interpreting section 230 takes the contrary position….if the more limited view is only one reasonable reading of the text—and if the broader view is also reasonable—we are hard pressed to cast aside altogether the universal approach of every court to examine the matter over the twenty-five years of section 230’s existence.

Section 230 resolves all of the negligence claims:

the uniform view of federal courts interpreting this federal statute requires dismissal of claims alleging that interactive websites like Facebook should do more to protect their users from the malicious or objectionable activity of other users. The plaintiffs’ claims for negligence, negligent undertaking, gross negligence, and products liability all fit this mold….the prevailing judicial interpretation of section 230 has become deeply imbedded in the expectations of those who operate and use interactive internet services like Facebook. We are not interpreting section 230 on a clean slate, and we will not put the Texas court system at odds with the overwhelming federal precedent supporting dismissal of the plaintiffs’ common-law claims.

Section 230 also applies to the failure-to-warn/failure-to-protect claims:

These claims seek to impose liability on Facebook for harm caused by malicious users of its platforms solely because Facebook failed to adequately protect the innocent users from the malicious ones. All the actions Plaintiffs allege Facebook should have taken to protect them—warnings, restrictions on eligibility for accounts, removal of postings, etc.—are actions courts have consistently viewed as those of a “publisher” for purposes of section 230. Regardless of whether Plaintiffs’ claims are couched as failure to warn, negligence, or some other tort of omission, any liability would be premised on second-guessing of Facebook’s “decisions relating to the monitoring, screening, and deletion of [third-party] content from its network.”….

This is no less true simply because Facebook’s alleged negligent omissions include failures to “require accounts for minors to be linked to those of adults” or “deprive known criminals from having accounts.” These claims may be couched as complaints about Facebook’s “design and operation” of its platforms “rather than . . . its role as a publisher of third-party content,” but the company’s “alleged lack of safety features ‘is only relevant to [Plaintiffs’] injur[ies] to the extent that such features’” would have averted wrongful communication via Facebook’s platforms by third parties. Herrick v. Grindr LLC, 765 Fed. Appx. 586, 590 (2d Cir. 2019) (quoting Herrick v. Grindr, LLC, 306 F. Supp. 3d 579, 591 (S.D.N.Y. 2018)). At bottom, these “claims seek to hold” Facebook “liable for its failure to combat or remove offensive third-party content, and are barred by § 230.” Id….

Facebook’s decision not to combat potentially harmful communication “by changing its web site policies” on warnings, flagging of messages, or who may establish an account on its platforms “was as much an editorial decision” regarding third-party content “as a decision not to delete a particular posting.”

Plaintiffs’ failure-to-warn theory suffers from the same infirmities. The warnings Plaintiffs seek would only be necessary because of Facebook’s allegedly inadequate policing of third-party content transmitted via its platforms. “Although it is indirect, liability under such a theory nevertheless” ultimately arises from the company’s transmission of the harmful content. Herrick, 306 F. Supp. 3d at 591. Moreover, “a warning about third-party content is a form of editing, just as much as a disclaimer printed at the top of a page of classified ads in a newspaper would be.”

Recasting the claims as products liability doesn’t change the analysis:

Plaintiffs’ products-liability claims are likewise premised on the alleged failure by Facebook to “provid[e] adequate warnings and/or instructions regarding the dangers of ‘grooming’ and human trafficking” on its platforms. Like Plaintiffs’ other common-law claims, these claims seek to hold Facebook liable for failing to protect Plaintiffs from third-party users on the site. For that reason, courts have consistently held that such claims are barred by section 230. This has been the unanimous view of other courts confronted with claims alleging that defectively designed internet products allowed for transmission of harmful third-party communications. See Herrick, 765 Fed. Appx. at 590; Inman v. Technicolor USA, Inc., No. 11-cv-666, 2011 WL 5829024, at *8 (W.D. Pa. Nov. 18, 2011); Doe v. MySpace, Inc., 629 F. Supp. 2d 663 (E.D. Tex. 2009)…

While there have been a few instances in which products-liability claims against websites have been allowed to proceed despite defendants’ CDA objections, see Bolger v. Amazon.com, LLC, 267 Cal. Rptr. 3d 601, 626–27 (Ct. App. 2020); Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139–40 (4th Cir. 2019); State Farm Fire & Cas. Co. v. Amazon.com, Inc., 390 F. Supp. 3d 964, 973–74 (W.D. Wis. 2019), plaintiffs in those cases alleged that defendants provided tangible goods that caused physical injury or property damage. Here, by contrast (as in other cases in which products-liability claims have been held barred by section 230), the allegedly harmful attribute of Facebook’s “products” was that they permitted transmission of third-party communication that resulted in harm to Plaintiffs….

Courts around the country have consistently held that section 230 protects defendants from similar claims of failure to warn of harmful third-party content or negligent failure to protect users from third-party content.

It’s startling to see how decisively Herrick v. Grindr applies to the plaintiffs’ claims.

The court goes on to opine on many core aspects of Section 230 jurisprudence, including:

  • Must-carry obligations are misguided. “For those who believe Facebook and other such platforms should refuse to censor their users’ speech, it would seem that dialing back the protections of section 230(c)(1)—and thereby expanding the civil liability these companies face for failing to censor allegedly objectionable posts—would be counterproductive.” For more on why must-carry obligations would be a disaster, see my article with Jess Miers.
  • There is no publisher/platform distinction in Section 230: “it is not a clear departure from the statutory text to understand section 230’s use of the word ‘publisher’ to include both ‘primary’ and ‘secondary’ publishers—that is, to view ‘publisher’ in the broader, generic sense adopted in Zeran and the many decisions following it….the construction of the provision at which these courts have arrived is a defensible reading of its plain language. Imposing a tort duty on a social media platform to warn of or protect against malicious third-party postings would in some sense ‘treat’ the platform ‘as a publisher’ of the postings by assigning to the platform editorial or oversight duties commonly associated with publishers….Plaintiffs argue that their common-law claims do not treat Facebook as a ‘publisher’ or ‘speaker’ because they ‘do not seek to hold [it] liable for exercising any sort of editorial function over its users’ communications,’ but instead merely for its own ‘failure to implement any measures to protect them” from “the dangers posed by its products.’ Yet this theory of liability, while phrased in terms of Facebook’s omissions, would in reality hold the company liable simply because it passively served as an ‘intermediar[y] for other parties’ . . . injurious messages.'”
  • Congress wants Section 230 read broadly: “Congress, with knowledge of the prevailing judicial understanding of section 230, has twice expanded its scope.” The two examples are the Dot Kids act and the SPEECH Act. Indeed, Congress expressly ratified the Zeran interpretation in the legislative history to the Dot Kids act. Thus, “Plaintiffs’ narrow view of section 230, while textually plausible, is not so convincing as to compel us to upset the many settled expectations associated with the prevailing judicial understanding of section 230….Under the view of section 230 adopted in every published decision of which we are aware, these [common law] claims ‘treat[]’ Facebook ‘as the publisher or speaker’ of third-party communication and are therefore barred.”
  • Section 230 doesn’t apply only to defamation claims: “Plaintiffs further argue that section 230’s prohibition on ‘treat[ing] an internet company as a ‘publisher or speaker’’ preempts only suits that ‘allege or implicate defamation,’ since this was the primary ‘kind of liability Congress had in mind’ when it enacted the provision. This proposed limitation on section 230 has been rejected by every court that has considered it.” The flagship example is the Wisconsin Supreme Court ruling in Daniel v. Armslist.

FOSTA. After the court’s rocky start with love letters to Justice Thomas, the court’s opinion on Section 230 turns into a powerful and decisive defense-favorable ruling. The analysis on FOSTA, however, turns the situation around yet again.

The court first gets Section 230 out of the way by saying that the plaintiffs are suing for Facebook’s first-party conduct, not any third-party content:

Section 230, as amended, does not withdraw from the states the authority to protect their citizens from internet companies whose own actions—as opposed to those of their users—amount to knowing or intentional participation in human trafficking…the statutory claim for knowingly or intentionally benefiting from participation in a human-trafficking venture is not barred by section 230 and may proceed to further litigation.

This conduct/content distinction has shown up in other cases, such as the criminal prosecutions over Backpage. Read literally, the court’s position seems accurate; EXCEPT THAT the only way that Facebook allegedly participated in the scheme was by publishing third-party content. Collapsing that content/conduct distinction lets judges sidestep Section 230 while retaining the illusion of fidelity to the rule of law.

With this move, the court addresses the question of whether Texas’ state anti-sex trafficking law can be used against Facebook.

Regarding participation in the venture, the court says:

Liability under these statutes requires a showing that a defendant acquired a benefit by “participat[ing]” in a human-trafficking “venture.” Such “participation” connotes more than mere passive acquiescence in trafficking conducted by others….“Participation” typically entails, at a minimum, an overt act in furtherance of the venture….It follows that a claim under section 98.002 arises not merely from a website’s failure to take action in response to the injurious communications of others, but instead from the website’s own affirmative acts to facilitate injurious communications.

In the Twitter case, consider whether failing to respond to a takedown notice constitutes an “overt act.” In the Facebook case, the court says the plaintiffs made sufficient allegations of overt acts:

While many of Plaintiffs’ allegations accuse Facebook of failing to act as Plaintiffs believe it should have, the section 98.002 claims also allege overt acts by Facebook encouraging the use of its platforms for sex trafficking. For instance, the petitions state that Facebook “creat[ed] a breeding ground for sex traffickers to stalk and entrap survivors”; that “Facebook . . . knowingly aided, facilitated and assisted sex traffickers, including the sex trafficker[s] who recruited [Plaintiffs] from Facebook” and “knowingly benefitted” from rendering such assistance; that “Facebook has assisted and facilitated the trafficking of [Plaintiffs] and other minors on Facebook”; and that Facebook “uses the detailed information it collects and buys on its users to direct users to persons they likely want to meet” and, “[i]n doing so, . . . facilitates human trafficking by identifying potential targets, like [Plaintiffs], and connecting traffickers with those individuals.” Read liberally in Plaintiffs’ favor, these statements may be taken as alleging affirmative acts by Facebook to encourage unlawful conduct on its platforms….These allegations do not treat Facebook as a publisher who bears responsibility for the words or actions of third-party content providers. Instead, they treat Facebook like any other party who bears responsibility for its own wrongful acts.

I don’t see how the plaintiffs can ultimately prove these facts. Thus, despite Facebook losing the motion to dismiss, the opinion doesn’t suggest Facebook will lose the case. While Facebook would have preferred an outright dismissal from the Supreme Court, Facebook probably feels like the lawsuit’s risk has gone down because of the steep challenges the plaintiffs face in proving the alleged facts.

Finally, the court turns back to the question of Section 230’s applicability to the state sex trafficking civil claim. Why would it do this when it said earlier that this was a lawsuit over first-party conduct, not third-party content? All of this discussion seems like dicta based on that earlier move.

Worse, the court twists FOSTA’s language to suggest that FOSTA enables state civil claims to get around Section 230. FOSTA’s text makes it clear that it only excluded federal civil claims, state criminal prosecutions, and state AG parens patriae claims. The court tries to explain:

Congress’s mandate that section 230 not “be construed” to bar federal civil statutory human-trafficking claims necessarily dictates that section 230 must not be construed to bar materially indistinguishable state civil claims either…If liability under federal section 1595 would not treat defendants as “speakers or publishers” within the meaning of section 230, it is hard to understand how liability under Texas’s section 98.002 could possibly do so…

The “Sense of Congress,” enacted as part of FOSTA’s text, was that “section 230 of the [CDA] was never intended to provide legal protection to . . . websites that facilitate traffickers in advertising the sale of unlawful sex acts with sex trafficking victims.” Pub. L. No. 115-164, § 2. If section 230 was “never intended” to immunize defendants against claims brought pursuant to 18 U.S.C § 1595, it stands to reason that the provision also never afforded immunity from analogous state-law causes of action.

This is baffling coming from a purportedly textualist court. The words of the statute say nothing about state civil claims. Intuiting that Congress must have meant to include them is judicial activism. Especially when it’s dicta, it’s a pretty egregious mistake.

Case citation: In re Facebook, Inc., 2021 WL 2603687 (Tex. Supreme Ct. June 25, 2021).

* * *

More SESTA/FOSTA-Related Posts:

* FOSTA Survives Constitutional Challenge–US v. Martono
* 2H 2020 Quick Links, Part 4 (FOSTA)
* Justice Thomas’ Anti-Section 230 Statement Doesn’t Support Reconsideration–JB v. Craigslist
* Sex Trafficking Lawsuit Against Craigslist Moves Forward–ML v. Craigslist
* Section 230 Preempts Another FOSTA Claim–Doe v. Kik
* Section 230 Protects Craigslist from Sex Trafficking Claims, Despite FOSTA–JB v. Craigslist
* Facebook Still Can’t Dismiss Sex Trafficking Victims’ Lawsuit in Texas State Court
* Craigslist Denied Section 230 Immunity for Classified Ads from 2008–ML v. Craigslist
* 2H 2019 and Q1 2020 Quick Links, Part 3 (FOSTA/Backpage)
* New Paper Explains How FOSTA Devastated Male Sex Workers
* FOSTA Constitutional Challenge Revived–Woodhull Freedom Foundation v. US
* New Civil FOSTA Lawsuits Push Expansive Legal Theories Against Unexpected Defendants (Guest Blog Post)
* Section 230 Helps Salesforce Defeat Sex Trafficking Lawsuit–Doe v. Salesforce
* Latest Linkwrap on FOSTA’s Aftermath
* Section 230 Doesn’t End Lawsuit Claiming Facebook Facilitated Sex Trafficking–Doe v. Facebook
* New Essay: The Complicated Story of FOSTA and Section 230
* Who Benefited from FOSTA? (Spoiler: Probably No One)
* FOSTA’s Political Curse
* FOSTA Doesn’t Help Pro Se Litigant’s Defamation Claim Against Facebook
* Constitutional Challenge to FOSTA Dismissed for Lack of Standing (Guest Blog Post)
* An Update on the Constitutional Court Challenge to FOSTA–Woodhull Freedom v. US (Guest Blog Post)
* Indianapolis Police Have Been “Blinded Lately Because They Shut Backpage Down”
* Constitutional Challenge Against FOSTA Filed–Woodhull v. US (Guest Blog Post)
* Catching Up on FOSTA Since Its Enactment (A Linkwrap)
* More Aftermath from the ‘Worst of Both Worlds FOSTA’
* ‘Worst of Both Worlds’ FOSTA Signed Into Law, Completing Section 230’s Evisceration
* Backpage Loses Another Section 230 Motion (Again Without SESTA/FOSTA)–Florida Abolitionists v. Backpage
* District Court Ruling Highlights Congress’ Hastiness To Pass ‘Worst of Both Worlds FOSTA’–Doe 1 v. Backpage
* More on the Unconstitutional Retroactivity of ‘Worst of Both Worlds FOSTA’ (Guest Blog Post)
* Senate Passes ‘Worst of Both Worlds FOSTA’ (Linkwrap)
* Why FOSTA’s Restriction on Prostitution Promotion Violates the First Amendment (Guest Blog Post)
* SESTA’s Sponsors Still Don’t Understand Section 230 (As They Are About to Eviscerate It)
* Can the ‘Worst of Both Worlds FOSTA’ Be Salvaged? Perhaps…and You Can Help (URGENT CALL TO ACTION)
* Congress Probably Will Ruin Section 230 This Week (SESTA/FOSTA Updates)
* What’s New With SESTA/FOSTA (January 17, 2018 edition)
* New House Bill (Substitute FOSTA) Has More Promising Approach to Regulating Online Sex Trafficking
* My testimony at the House Energy & Commerce Committee: Balancing Section 230 and Anti-Sex Trafficking Initiatives
* How SESTA Undermines Section 230’s Good Samaritan Provisions
* Manager’s Amendment for SESTA Slightly Improves a Still-Terrible Bill
* Another Human Trafficking Expert Raises Concerns About SESTA (Guest Blog Post)
* Another SESTA Linkwrap (Week of October 30)
* Recent SESTA Developments (A Linkwrap)
* Section 230’s Applicability to ‘Inconsistent’ State Laws (Guest Blog Post)
* An Overview of Congress’ Pending Legislation on Sex Trafficking (Guest Blog Post)
* The DOJ’s Busts of MyRedbook & Rentboy Show How Backpage Might Be Prosecuted (Guest Blog Post)
* Problems With SESTA’s Retroactivity Provision (Guest Blog Post)
* My Senate Testimony on SESTA + SESTA Hearing Linkwrap
* Debunking Some Myths About Section 230 and Sex Trafficking (Guest Blog Post)
* Congress Is About To Ruin Its Online Free Speech Masterpiece (Cross-Post)
* Backpage Executives Must Face Money Laundering Charges Despite Section 230–People v. Ferrer
* How Section 230 Helps Sex Trafficking Victims (and SESTA Would Hurt Them) (Guest Blog Post)
* Sen. Portman Says SESTA Doesn’t Affect the Good Samaritan Defense. He’s Wrong
* Senate’s “Stop Enabling Sex Traffickers Act of 2017”–and Section 230’s Imminent Evisceration
* The “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” Bill Would Be Bad News for Section 230
* WARNING: Draft “No Immunity for Sex Traffickers Online Act” Bill Poses Major Threat to Section 230
* The Implications of Excluding State Crimes from 47 U.S.C. § 230’s Immunity