Roundup of February’s ‘COMO at Scale Brussels’ Event

On Wednesday, I’m attending the IAPP event, Content Moderation in 2019, in Washington DC. We’ll be getting some of the old band back together again. Hope to see you there.

Photographer: Olivier Anbergen

In anticipation of that, I’m finally posting my delayed roundup of COMO at Scale Brussels, held in February in the European Parliament building. Watch the videos. Professional photos from the event and my (vastly inferior) photo album. The prime mover behind that event was Prof. Aleksandra Kuczerawy of KU Leuven. Her recap of the event. Besedo’s recap.

In the afternoon, Google sponsored a follow-on event. Photos from that event.

* * *

My brief introductory remarks at the morning event in parliament:

I want to say thanks to the members of Parliament and their staff for convening this gathering.

I also want to say thanks to the event organizers, especially Aleksandra. As I’m sure you can appreciate, this was a hard conference to organize. Also, thanks to the presenters for sharing with us.

Today is the fourth event in the Content Moderation (COMO) series. The first was held a year ago in Silicon Valley at Santa Clara University. It was inspired by FOSTA, legislation introduced in the United States Congress targeting the promotion of sex trafficking. It was clear that FOSTA’s drafters did not understand how Internet companies operationalize content moderation. Then again, this lack of awareness was partially due to the Internet companies themselves, which treated their content moderation operations as confidential (even if they weren’t trade secrets). The Santa Clara University conference was designed to overcome this confidentiality norm and publicly present operational details from a diverse set of Internet companies, so that those details could help spot regulatory opportunities and pitfalls.

We followed the Santa Clara University conference with similar events in Washington DC and New York City.

I’d like to briefly summarize four things I’ve learned from the conference series:

First, I’ve been impressed by the dedication and good faith of the people doing content moderation. We may not always agree with their methods or choices, but it’s clear to me that many people are trying hard to make good decisions.

Second, there is wide diversity in how companies have operationalized their content moderation. Everyone does things a little differently. I expect we’ll see even more diversity today.

Third, there is no magic wand that will solve all of the content moderation challenges. Technology plays an important and growing role, but even the best technology leaves a lot of hard work for humans to do.

Finally, the process of content moderation will never make everyone happy. By definition, moderation decisions create winners and losers. The people who didn’t get their desired outcome will always be unhappy with the result, and often with the process.

Today’s discussion bears on important social issues, and I’m delighted you are part of it.

* * *

Links from the prior three events:

Content Moderation and Removal at Scale, Santa Clara University, February 2018

Event page. My roundup post. My photo album.

Videos

Welcome and Introduction (including Sen. Ron Wyden’s opening remarks)

Legal Overview (presentations by Eric Goldman and Daphne Keller)

Overview of Each Company’s Operations (presentations from Automattic, Dropbox, Facebook, Google, Medium, Pinterest, Reddit, Wikipedia, and Yelp). If you only have time to watch one video, this is the one.

The History and Future of Content Moderation (panel featuring Nicole Wong, Charlotte Willner, and Dave Willner; moderated by Kate Klonick)

Session A: Employee/Contractor Hiring, Training and Mental Well-being (panelists from Automattic, Medium, and Pinterest)

Session B: Humans vs. Machines (panelists from Facebook, Wikimedia, and Yelp)

Session C: In-sourcing to Employees vs. Outsourcing to the Community or Vendors (panelists from Nextdoor, Pinterest, Reddit, Wikimedia, and Yelp)

Session D: Transparency and Appeals (panelists from Automattic, Medium, and Patreon)

Speaker Slides

Eric Goldman, US law overview

Daphne Keller, foreign law overview

Adelin Cai, Pinterest

Aaron Schur, Yelp

Techdirt Essays

Eric Goldman, It’s Time to Talk About Internet Companies’ Content Moderation Operations

Kate Klonick, Why The History Of Content Moderation Matters

Kevin Bankston & Liz Woolery, We Need To Shine A Light On Private Online Censorship

Alex Feerst, Implementing Transparency About Content Moderation

Jacob Rogers, International Inconsistencies In Copyright: Why It’s Hard To Know What’s Really Available To The Public

Adelin Cai, Putting Pinners First: How Pinterest Is Building Partnerships For Compassionate Content Moderation

Tarleton Gillespie, Moderation Is The Commodity

Paul Sieminski & Holly Hogan, Why (Allegedly) Defamatory Content On WordPress.com Doesn’t Come Down Without A Court Order

Sarah T. Roberts, Commercial Content Moderation & Worker Wellness: Challenges & Opportunities

Colin Sullivan, Trust Building As A Platform For Creative Businesses

* * *

COMO at Scale, Washington DC, May 2018

Event page. My roundup post.

Videos

Opening Remarks

Foundations: The Legal and Public Policy Framework for Content (Eric Goldman and Tiffany Li)

Under the Hood: UGC Moderation (Part 1) (Match, TripAdvisor, Twitter, Twitch, and Vimeo)

Under the Hood: UGC Moderation (Part 2) (GitHub, Google, Wikimedia, and Facebook)

You Make the Call: Audience Interactive (Emma Llanso and Mike Masnick)

Content Moderation and Law Enforcement

What Machines Are, and Aren’t, Good At

Transparency

Concluding Remarks

Presentations

My slides on the U.S. law of content moderation

Slides from GitHub, TripAdvisor, Twitch, Twitter, and Vimeo

Photos

Conference organizer’s Flickr album

My photo album

* * *

COMO III: Content Moderation and the Future of Online Speech, St. John’s University (Manhattan), October 2018

Event page. My photo album. Videos.