Recap of Stanford E-Commerce Conference Panel on Takedown Notices
By Eric Goldman
Last month, I attended Stanford Law School’s annual E-commerce Law Conference, one of my favorite conferences of the year because of its subject matter and the chance to hang out with so many friends. This year, the conference had a panel on takedown notices, plus the issue came up briefly at the plenary General Counsel’s session at the day’s end. My notes from the conference are below [as usual, this isn’t a verbatim transcript and it reflects my impressions of the discussion, so double-check the recording before relying on anything].
Daphne Keller, Google. The web search team tries to leave content up if it’s legal. When a takedown notice comes in, Google asks: does the law require us to remove it? For Google’s other services, takedown notices prompt Google to ask whether it wants to remove the item at its own discretion because it violates Google’s policies. Google is trying to create communities. Daphne loves standing up for users, but Google’s “community guidelines” ruleset can be a lawyer’s best friend: Google can rely on the community guidelines to remove content rather than researching complex laws. Drafting community guidelines isn’t the lawyer’s job, but inconsistent enforcement of the guidelines can confuse users; and if the guidelines are too broad, that can undermine legal positions (such as DMCA defenses).
[Personally, I recommend websites put all of their legally binding rules in the user agreement, and then I recommend they publish community guidelines to help shape the community’s norms by telling users what the website expects from the community. If this division is followed, the website shouldn’t enforce the community guidelines because they aren’t legally binding; and the community should be empowered to self-enforce violations of its community norms (which might ultimately diverge from the community guidelines—they are only guidelines, after all, and you can’t tell the community what it should feel).
At Epinions, sometimes we did rely on our internal policies to address content that was subject to a takedown notice. For example, we had a strict rule against a user having multiple accounts, and if we found multiple accounts while researching a takedown notice, we typically killed the accounts regardless of the takedown’s merits.]
Daphne’s advice to other websites: satisfy the DMCA formalities, including REGISTERING YOUR DMCA AGENT. Websites also need a repeat infringer policy and an internal escalation process to ensure that complaints reach the lawyers when they should. She thinks websites should publish a transparency report so users know what is coming down.
She also recommended monitoring “hotspots” where third parties are experiencing lots of frustration about removal requests. Monitoring and responding to hotspots led to the creation of YouTube’s Content ID and the fast-track for removing content from web search.
She wishes the legal rules would keep tech companies from pretending to be judges. [I have more to say on CSRs as private judges coming soon.]
Cindy Cohn, EFF: Intermediaries feel like they are having a two-way conversation between themselves and content owners, but the users should be part of the conversation too. EFF tries to be a voice for the otherwise voiceless users who have their content coming down.
EFF is not a fan of Content ID. 1) It doesn’t work. Content owners are still suing YouTube; content owners are never satisfied. There is no evidence that content owners have ever been appeased by websites’ efforts to accommodate them. 2) Congress wrote a law addressing censorship and copyright infringement (Section 512). She’s frustrated that content owners treat it as not enough. Content owners had their say in Congress; now they want more than Congress gave them.
She’s also frustrated that the automated notice-and-takedown systems don’t create a paper trail of takedown notices. To help users whose content has been removed, EFF needs to see the demand letters. [I would add (and maybe Cindy actually said this) that websites should be publishing the takedown notices they receive, via ChillingEffects or somewhere else. As part of their transparency reports, websites should also be providing enough aggregate detail about any automated/fast-track content removal systems so the public knows who is using them and how.]
Her recommended best practices: Websites should respond promptly when users make a putback request and the putback window expires. Sometimes intermediaries don’t realize the time-sensitivity of the removed content. Websites should also create a “dolphin hotline” by providing a designated contact who can handle requests related to high-value removed content that needs to be restored. [She didn’t explain the “dolphin” reference, but I assume it refers to dolphins caught in driftnet fishing; similarly, high-value content can be swept up in robo-takedown notices that aren’t adequately vetted before submission.]
Chris Sundermeier, Reputation.com. His position is that Reputation.com, not EFF, actually represents consumer interests. The Internet creates a forum for immense damage via speech; this is why we have limits on free speech. Reputation.com deals with people whose lives are disrupted. It’s not always possible to get content removed. Reputation.com starts with the premise that speech will stay up, plus the Streisand Effect: if you make waves, things will get worse. So Reputation.com’s users need alternatives to takedowns. The company’s goal: push unwanted content off the first couple of pages of Google search results by building “welcome” content.
Reputation.com believes that search results don’t provide the most relevant content. Instead, they reflect the most popular content, and that tends to be the negative information.
His recommendation for users: There are many ways to create an image of ourselves online. Everyone should have a LinkedIn profile. [BTW, if we’ve met in person at least once or swapped emails a few times, feel free to connect with me on LinkedIn.]
David Gingras, Ripoff Report (ROR). He has represented Ripoff Report since 2005 and litigated more CDA cases than just about anyone. [He also represents Nik Richie and thedirty.com. FWIW, I invited David to speak at our Section 230 conference in 2011 but unfortunately he had a scheduling conflict.]
As he litigates more Section 230 cases, he feels like the tide is shifting against defendants. For example, in Jones v. TheDirty, the judge took Section 230 off the table, which left the defendant Nik Richie defending content he didn’t create.
David subscribes to the view that the remedy for unwanted negative speech is more speech. ROR mediates between victims and free speech advocates. Still, ROR has changed. ROR never removed any content except for copyright or child porn, but ROR feels there’s a need for a pressure relief valve. Their solution: VIP Arbitration, modeled on the UDRP, which outsources some content removal decisions to third-party arbitrators. Providing the VIP Arbitration option has reduced the volume of litigation against ROR. [I have a forthcoming academic paper that analyzes ROR’s VIP Arbitration option.]
Tim Alger of Perkins Coie (the panel moderator) asked David whether ROR has seen a decline in cases against it that implicate Section 230. Gingras: ROR still gets demands, but arbitration has given it another way to respond to those demands. He’s seeing more trademark claims over consumer reviews. People are desperate for a remedy. David hates saying no, so he thinks there need to be more options than just saying no.
Tim Alger asked the panelists: what are worst practices?
Sundermeier’s worst practice: sending a flaming takedown letter. That can trigger the Streisand Effect and alienate the recipient. He recommends that people requesting content takedowns send polite, well-thought-out requests.
Gingras’ worst practices: (1) people who represent themselves and get emotional; (2) hiring dabbler lawyers, like My Cousin Vinny, who don’t know the law [though it was pointed out that Vinny won his case]. Gingras’ best practice: keep takedown demands short and simple, and don’t explain to him what “defamation” is (it makes him angry on a personal level). Be specific, be friendly, and explain why you should be helped.
Cohn’s worst practice: when companies make takedown decisions without talking to their customers. Give users an opportunity to explain their side. Give a counternotice option that doesn’t require a law degree to use. Best practice: stand with your customers and stand up for users.
Keller’s worst practices: don’t be Perfect 10 [good advice generally, but she meant it in the context of their common practice of sending takedown notices that are difficult to process]. She recommends that senders formulate proper takedown notices: name the URLs where the problematic content resides, use a digital format, and specify which law is being violated. Web search sees nearly a half-million URLs taken down a day due to copyright.
From the plenary General Counsel’s forum:
Kirsten Mellor, CafePress: unlike patent demands, where sometimes the best response is no response, TM takedowns benefit from an immediate response (she said a “lightning fast response”). CafePress chooses its battles very carefully. They’ve automated takedown responses, but the process still requires human review. CafePress gets thousands of C&Ds a year. They do quick dispute resolution, but sometimes they stand up for their users. Striking the right balance with takedown notices is a work-in-progress every day, but it’s the cost of doing business. [Recall CafePress’ bad loss in the Born to Rock case, though that loss came because they stood up for their users.]
Mike Jacobson, eBay: eBay gets millions of takedowns a year, so they’ve had to automate the process. Not all takedown notices are good, so eBay has had to spend some time figuring out who is abusing the system and pushing back on them. Their goal has been to make the system efficient, but then add a human element to deal with folks who abuse that efficiency.