Lori Drew Conviction Reflections, Part 3 of 3: Lessons for Cyberlawyers Drafting User Agreements
By Eric Goldman
[Note: this is Part 3 of a special 3-part series on the Lori Drew conviction. Part 1 discussed why MySpace, the putative victim of Lori Drew’s crime, might end up regretting the conviction. Part 2 discussed some problems with holding Lori Drew responsible for a contract she never clicked through. This post concludes the series.]
Last week, I went to my first CLE conference for Cyberlawyers since Drew’s conviction, and the conference included panels about online contract drafting. Given that Drew’s conviction was based on MySpace’s user agreement and contract formation process, I expected some discussion of the case’s implications. Instead, I was very surprised that the panels said nothing about the lawyer’s role in drafting MySpace’s contract, or about any lessons we as Cyberlawyers should take from her conviction when drafting future contracts. The discussion was identical to the one we would have had before the conviction.
Personally, I think this is a huge oversight. We as Cyberlawyers cannot lose sight of the social responsibilities that complement our client responsibilities. And as this case illustrates, when we draft overly expansive contract restrictions in online user agreements, we may be unwittingly turning many or all of our clients’ users into criminals. Not only is criminalizing our clients’ users potentially bad for their businesses, but it is also irresponsible and unnecessary.
The problem is particularly acute for user behavior restrictions that the service provider never plans to enforce. As has been pointed out elsewhere, a contract restriction saying that “kids under 18 cannot use the website” has no legal meaning (it’s intended to address the voidability of contracts with minors, although this might be less of an issue than we thought), but it potentially criminalizes any minors who ignore the language. Similarly, a restriction on creating accounts using false registration information might be handy in those rare cases when the service provider is chasing spammers who create bogus accounts, but it also potentially criminalizes the many users who might legitimately not want to tell the complete truth to the website during registration. Thus, while there might be some limited circumstances where these clauses are appropriate, for the most part we need to dump these overly expansive behavioral restrictions from our toolkits.
Based on the Lori Drew conviction and other recent developments, such as the JuicyCampus enforcement action, I have two recommendations for how Cyberlawyers should draft user agreements in the post-Lori Drew conviction era.
Use Generalized, Not Specific, Behavioral Restrictions
First, user agreements should rely more heavily on generalized permissive statements like “the website may terminate users at any time in its sole discretion” instead of laundry lists of prohibited user behaviors. Historically, unrestricted termination clauses were considered troublesome because they were cited against Napster as satisfying the “right and ability to supervise” prong of vicarious copyright infringement. However, we have since learned that Napster was an aberrational case, and it would be foolish to change our in-the-field practices based on it. At this point, I see no clear legal downside to using a generalized termination right, and it obviates the need for long, ambiguous, thesaurus-driven behavioral codes.
Alternatively, liquidated damages provisions can deter unwanted behavior without establishing negative covenants. For example, MySpace’s anti-spamming liquidated damages clause paid off big for MySpace in its lawsuit against theglobe.com.
Move Behavioral Restrictions to a Separate Community Norms Document
Second, behavioral restrictions that do not need to be specifically barred in the user agreement can be moved into a separate statement of community norms/standards. This way, users are told what they can and cannot do, but the statement does not have the force of law. Ideally, other users can be given tools to help them enforce the community norms. Even better, the norms can be posted on a wiki so that the site’s users can help update them as the site’s community evolves.
This community norms approach has at least three benefits. First, because the norms aren’t part of the agreement, overzealous prosecutors can’t use them as a basis for prosecution, so the site avoids unwittingly criminalizing its users. Second, if the norms are phrased right, overzealous plaintiffs cannot argue that the negative restrictions are implicit marketing representations that such user conduct will not take place on the site. This will squelch analytically corrupt claims like the ones advanced by the New Jersey Attorney General’s office against JuicyCampus. Third, a separate non-legal document may be a more effective tool for communicating site expectations than embedding those rules in a user agreement that no one will read (all statistics I’ve seen indicate that well under 1% of users read user agreements).
I realize it’s unlikely Cyberlawyers will enthusiastically change their drafting techniques in response to the Lori Drew conviction. If nothing else, I find that contract drafting attorneys tend to ossify their techniques, and a number of forces conspire to push them toward longer and meaner user agreements. But at the same time, I think it would further compound the tragedy of Megan Meier’s suicide if we don’t internalize the message that our user agreements are being used to try to send people to jail, whether we intend that or not.