Previewing the “Lessons from the First Internet Ages” Symposium
As I mentioned, I’m part of a team organizing a special virtual event called the “Lessons from the First Internet Ages Symposium.” The event starts tomorrow at 10am Pacific, and it’s not too late to sign up!
The event will explore themes that emerged from a package of essays from an all-star group of Internet legends. Their essays are now available online! A roadmap to them:
[Note: the Knight Foundation has adopted lower-case “internet” despite my protestations. Matthew Prince and I even discuss this in our interview.]
Initially, he thought web users would share their content as freely as they read content. Instead, “the web took off as a medium with a few publishers and a lot of readers––and not the collaborative mind meld I had hoped for.” His solution: build an infomediary that allows web users to share their content with third parties using custom-set reader-specific permissions.
He identified several lessons:
- the value of “open architectures” and “relatively neutral infrastructure”
- the Internet can be put to both pro- and anti-social uses
- inventors keep finding new applications of the infrastructure
- changing the infrastructure is hard, whether to add more security or to expand its scale
- we need “sustainable business models to maintain and expand the reach of the internet”
- affordability has been a barrier to Internet adoption
He concludes: “There is still a great deal of room for new developments to improve the safety, security, privacy, reliability and utility of the internet.”
He says the Internet “has come to be defined by user-created content,” and Section 230 is a big part of that. He identifies two ways Section 230 can be improved: (1) Internet services should be required to honor court orders to remove content [Note: I’ve written 1k words on why I don’t agree with that in this post (see point 4); and due process, the First Amendment, and the FRCP pertain to this issue in addition to Section 230.] (2) He would have said more clearly that “platforms can be content creators or developers themselves.”
He reiterates a few places where Section 230 says exactly what he wants it to say: (1) the First Amendment protects politically biased content moderation, so Section 230 does not need to address it. (2) “Section 230 does not require political neutrality.”
He concludes:
There are other tweaks I might be tempted to make to my own legislative handiwork if I were to find myself back in 1996, still holding the pen. But then again, I might restrain myself, knowing as I do now the many aspects of the modern internet that we have come to take for granted, and that are dependent upon Section 230’s protections….It’s just as likely that instead of perfecting Section 230, opening the door for more changes would threaten its essential elements that we know now have made it work.
His vision for LinkedIn “was to inspire users to be more intentional about building the professional networks that positively impact both daily work life and long-term career development. This, in turn, would provide them with more economic security and autonomy while simultaneously enabling companies, industries and even entire geographic regions to operate more effectively.”
He learned that “as platforms grow, and negative emotions and behaviors like fear, anger, greed and malice gain a foothold, the lowest common denominator often sets the tone. So, what starts out as digital Eden devolves into a digital hell.”
He gave an example of how LinkedIn users bypassed writing detailed personal profiles and instead maximized their on-site connections. He says: “however hard I had anticipated it might be to convert emotions like pride and greed into more productive behaviors and aspirational identities, it has proven to be even harder in practice. To do so requires a deliberate, judicious, and consistently well-executed commitment to leadership and governance.”
For example, when LinkedIn launched its publishing platform, it gave authoring rights only to 150 top leaders so they could set authoring norms for the entire community. “So, when we opened up the platform to all our users after seventeen months of influencer-only usage, positive and healthy norms had been established.”
He concludes: “internet platform builders have an opportunity and an obligation to convert humanity’s negative passions into more positive outcomes. By continuously innovating and renovating the incentive structures and power structures that define these communities, we can both empower individuals and transform the madness of the masses into the prosperity and well-being of the crowd.”
He addresses three lessons from the battles over openness:
- “relatively few dedicated people, with the support of the government, can create an open environment for breathtaking economic and cultural flowering”
- it’s hard “to make protocols work as well as walled gardens, and that it requires many dedicated people who are willing to sit out the gold rush of venture capital, acquisitions and patents”
- “we will need dedicated government support, philanthropy and institutions in order to protect public access and libraries”
“[D]isinformation professionals are really good at taking a good thing and building a profitable lie around it, sometimes in order to manufacture a culture war…I feel I failed my community by not preparing for this from the beginning.”
She highlights two lessons:
- “Sunlight is not the best disinfectant when it comes to hate. It just reinforces power structures that already exist and favors young, cisgender white men….Allowing all speech reinforces the uneven playing field and makes it unsafe for people who are less powerful, have less of a voice and less of a platform on which to speak up.”
- “The fear that platform employees have of their users drives a lot of decision making.”
Her solutions:
- “Executives, companies and board members should be held accountable for their actions and inactions in preventing the harm that we see being caused by lax policies or poor implementation and enforcement.”
- “We need leaders with empathy for people who are experiencing harassment.”
Instead of submitting an essay, I had an hour-long virtual conversation with Matthew Prince. You can watch the video.
Some of the most interesting parts IMO:
- “when we have made decisions in the past to shut down particular customers, one of the very first things that we get are very legitimate, responsible, large platforms calling us and saying, ‘Well, how can we be sure that you’re not going to do the same thing to us someday?'”
- “I think there are some really significant consequences, where you deny fundamental Internet technologies for all but the most egregious content. It should almost never get to the point there where you say, ‘DNS is cut off for everyone,’ or, ‘Domain registration is cut off for everyone.’ That seems like the foundation on which all of the rest of the Internet is built, and it seems very dangerous if we start tinkering with that on a policy or editorial basis.”
- Regarding the Daily Stormer, “It was not an organization we were proud to have using our services. But again, if we thought about it from the perspective of if Cloudflare ran the entire Internet, should they be on the Internet or not, it felt like that was a pretty tough call for us to be making….the Daily Stormer people just did some of the most repugnant things that you can imagine doing as a human being. If you’re going to fire any customer, firing Neo-Nazi customers is really fun.”
- More on the Daily Stormer: “one of the things that was the final determination in us making the decision to kick this particular customer off was that one of our large customers said, basically, “It’s them or us.” That was not a comfortable conversation because we’re a business, and we have shareholders, and we have responsibilities to them, and at the end of the day, sacrificing someone who’s paying us nothing and is repugnant for someone who’s paying us a lot and is a good organization––we didn’t love being in that situation, it put us in a hard place. What’s interesting is the general counsel of that organization, about six months later, called me back and said, ‘I owe you an apology.’ I said, ‘What are you talking about?’ and this person had just lived through another situation––this was a software company, and they had provided services in a way that a certain group of people found offensive, and it blew up in their face. The general counsel said, ‘I thought that this was really straightforward, but now having lived through it myself, I see that this is incredibly complicated.'”
- “One of the things that’s tricky about all this is that the Internet’s still in its absolute infancy, and it took us a really long time to figure out what were the norms around the printed word, what were the norms around radio, what were the norms around television, what were the norms around the telephone system, and then once you figure out what those norms are, what then are the appropriate laws that follow that? I think we’re still so, so, so early on that it’s not clear.”
- “it blows my mind that there isn’t a Fox News search engine… if you think of Facebook as the modern newspaper, it really is quite remarkable that there isn’t a conservative Facebook and a liberal Facebook”
- “one of the things that has become a priority for us now, which I wish had been a bigger priority earlier, is how do we figure out how to not only reduce, but make our impact on the environment literally negative? The Internet burns a huge amount of resources and energy resources, a lot of which is just wasted, and we have only recently realized how important that is. I think we’re doing a lot of things to make our carbon footprint negative.”
- “what’s been amazing about the Internet is it basically took the US approach to freedom of expression and exported it globally, which obviously has been very disruptive to a lot of businesses and a lot of institutions around the world. I think that, unfortunately, the world is not going to continue to accept the US view of Internet regulation going forward, and even the US might not accept the US view of Internet regulation.”
- “watching what happens in India and where India goes is something that we’re spending more time thinking about, and I would encourage people who are interested in the future of Internet regulation to spend time watching what happens there.”
- India’s Internet policy is “a result of imposing what was a radically libertarian view of freedom of expression on a world that doesn’t necessarily accept it.”
- “while there have been some real challenges the Internet has created, none of us can underestimate the amount of good that it’s done, and it’s actually important for us to all continue to remind people that. Can you imagine how much worse this pandemic would’ve been if it had happened just ten years earlier?”
- “whenever you write about the Internet, no matter what the AP says, capitalize it, because if I had to point to a moment in a time when it all started to go wrong, it was when AP said––I think in 2016––that you could now lowercase the Internet. I think what’s amazing about the Internet is that it’s a network of networks, and there is only one. So, I think it being a proper noun is important….if you care about the Internet, capitalize it.”
Three major lessons they learned:
1) “WE WOULD HAVE WORKED HARDER TO ADAPT TO NEW REGULATION, RATHER THAN RESISTING IT.”
“We’ve clearly entered an era of rapidly expanding global technology regulation—and that’s not necessarily a bad thing….We believe it is possible to both adapt to new rules and innovate successfully.”
2) “WE WOULD HAVE RECOGNIZED EARLIER THAT HEALTHY JOURNALISM IS CRITICAL TO THE DEMOCRACIES IN WHICH OUR BUSINESSES THRIVE.”
“We’ve seen time and again the value of trustworthy reporting and we appreciate the role journalism holds in a healthy democracy—and the risk that comes when it’s undermined. Knowing what we do now, we are proactively supporting journalism in several ways.”
3) “WE WOULD HAVE PARTNERED MORE BOLDLY TO HARNESS TECHNOLOGY TOWARD EQUITABLE OUTCOMES FOR PEOPLE AND THE PLANET.”
“We must engage boldly not only in corporate and social responsibility, but also in technology and social responsibility.”
He discussed Snapchat’s role in encouraging young people to vote.
Nirav Tolia (did you see him on Shark Tank on Friday night? Highlight 1. Highlight 2.)
Another 80-minute interview with me instead of an essay. Watch the video. Note: Nirav was my boss when I was general counsel at Epinions, and I did some legal work for him at Nextdoor in the early days.
Some of the most interesting parts IMO:
- Nirav talked about structure, incentives, and reputation as levers for managing the gathering, organizing, and presentation of user-generated content.
- “with structure, you can use forms, you can require certain things like a minimum word count. You can ensure that the formatting of the submissions and the intent of those submissions ends up being high quality and uniform, at least that was the theory. So unlike Amazon at the time, where you could write anything you wanted as a customer review on a product or service, at Epinions you would show up, there would be a subject, there would be a title, there would be a description, there would be the body, there would be a minimum word count, we would have a spellchecker built in, all of these things that, at the end of the day, were about uniformity and quality.”
- “The second thing is incentives. We thought a lot about why people do this. And ultimately, I think this is the one that we got the most wrong. We decided to pay people for content….The incentive system then had a kind of witty little approach, which is we would pay you according to how popular your review was. And we would measure that popularity by how many times it was read….this led to all kinds of things like reading circles, where reviewers would read each other’s reviews. There would be bots that would be created so that it would look like people were reading the reviews and they really weren’t, I mean, it ultimately was the wrong set of incentives.”
- “we had this idea of reputation where you didn’t have to use your real name, and it’s not just that you didn’t have to use your real name, you didn’t use your real name. However, you would build a reputation based on the number of people that would follow you or be your fans or read your reviews…Because we felt like creating these fan bases would be valuable. And it was, except for when people started to say, “If you scratch my back, I’ll scratch yours.” And that created all kinds of strange dynamics within the marketplace.”
- “we’ve now created a world where everyone feels like they have a voice––everyone. That wasn’t the case when Epinions started. The idea of Epinions, this idea that there were amateurs who had expertise, that was something that people were a little puzzled about. No one’s puzzled by that today. In fact, we’re at the other extreme, there are people you’ve never heard of that are the influencers, right? They have no credentials, they have no qualifications, they just have large fan bases.”
- “if I were creating the next version of Epinions, now twenty years later, I would say to myself, “Without ignoring the fact that we’re in a mobile world where people have short attention spans, how can I still raise the quality bar?” So maybe there’s a different approach. Maybe something goes into some kind of beta area before it gets pushed out and is read by everyone. But I do think it would be very difficult to swim against this tide of short-form content in snippets on a mobile phone. Extremely difficult. The closest I’ve seen that people have done that is the emergence now with user-generated content and online community of newsletters. The most important example I can think of, of longer-form, high-quality content that still creates online community and is, in most cases, created by what we might think of as amateurs or everyday people versus professionals is things like Substack, which are quite interesting to look at today.”
- “some of the greatest review writers in Epinions’ history, they were not professionals. They were not looking to quit their jobs and write professionally. Now maybe Epinions made the mistake of paying them so much that they thought they could do that, but that was not their intention in joining the service. Their intention was to share their expertise. But then we messed everything up by deciding to pay people for sharing their expertise. Where, in most cases, and you see this in Wikipedia and you see this all over the web today, people are perfectly happy to share their expertise because it gives them joy. That’s the currency that they’re receiving. That’s the incentive. The incentive is that they are sharing their expertise with the world. They’re helping people. And then, if it gets to be big enough, they find ways to monetize that. But I feel like Epinions taking the central role in that monetization, that was a pretty big mistake.”
- “You don’t use your real name on Wikipedia. You don’t really get attribution on Wikipedia. You have to go behind the curtain to understand who wrote these pages. And I would say less than 1 percent of the people who enjoy Wikipedia every single day actually go behind that curtain. They don’t click through, they just read the content. So, Wikipedia said, “Oh, your reputation is not important to us. What’s important is the reputation of Wikipedia. So, if you’re up for contributing and making sure that the reputation of Wikipedia is here, then you can contribute, if not, we’re not interested.” So maybe an interesting topic to consider is if Wikipedia is the model for a high-quality output of online community and user-generated content.”
- “‘trust’ is a word that we used at Epinions, and it’s such a loaded word….ultimately it led to what we think of as, again, this kind of circle of trust, not web of trust, where people would say, “Hey, look, I will say that I trust you if you say you trust me.” And then we’ll sort of do this quid pro quo kind of thing. And then, all of a sudden, we can’t tell where the trust really is.”
- “what would we do differently? Well, maybe you would get twenty points of trust and you could only give out twenty. And after you gave out twenty, after you trusted twenty people, you can’t trust anyone else. And maybe you can’t trust anyone until we know that you’ve read thirty pieces of their content. Or you have to write why you trust someone.”
- “Expressions like “web of trust,” those were very novel at the beginning, and we loved them because they were catchy, and people responded to them. But when we realized they weren’t quite right, it was very difficult, if not impossible to put the genie back in the bottle…we had succeeded beyond our wildest dreams in one way, which is our users felt like Epinions was theirs. But then it was almost impossible to change those things.”
- Regarding Nextdoor: “growth is the lifeblood of any consumer internet company, particularly a social media service. And, from the very beginning, think about the friction we created, no discoverability on search engines or across the web, no ability to join and even see the service until you verified your address. And finally, no ability for membership to spread from one neighborhood to another, in a viral way. Doing these things…was just counter to all the best practices. And we weren’t doing these things to be stubborn, or because we felt like it’s just better to think differently or something like that, we were doing these things very specifically because we felt like we needed to build a system that was based on trust. And the way we could do that was to as well as possible understand what goes on in the real world and then create an online environment that proxies those customs. So that was really the beginning of Nextdoor, which was very different than anything that had been created until then.”
- “I would say, of all the things that I’m most disturbed by that occur on Nextdoor, racial profiling’s at the top of the list. But the other side of the coin is all the work that we’ve done to build this online system, the work that we’ve done around trying to reduce racial profiling, is some of the work that I’m proudest of.”
Nirav’s conclusion:
We believe that technology is a force for good, and it can be. You just have to be thoughtful about the application and you have to be committed to sticking with it. Because, as we talked about, the stakes are really high and there are people out there who are acting in their own self-interest. And, instead, what we need to try to communicate to them, and what we need to try to build in these systems, is to make it easy to act in everyone’s self-interest, in our collective self-interest. And the reason community is so important to me is because, to me, the definition of community is being part of something that’s larger than ourselves.
Some standout parts of her essay:
- “Looking back at nearly twenty-five years working in the tech sector, I remember that early period imbued with a sense of freedom and possibility. We believed the broad reach of the internet would empower previously unheard or ignored minorities. It would improve democracy and circumvent authoritarians. New modes for connection and sharing would bring us closer together. In this new landscape, some, like Barlow, believed we would not need nor want governments to regulate the internet. Much of this has been true, and still, it seems terribly naive now. In part, this is because the early internet was not really connecting all the people of the world.”
- “We need nuanced legislation that restrains large tech players, punishes exploitative practices and lets nascent competitors thrive. We must figure out how to fund a healthier ecosystem, one that does not rely on exploiting people’s data, attention and worst instincts. We need to adopt and build on technical design principles that support such an ecosystem. Most of all, we must demand more of our public discourse, and work to shift and align our norms of communication in a global community.”
- At Google, “we thought the worst thing that could happen—the greatest threat to internet companies and the communities they serve—was to be blocked in a country or prevented from making our platforms and their content available freely. What we’ve discovered in the last few years, however, is that being blocked may not be the worst thing that can happen. Being turned into a weapon against our users and against our own government may be worse.”
- “I did not foresee the broad and coordinated weaponization of these open and free spaces that we built and advocated for. For a period, the bad actors could be managed or minimized. But, over time, these spaces have become playgrounds for trolls. They have read our terms of service, and they come right up to the line of acceptable behavior and then dominated these platforms for the lulz. The ecosystem that we hoped would encourage the vulnerable to speak freely and help communities gather around common interests is used by bad actors to bully, harass, threaten and take up so much space on the platform that they push other users off the service.”
- “these spaces have been infiltrated by malicious state actors and self-identified insurrectionists. They use the same trolling techniques, not just for entertainment, but to undermine our institutions, our communities and our trust in one another, in known facts and in our democracy.”
- “we in the tech community must be honest about the troubling shift in how our platforms and products are used. We should not minimize the dangers created on these platforms that we enable or encourage. We likewise should not overstate what we can solve or make better. We should work urgently and creatively to build a better ecosystem.”
- In the mid-2000s, “The pillars shifted to personalization (tailoring content to what we already know about the user), engagement (encouraging and measuring our success and profit based on how long a user stayed on the service) and, once again, speed. As we now know, the combination of these new pillars has been a rocket ship for the most inflammatory, polarizing content. It is a toolset for manipulation. We won’t change the nature of today’s internet and create healthy conversations just by taking down more content, having more rules or being more transparent about how we handle complaints. All of those practices are good, but none of them fundamentally change the product that is operating as designed: encouraging engagement in part by promoting highly viral content, filtering for personalization so users narrow rather than expand their world, and fetishizing the speed at which we deliver information.”
He says: “no law in a generation has done more than Section 230 to build online communities and promote informed, productive and equitable communication online. Section 230 was instrumental in fostering new avenues where regular Americans, particularly those from marginalized communities, could speak and be heard…. I don’t regret that Chris and I succeeded in protecting free speech online, then and now.”
He wishes he/we had been more proactive on broadband availability, competition, and privacy. In particular:
- “Meaningful access to the internet also means ensuring that the digital world is accessible to Americans with disabilities and complies with the Americans with Disabilities Act, in the same way that physical infrastructure must.”
- “I wish my colleagues and I had secured strong protection for consumers’ personal information at the dawn of the internet era.”
- “I will fight tooth and nail to protect strong encryption, to ensure private communications stay secure from hackers, stalkers and other criminals.”
He concludes: “Americans must not accept the perverse bargain that a healthy national discourse requires sacrificing free speech.”
* * *
The essay package is a rich resource filled with countless insights. It’s also a great historical snapshot. I encourage you to read the whole package.
Meanwhile, want more of this? Come to the symposium and hear really smart and provocative commenters discuss their insights from the essay package.