CAMBRIDGE, Mass.--Harvard Law and Berkman Center scholar Yochai Benkler and Wikipedia founder Jimmy Wales deconstructed Wikipedia and discussed peer production models at an event here Thursday.
Benkler, who is the Jack N. and Lillian R. Berkman Professor of Entrepreneurial Studies at Harvard Law School and co-director of the Berkman Center, was participating in a program marking the Berkman Center's 10th anniversary at the Harvard Law School (see my earlier coverage of the conference). Wales is a Berkman Fellow whose research looks for ways to help groups come to better decisions.
During his remarks, Wales outlined what makes Wikipedia different, in light of the perception that the world's most-relied-upon information resource is counterintuitive. The following are notes from his remarks at the session (in his voice):
There were a lot of mistakes made in the early social design of the Internet. Unmoderated Usenet groups made it difficult to control or exclude bad behavior. That gave the Internet a bad name in some circles, and the spam, trolls and flamewars it bred still exist today.
Given that background, which brought out the worst in people, the assumption was that a community has no means to self-regulate, so you end up with a top-down police state to manage it.
The idea that anyone could edit anything at any time seemed to make it obvious that most people are horrible and that open editing would make the Internet worse. I've used the analogy of a restaurant. Imagine you've been given the task of designing a restaurant that serves steak. Customers will have access to knives, and people with knives might stab each other, so you need to keep everyone in cages. That model makes for a bad society, and it reflects a view of human nature we mostly avoid, except at the airport.
People get the idea that the only way to design a social space is with top-down control. Wikipedia is more like a real restaurant--people go in and eat and don't start stabbing each other, and there are tools and institutions to deal with misbehavior.
The technology allows us to have a space that is safe and you can block the worst offenders. But how does neutrality fit into this?
Neutral Point of View (NPOV) is absolute and non-negotiable in Wikipedia. The problems come up in obscure topics, such as Japanese anime. For common topics, people come together and make a decent statement of what the topic is. It turns out what is really important is that participants have a shared vision of what they want to accomplish.
Mutually-assured destruction is inherent in Wikipedia. People who want to push an agenda end up having to write "for the enemy" rather than to those who share the same bias. Most people are pretty reasonable, but you don't get that sense from TV where they put up two people on opposite sides. Most people are in the middle and aware of pros and cons of issues.
We are really strongly focused on consensus. One criticism of Wikipedia is that the majority rules, but majority is not the right way to describe it. We strive for consensus rather than majority rule. If you are in a group of five or ten people working on an article and 30 percent of them dissent, you continue working on the article until all but the most unreasonable agree. Those who continue to disagree are typically exhibiting non-collaborative, and sometimes abusive, behavior.
As Wikipedia has grown, subcommunities have emerged, and there is a risk that interactions shift from small groups to more atomistic exchanges among random people. In that setting it's a lot harder to maintain civility--it's a lot harder to be rude to people you know.
Concerns about companies working on their own entries are mostly overblown. We see that a lot more from small mom-and-pop companies trying to get an article on Wikipedia. A lot of communications professionals understand that interacting with social media requires accepting its norms.
I definitely think we have a problem with the amount of tradition and jargon. People trying to change their biographies, for example, have found their changes reverted with cryptic codes as the only explanation. People should not be required to become expert Wikipedians to join the conversation. It gets really hard when there is too much jargon.
How does Wikipedia decide what is published? The community decides on a case-by-case basis. Wikipedia has gotten bigger in two ways. First, the sheer size of the work: when we started out we were covering people like George Bush and Michael Jackson, who are so famous they don't care. Second, we have become very prominent in search engines like Google, so it actually matters to people. Because of those two factors, the community is much more mindful of reflecting thoughtfully on these questions. We look for reliable sources--verifiability. Someone could start an article about their mother, but if she is not well known, no one can verify it, so we can't have the article. We also look at the question of human dignity. One of the rules for biographies is that if a person is notable only for one event, and perhaps did some bad thing in a unique, odd incident, we typically try to have an article about the event, not the person.
Intentional vulnerability is really important. Sometimes it's reported that a Wikipedia page was "hacked," which we chuckle at--it doesn't take advanced computer skills to hack Wikipedia. Put a curse word on a page and we fix it in one or two seconds, so it's not that thrilling. We do actually lock the front page, though.
Wikia and Wikipedia are completely separate; the only link between the two is that Wales founded both. Like Mozilla, which makes money from Firefox, Wikipedia could follow that model, but nobody is thinking about it that much.
Given enough time, humans will screw up Wikipedia just as they have screwed up everything else, but so far it's not too bad.
As you would expect, Benkler took a more academic approach to deconstructing Wikipedia. "Ten years ago we would not have had this conversation," he said, referring to the rapid changes in the last decade. "We are moving generationally from 90s of imagining the world and projecting hopes and fears to a more detailed analysis, moving beyond hoping to organizing our research and getting large scale data and new modes of analysis."
The author of The Wealth of Networks, Benkler said he had been studying Wikipedia since it was four months old. He compared it to the Encyclopedia Britannica, which represents the "structure of authority over knowledge," rather than the process of conversation and human interaction found in Wikipedia. He noted that Wikipedia has moved from being a quirky sideline to something mainstream.
"Encyclopedia Britannica is a stable view of knowledge embedded in a human relation and legal system. It was challenged by a much more loosely coupled system that allows for much greater change and unpredictability, and requires more learning and critique," Benkler said. It requires the freedom to change, the will to engage and a certain cooperation dynamic, he added.
Benkler concluded that a very different model of human motivation is needed, one much more capable of cooperation. Developing it will require drawing on many disciplines, including experimental economics, game theory, and organizational sociology. "We are building systems loosely coupled because we can't design perfect systems. We have to allow freedom as a practical human agency designed for cooperation to replace rational actor model with something much more rich and close to way the conversation is," Benkler said.