Tuesday, March 07, 2006

The campaign against Wikipedia

(Editor's note: This post began as a news item at Xark!, but grew into a stand-alone essay.)

I first noticed this back in February while speaking about Web trends to a Public Relations/Business Communications class at a local college. When I asked about Wikipedia, everyone who spoke expressed a clear message: Wikipedia, to them, was not so much a resource as it was a threat.

Multiple students reported they had been told by their instructors not to use it -- ever*. Some spoke of professors who routinely threatened to punish anyone caught using it. And even my host allowed that her attitude toward the online encyclopedia was less than charitable.

Last night, speaking to a group of high-school journalists, I got similar responses. In these instances, I detect not only scholarly skepticism, but something more. Something bordering on scorn.

Like the college students from last month, these high schoolers knew that anybody could edit Wikipedia, though none of them expressed any understanding of how the system functions, what a wiki is, or how a community of editors becomes self-correcting.

It's a disconnect. I see Wikipedia as a way of thinking about information and virtual community. They see it as a free-for-all. Their teachers see it as anarchy.

I call this a backlash. Wikipedia came out of nowhere, fast, to become the largest encyclopedia in history. Some academics, who as a group are used to controlling such things, were horrified by the wiki concept -- and financially threatened by the open-source, free-info, non-profit model that keeps Wikipedia a living, growing document. Rather than checking it out further, a segment of academia appears to have united against it.

From a PR standpoint, the big blow came in December when John Seigenthaler wrote a widely publicized piece citing inaccuracies in a Wikipedia article on his life. The founding editorial director of USA Today called Wikipedia "a flawed and irresponsible research tool" in a column that reflected the attitude I've noticed among some academics -- that whatever else Wikipedia may be, it also is a sandbox for malcontents, anarchists and children who run around with scissors. In other words: Not For Serious Adults.

In the wake of the Seigenthaler column, big-name bloggers and technorati, including Dave Winer and Adam Curry, came out with their own criticisms. Making matters worse, Wikipedia founder (or, as the case may be, co-founder) Jimmy Wales got "caught" editing his Wikipedia bio and taking out references to his early collaborators.

Wikipedians got their say in the ensuing coverage, but from an outsider's perspective, it seemed like the Wikimedia Foundation -- a concept I dearly love -- was suddenly in public-relations damage control mode.

Here's my take:

The idea that Wikipedia is less accurate because it doesn't have top-down editorial control is, itself, inaccurate. A better question would be, When do we know it to be accurate? The whole concept of a collaborative information project is based on the idea that community collaboration will identify and correct errors -- in public. Traditional media also involves editing and fact-checking, but it does so before publication and without transparency. Yet traditional media routinely stumble when it comes to correcting the errors that slip past those all-too-human pre-pub controls.

"Does Wikipedia have errors?" isn't a meaningful question, but "what errors will an individual Wikipedia entry contain in the snapshot of time that I see when I call up the entry?" is a question that actually takes us somewhere.

Wikipedia asks that we correct the errors we see -- and, contrary to the popular stereotype of Wikipedia as an irresponsible Wild West of disinformation, Wikipedia as a process includes multiple feedback loops that address vandalism, inaccuracies, biased writing, etc. It assumes that people are adults.

The question, then, is not whether Wikipedia has editorial quality controls (it does), but whether those controls work fast enough.

That's an open-ended question (fast enough compared to what?), but I contend that a wiki-model encyclopedia will probably correct its errors far faster than a proprietary encyclopedia. My reasoning? Top-down, for-profit editorial control pays a few people to ride herd on a large range. It cannot mobilize as many corrective resources, as quickly, as the Wikipedia community can.

Plus, is Wikipedia really inaccurate? Again, accurate compared to what?

Nature decided to compare Wikipedia to Britannica, considered the Gold Standard of traditional encyclopedias. Its finding? In a survey of 42 science articles, Britannica was more accurate.

But how much more accurate? Not much. The Nature study found an average of four errors per Wikipedia entry... compared to three errors, on average, per Britannica entry.

The debate goes back and forth, with some Wikipedians contending that the average Wikipedia entry is 2.6 times longer than the average Britannica entry, then doing the math to produce a lower error rate: four errors spread across 2.6 times as much text works out to roughly 1.5 errors per comparable length, versus Britannica's three. Yada yada yada. I don't care. Framing this as a competition between Wikipedia and Britannica misses the point.

The more telling comparison is between Wikipedia and Google, because when you consider how I've come to use Wikipedia, it's as an alternative to general web search engines. Wikipedia is just as fast, far more relevant and much more accurate in the information it returns. Viewed as a form of curated search, Wikipedia looks a lot less threatening.

In this sense, Wikipedia is a through-point, not a destination. And, ironically, this was exactly how my teachers told me I was supposed to use an encyclopedia Back In The Day.

Not only are we comparing Wikipedia to the wrong standard and failing to understand it as a process and a community, we're also missing the most valuable points of the accuracy debate by taking Wikipedia out of its natural context: the larger Web. Dave Winer has been cited by Wiki-haters for his criticisms, but that's far from the complete picture. Consider this Scripting News post from December, in which Winer addresses the Seigenthaler case and the larger ethical question of who-should-edit-what (emphasis added):
Ross Mayfield sees the pros and cons of editing your bio page on Wikipedia. Here's my take on it. No, you must not edit your bio page, or any page about a topic in which you have an interest. It's impossible to disclose that interest, so the poor reader has no idea how to credit what's on the page. This is the weakness of Wikipedia, in fact of all wiki. But his point about the knowledge you have about yourself is an important one. Imho, the obvious answer is that your page, on your site, edited only by you, should be linked to from the equivalent Wikipedia page, in a consistent and prominent way. Your review of a page about something you're involved in is important, but it must be clear to the reader that they are reading something that's interested. Ultimately, this combination of wiki and blogging is going to be the answer. It's how Jimmy Wales will be able to tell us he doesn't think the stuff on his Bomis site was porn and how his Ferrari cost less than most SUVs, and how Adam Curry can tell you all about himself and edit everyone else out. Now the question is, who is qualified to edit the Wikipedia page?
That's a great question, with multiple possible "correct" answers. But Dave Winer's perspective demonstrates how holistic thinking trumps simplistic, out-of-context analysis. Let's see Wikipedia for what it is, what it can be, and how it fits into its environment, and encourage people to use it properly.

Exactly. So what if I can't cite Wikipedia the same way I would a static source? It's still immensely valuable to me.

Should we take what we find at Wikipedia at face value? No. Duh. But let's restate the question: Should we take ANY information we find, online or otherwise, at face value? Answers, please, on a post card.

Ultimately, the Wikipedia controversy, if it can be called that, is about how we feel about control. I know where I come down on that subject, and it's right where Jimmy Wales was when he spoke to USA Today in December: "'Any place where the general public is allowed to freely express their opinion without having any sort of prior approval from authority — it is dangerous,' Wales says. 'Free speech is dangerous. But it's also incredibly powerful and useful.'"


(* March 8 editor's note: I've been thinking about the sentence where I wrote that some students said they had been told not to use Wikipedia "ever," and I feel I should clarify this statement. I wasn't taking notes, but the more I think about that sentence the more I worry that it is misleading. They were certainly told not to use Wikipedia in the limited sense of citing it in a footnote. And at least one student mentioned that a teacher had told him that it was OK to go to Wikipedia so long as he didn't use any information he found there. My interpretation of their comments doesn't change, but I think the wording in my original sentence overstates the level of explicit prohibition. Restated, it would be this: they're free to read Wikipedia if they choose to do so, but they are not to use it in their assignments. -- dc)