
For the past sixteen months, the Wikimedia Foundation has been having uncomfortable conversations about how we handle controversial imagery in our projects — including, a few weeks ago, the staging of a referendum on an image hiding feature requested by our Board. The purpose of this post is not to talk specifically about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues. The purpose of this post is to step back and assess where we’re at, and to call for a change in tone and emphasis in our discussions.

Please note also that due to the nature of the topic, you may find yourself offended by this post, and/or the materials linked from it.

In March 2010, editors on the German Wikipedia ran a poll asking their colleagues whether they would support a rule restricting the types of material that could appear on the German home page. Thirteen voted in favour of restrictions, and 233 voted against. A few weeks later, the German Wikipedia featured the article about the vulva on its home page, which included a close-up photograph of an open vagina. Twenty-three minutes after the article went up, a reader in Berlin wrote “you can’t be serious?!” and called for the image to be taken down. This initiated an on-wiki discussion that eventually reached 73,000 words – the length of a shortish novel. It included a straw poll in which 29 people voted to remove the image and 30 voted to keep it. The image was kept, and the article remained on the front page for its full 24 hours.

A few months later, in June, the Wikimedia Foundation Board of Trustees began to discuss how the Wikimedia community was handling controversial imagery. Why? Because some people seemed to be using Commons to stockpile commercial porn; because the German community had put a close-up photo of a vagina on its homepage; and because upskirt photos and controversial editorial cartoons were being categorized in ways that seemed designed to be provocative, and the people who complained about them were being shot down.

The Wikimedia Foundation was concerned that a kind of market failure might be happening — that the Wikimedia community, which is generally so successful at achieving good decision quality through a consensus process, was for some reason failing to handle the issue of controversial material well. It set out to explore what was going on, and whether we needed to handle controversial imagery differently.

That triggered community members’ fears of censorship and editorial interference. And so we find ourselves today, sixteen months later, locked in angry debate. At a meeting in Nuremberg a few weeks ago, German Wikipedian User:Carbidfischer furiously denounced our Board Chair Ting Chen. The other day –as far as I know for the first time ever– somebody called someone else an asshole on one of our mailing lists. User:Niabot created this parody image. It’s unpleasant and unconstructive, and if you’re familiar with transactional analysis, or with the work done by the Arbinger Institute, you’ll recognize the bad patterns here.

The purpose of this post is to figure out why we aren’t handling this problem well, and how we can get back on track.

So: backing up.

Is there a problem with how the Wikimedia projects handle potentially-objectionable material? I say yes. The problems that led the Board to want to address this issue still exist: they have not been solved.

So what’s the solution? I have read pages upon pages of community discussion about the issue, and I sympathize and agree with much of what’s been said. Wikipedia is not, and should never be, censored. It should not be editorially interfered with.

But refusing censorship doesn’t mean we have no standards. We make editorial judgments every day, when we assess notability of topics, reliability of sources, and so forth. The German Wikipedia in particular is known for its extremely rigorous standards.

So why do we refrain from exercising editorial judgment on this one issue?

I think there are two major reasons.

First, we have a fairly narrow range of views represented in our discussions.

We know that our core community represents just a sliver of society: mainly well-educated young men in wealthy countries, clustered in Europe and North America. It shouldn’t surprise us, therefore, that we skew liberal/libertarian/permissive, especially on issues related to sexuality and religion. Our demographic and attitudinal narrowness is a shame, because at the heart of the projects is the belief that many eyes make all bugs shallow – and yet we’re not practicing what we preach. Instead, we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with. People who believe other things fall silent, abandon the conversation, or are reduced to impotent rage. Or, even likelier, they never made it to the table in the first place.

Second, we are confusing editorial judgment with censorship.

Censorship is imposed from outside. Editorial judgment is something we do every day in the projects. Applying editorial judgment to potentially-objectionable material is something that honourable journalists and educators do every day: it is not the same as censorship, nor does it constitute self-censorship.

In newsrooms, editors don’t vote on whether they personally are offended by material they know their readers will find objectionable, and they don’t make decisions based on whether the angry letters outnumber the supportive ones. They exercise empathy, and at their best they take a kind of ‘balance of harm’ approach — aiming to maximize benefit and minimize cost. Their job is to provide useful information to as many people as possible, and they know that if people flee in disgust, they won’t benefit from anything the newsroom is offering. That doesn’t mean newsrooms publish only material that’s comfortable for their readers: it means they aim to exercise good judgment, and discomfit readers only when –on balance– discomfort is warranted.

How does that apply to us? It’s true that when people go to the article about the penis, they probably expect to see an image of a penis, just like they do when they look it up in a reference book in their public library. It’s also true that they probably wouldn’t benefit much from a gallery of cellphone camera shots of penises, and that’s why we don’t have those galleries on our articles. In lots of areas, we are currently doing a good job.

But not always.

When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic. When an editor writes “you can’t be serious?!” after vagina,anus,perineum_(detail).jpg is posted on the front page, the response shouldn’t be WP:NOTCENSORED: we should have a discussion about who visits the homepage, and we should try to understand, and be sensitive to, their expectations and circumstances and needs. When we get thousands of angry e-mails about our decision to republish the Jyllands-Posten Muhammad cartoons, we should acknowledge the offence the cartoons cause, and explain why, on balance, we think they warrant publication anyway. None of that is censorship. It’s just good judgment. It demonstrates transparency, a willingness to be accountable, and a desire to help and serve our readers — and it would earn us trust.

I believe that in our discussions to date, we’ve gotten ourselves derailed by the censorship issue. I know that some people believe the Wikimedia Foundation intends to coercively intervene in the projects, in effect overruling the judgment of the editorial community. I don’t see it that way, I regret that others do, and I dislike the ‘authoritarian parent / rebellious adolescent’ dynamic we seem to be having trouble resisting.

Wikipedia is not censored. It should never be censored. That doesn’t relieve us of the obligation to be thoughtful and responsible.

So: what needs to happen?

We need to have a discussion about how to responsibly handle objectionable imagery. That discussion doesn’t need to happen with the Wikimedia Foundation (or at least, not solely with the Wikimedia Foundation). The projects should be talking internally about how to avoid unnecessarily surprising and offending readers, without compromising any of our core values.

Those community members who are acting like provocateurs and agitators need to stop. Demonizing and stereotyping people we disagree with pushes everyone into extremist positions and makes a good outcome much less likely. We need to look for common ground and talk calmly and thoughtfully with each other, staying rooted in our shared purpose. Some editors have been doing that throughout our discussions: I am seriously grateful to those people, and I wish others would follow their example.

“Wikipedia is not censored” is true. And we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.

We need to set aside anxieties about who’s in charge, and quit fighting with each other. We need to be aware of who’s not at the table. We need to bring in new voices and new perspectives that are currently lacking, and really listen to them. Those community members who’ve been afraid to talk need to speak up, and those who’ve been driven away need to come back.

The purpose of this post is to call for that responsible engagement.

Like I said at the top of this post, my purpose in writing this is not to talk about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues.

Tonight I went to see historian Timothy Garton Ash talk with his friend Tobias Wolff at Stanford. The occasion was the publication of Timothy’s newest book, a collection of essays and reportage loosely built around the idea that “facts are subversive.” Timothy’s premise seems to be –roughly, loosely– that people in power are often trying to construct narratives in support of a particular economic, political or cultural agenda, and that facts –even very small ones– can sometimes trip that up.

One thing they talked about was honesty in memoirs — for example, Mary McCarthy’s 1957 autobiography Memories of a Catholic Girlhood, in which McCarthy disarmingly confesses that “the temptation to invent has been very strong,” and that “there are cases when I am not sure myself whether I am making something up.” And George Orwell’s Homage to Catalonia, in which Orwell wrote:

I have tried to write objectively about the Barcelona fighting, though, obviously, no one can be completely objective on a question of this kind. One is practically obliged to take sides, and it must be clear enough which side I am on. Again, I must inevitably have made mistakes of fact, not only here but in other parts of this narrative. It is very difficult to write accurately about the Spanish war, because of the lack of non-propagandist documents. I warn everyone against my bias, and I warn everyone against my mistakes. Still, I have done my best to be honest. (1)

This brought into focus for me something I’ve long half-recognized — both in my own experiences of reading Wikipedia, and the stories people tell me about how they use it themselves. Article after article after article on Wikipedia is studded with warnings to the reader. “This article needs references that appear in reliable third-party sources.” “This article needs attention from an expert on the subject.” “This article may be too technical for most readers to understand.”  On this page, you can see 24 common warning notices — and there are many, many more.

And I think that’s one of the reasons people trust Wikipedia, and why some feel such fondness for it. Wikipedia contains mistakes and vandalism: it is sometimes wrong. But people know they can trust it not to be aiming to manipulate them — to sell them something, either a product or a position. Wikipedia is just aiming to tell people the truth, and it’s refreshingly honest about its own limitations.

Tobias Wolff said tonight that sometimes such disclaimers are used manipulatively, as corroborating detail to add verisimilitude to text that might otherwise be unpersuasive. I think that’s true. But in the case of Wikipedia, which is written by multitudes, disclaimers are added to pages by honest editors who are trying to help. They may not themselves be able to fix an article, but at the very least, they want to help readers know what they’re getting into. I like that.

(1) I looked that up on Google Books when I got home. Yay, Google Books!

I stumbled recently across sociologist Gary Marx’s documentation of tactics covertly used by external parties to hurt or help social/political movements [1].

Like, for example, the FBI’s attempts to discredit Martin Luther King Jr. by painting him as a womanizer. Or the CIA’s 1967 Operation CHAOS, designed to monitor the student antiwar movement. Or the FBI’s attempts under COINTELPRO in the late sixties to undermine what it called “black nationalist hate groups” by inciting rivalries among them.

I’m kind of a categorization geek, so I liked Marx’s crisp table of the ways in which folks have aimed to covertly undermine the movements that they found threatening. By investigating and harassing participants, and discrediting leaders. Fomenting internal conflict: encouraging jealousy, suspicion, factionalism and personal animosity. Spreading damaging misinformation. Undermining morale and thwarting recruitment efforts. Undermining activities that generate revenue. Encouraging hostility between the movement and its potential allies and partners. Creating similar organizations that compete for resources and public mindshare. Sabotaging events and projects. And so forth.

Reading all this, I started thinking about Wikimedia, which is of course a sort of social movement. Our goal is to make information easily available for people everywhere around the world – free of commercialism, free of charge, free of bias. That’s a radical mission.

Given that, it’s interesting to look at how external entities have responded to Wikipedia’s extraordinary success – especially those who have reason (or think they might have reason) to feel threatened by it.

So for example, the media. Conventional media business models are crumbling, and media organizations are struggling to persuasively articulate their value proposition. Some see Wikipedia as a competitor. So it doesn’t surprise me that –with a fervour that can border on the obsessive– some media talk so relentlessly about why Wikipedia can’t succeed, and make predictions about how quickly, and in what manner, it will fail. Cultural and educational and PR organizations have less of a megaphone, but apart from that their initial responses have been pretty similar. [2]

None of that is surprising. What has surprised me, though, is the other side of the balance sheet.

Marx posits a world in which detractors work against a social movement, and supporters work in favour of it.

At Wikimedia, we’ve had our share of detractors. But I’ve found myself more surprised by the other side — surprised that Wikimedia’s most articulate and passionate supporters –its core editors– don’t do more to promote its success.

Here are some of the things Marx says people can do to support social movements:

  • Work to create a favourable public image for the movement
  • Support participants and help recruit new participants
  • Help with effective communications
  • Support revenue-generating activities
  • Build and sustain participant morale
  • Build and support leaders
  • Encourage internal solidarity: support kindness, understanding, generosity and a sense of common purpose
  • Encourage external solidarity: support the development of common cause between the movement and its potential allies and partners
  • Support movement events and projects

I want to be clear: lots of Wikimedia editors (and other supporters) do this work. We have a communications committee which is sometimes remarkably effective. The Wikimedia network of international chapters is excellent at outreach work – particularly the German chapter, which pioneered the Wikipedia Academy concept, and lots of other initiatives. Editorial and movement leadership emerges almost entirely organically at Wikimedia, and I have seen it warmly and enthusiastically supported. And we have some really terrific editors working tirelessly to develop strategic partnerships with cultural and educational institutions. So there is lots of good work being done.

But even so: sometimes when I read our mailing lists, I laugh out loud at how Wikimedians can be our own worst enemies. We subject each other to relentless scrutiny — criticizing our own leaders and supporters and activities, monitoring, speculating, worrying, and poking and prodding each other. All, frequently, in public.

I’ve been trying to figure out why we’re like this. And I think there are two main contributing factors. The first is that Wikipedians are engaged first and foremost in building an encyclopedia, and knowledge workers of the encyclopedia-writing type are famously fussy, fastidious, fact-obsessed and scrupulous about neutrality. So it makes sense that neutrality is a value that extends to our communications about the Wikimedia projects. We don’t want to shill for anybody, including, LOL, ourselves.

The second, though, is the disease of paranoia, which seems unavoidable in social movements. Anybody who’s committed themselves to advancing a cause, particularly as a volunteer –and who has only very limited control over the rest of their social movement– is vulnerable to paranoia. It makes sense: you’ve worked incredibly hard for something you care about a lot, without any expectation of reward, so of course you worry that others could destroy what you’ve accomplished.

(Lawyer and writer Bill Eddy tossed off a fascinating aside in his book High-Conflict People in Legal Disputes – to the effect that groups often instinctively elevate the most paranoid among them into leadership positions, essentially because although hyper-paranoid leaders may often mistake innocence for evil, it can at least be assumed that they will never do the reverse. As in Michael Shermer’s TED talk: better a false positive than a false negative that results in being eaten by a predator.) The upshot: social movements often exist in a kind of amplified state of vigilance, which is probably occasionally useful, but just as often is wasted effort, carries an opportunity cost, or is simply destructive.

Personally, I would like to see the core Wikimedia community better support itself and its own success.

[1] From Gary Marx’s chapter “External Efforts to Damage or Facilitate Social Movements: Some Patterns, Explanations, Outcomes, and Complications,” in the book The Dynamics of Social Movements, edited by M. Zald and J. McCarthy, Winthrop Publishers, 1979.

[2] I should be clear here. First, Wikimedia’s got lots of supporters — and we’ve always had strong supporters in traditional media. I don’t want conventional media to see Wikipedia as a threat, and I don’t think it is one: I think Wikipedia’s a useful complement, part of a balanced information diet. Second, everybody’s reaction to Wikipedia has gotten warmer over time, as Wikipedia has earned credibility. But the original systemic pressures haven’t changed: they are still what they always were.