
For the past sixteen months, the Wikimedia Foundation has been having uncomfortable conversations about how we handle controversial imagery in our projects — including, a few weeks ago, the staging of a referendum on an image hiding feature requested by our Board. The purpose of this post is not to talk specifically about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues. The purpose of this post is to step back and assess where we’re at, and to call for a change in tone and emphasis in our discussions.

Please note also that due to the nature of the topic, you may find yourself offended by this post, and/or the materials linked from it.

In March 2010, editors on the German Wikipedia ran a poll asking their colleagues whether they would support a rule restricting the types of material that could appear on the German home page. Thirteen voted in favour of restrictions, and 233 voted against. A few weeks later, the German Wikipedia featured the article about the vulva on its home page, which included a close-up photograph of an open vagina. Twenty-three minutes after the article went up, a reader in Berlin wrote “you can’t be serious?!,” and called for the image to be taken down. This initiated an on-wiki discussion that eventually reached 73,000 words – the length of a shortish novel. It included a straw poll in which 29 people voted to remove the image and 30 voted to keep it. The image was kept, and the article remained on the front page for its full 24 hours.

A few months later, in June, the Wikimedia Foundation Board of Trustees began to discuss how the Wikimedia community was handling controversial imagery. Why? Because some people seemed to be using Commons to stockpile commercial porn; because the German community had put a close-up photo of a vagina on its homepage; and because upskirt photos and controversial editorial cartoons were being categorized in ways that seemed designed to be provocative, while the people who complained about them were being shot down.

The Wikimedia Foundation was concerned that a kind of market failure might be happening — that the Wikimedia community, which is generally so successful at achieving good decision quality through a consensus process, was for some reason failing to handle the issue of controversial material well. It set out to explore what was going on, and whether we needed to handle controversial imagery differently.

That triggered community members’ fears of censorship and editorial interference. And so we find ourselves today, sixteen months later, locked in angry debate. At a meeting in Nuremberg a few weeks ago, German Wikipedian User:Carbidfischer furiously denounced our Board Chair Ting Chen. The other day –as far as I know, for the first time ever– somebody called someone else an asshole on one of our mailing lists. User:Niabot created this parody image. All of this is unpleasant and unconstructive, and if you’re familiar with transactional analysis, or with the work done by the Arbinger Institute, you’ll recognize the bad patterns here.

The purpose of this post is to figure out why we aren’t handling this problem well, and how we can get back on track.

So: backing up.

Is there a problem with how the Wikimedia projects handle potentially-objectionable material? I say yes. The problems that led the Board to want to address this issue still exist: they have not been solved.

So what’s the solution? I have read pages upon pages of community discussion about the issue, and I sympathize and agree with much of what’s been said. Wikipedia is not, and should never be, censored. It should not be editorially interfered with.

But refusing censorship doesn’t mean we have no standards. We make editorial judgments every day, when we assess the notability of topics, the reliability of sources, and so forth. The German Wikipedia in particular is known for its extremely rigorous standards.

So why do we refrain from the expression of editorial judgment on this one issue?

I think there are two major reasons.

First, we have a fairly narrow range of views represented in our discussions.

We know that our core community represents just a sliver of society: mainly well-educated young men in wealthy countries, clustered in Europe and North America. It shouldn’t surprise us, therefore, when we skew liberal/libertarian/permissive, especially on issues related to sexuality and religion. Our demographic and attitudinal narrowness is a shame, because at the heart of the projects is the belief that many eyes make all bugs shallow. And yet we’re not practicing what we preach. Instead, we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with. People who believe other things fall silent, or abandon the conversation, or are reduced to impotent rage. Or, even likelier, they never made it to the table in the first place.

Second, we are confusing editorial judgment with censorship.

Censorship is imposed from outside. Editorial judgment is something we do every day in the projects. Applying editorial judgment to potentially-objectionable material is something that honourable journalists and educators do every day: it is not the same as censorship, nor does it constitute self-censorship.

In newsrooms, editors don’t vote on whether they personally are offended by material they know their readers will find objectionable, and they don’t make decisions based on whether the angry letters outnumber the supportive ones. They exercise empathy, and at their best they take a kind of ‘balance of harm’ approach — aiming to maximize benefit and minimize cost. Their job is to provide useful information to as many people as possible, and they know that readers who flee in disgust won’t benefit from anything the newsroom is offering. That doesn’t mean newsrooms publish only material that’s comfortable for their readers: it means they aim to exercise good judgment, and to discomfit readers only when –on balance– discomfort is warranted.

How does that apply to us? It’s true that when people go to the article about the penis, they probably expect to see an image of a penis, just as they do when they look it up in a reference book in their public library. It’s also true that they probably wouldn’t benefit much from a gallery of cellphone camera shots of penises, and that’s why we don’t have those galleries in our articles. In lots of areas, we are currently doing a good job.

But not always.

When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic. When an editor writes “you can’t be serious?!” after vagina,anus,perineum_(detail).jpg is posted on the front page, the response shouldn’t be WP:NOTCENSORED: we should have a discussion about who visits the homepage, and we should try to understand, and be sensitive to, their expectations and circumstances and needs. When we get thousands of angry e-mails about our decision to republish the Jyllands-Posten Muhammad cartoons, we should acknowledge the offence the cartoons cause, and explain why, on balance, we think they warrant publication anyway. None of that is censorship. It’s just good judgment. It demonstrates transparency, a willingness to be accountable, and a desire to help and serve our readers — and it would earn us trust.

I believe that in our discussions to date, we’ve gotten ourselves derailed by the censorship issue. I know that some people believe the Wikimedia Foundation intends to intervene coercively in the projects, in effect overruling the judgment of the editorial community. I don’t see it that way, I regret that others do, and I dislike the ‘authoritarian parent / rebellious adolescent’ dynamic we seem to be having trouble resisting.

Wikipedia is not censored. It should never be censored. That doesn’t relieve us of the obligation to be thoughtful and responsible.

So: what needs to happen?

We need to have a discussion about how to responsibly handle objectionable imagery. That discussion doesn’t need to happen with the Wikimedia Foundation (or at least, not solely with the Wikimedia Foundation). The projects should be talking internally about how to avoid unnecessarily surprising and offending readers, without compromising any of our core values.

Those community members who are acting like provocateurs and agitators need to stop. Demonizing and stereotyping people we disagree with pushes everyone into extremist positions and makes a good outcome much less likely. We need to look for common ground and talk calmly and thoughtfully with each other, staying rooted in our shared purpose. Some editors have been doing that throughout our discussions: I am seriously grateful to those people, and I wish others would follow their example.

“Wikipedia is not censored” is true. And we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.

We need to set aside anxieties about who’s in charge, and quit fighting with each other. We need to be aware of who’s not at the table. We need to bring in new voices and new perspectives that are currently lacking, and really listen to them. Those community members who’ve been afraid to talk need to speak up, and those who’ve been driven away need to come back.

The purpose of this post is to call for that responsible engagement.

Like I said at the top of this post, my purpose in writing this is not to talk about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues.

In my downtime while travelling, I read about two years’ worth of Less Wrong, a rationalist community blog that Kat Walsh introduced me to. It’s a great read, especially for people who fall into what Less Wrong co-founder Eliezer Yudkowsky hilariously and aptly labels “the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd” – and there are a few posts I think are particularly worth calling to the attention of experienced, committed Wikimedia community members.

Here are four posts I think every Wikimedian should read.

1. How to Save the World lays out a rationalist approach to making the world a better place. My favourite –and the most applicable to us– is “identify a cause with lots of leverage.” In the words of the author:

It’s noble to try and save the world, but it’s ineffective and unrealistic to try and do it all on your own. So let’s start out by joining forces with an established organization who’s already working on what you care about. Seriously, unless you’re already ridiculously rich + brilliant or ludicrously influential, going solo or further fragmenting the philanthropic world by creating US-Charity#1,238,202 is almost certainly a mistake. Now that we’re all working together here, let’s keep in mind that only a few charitable organizations are truly great investments — and the vast majority just aren’t. So maximize your leverage by investing your time and money into supporting the best non-profits with the largest expected pay-offs.

2. Defecting By Accident: A Flaw Common to Analytical People lays out the author’s view that highly analytical people tend to frequently “defect by accident” – basically, they hurt their ability to advance their own agenda by alienating others with unnecessary pedantry, sarcasm, and disagreeableness. The author offers eight tips for behavioural changes to make accidental defectors more effective, and recommends three books for increasing influence and persuasive ability — including Robert Cialdini’s excellent Influence: The Psychology of Persuasion [1].

3. Why Our Kind Can’t Cooperate. A post that argues that yes, a group which can’t tolerate disagreement isn’t rational – but also that a group which tolerates only disagreement is equally irrational.

Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus. We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others’ arguments. Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society. No, we’re not losing because we’re so superior, we’re losing because our exclusively individualist traditions sabotage our ability to cooperate.

4. Your Price For Joining. This picks up where Poul-Henning Kamp’s Why Should I Care What Color the Bikeshed Is? leaves off, arguing that “people in the atheist/libertarian/technophile/sf-fan/etcetera cluster often set their joining prices way way way too high.” In the words of the author:

I observe that people underestimate the costs of what they ask for, or perhaps just act on instinct, and set their prices way way way too high. If the nonconformist crowd ever wants to get anything done together, we need to move in the direction of joining groups and staying there at least a little more easily. Even in the face of annoyances and imperfections! Even in the face of unresponsiveness to our own better ideas!

These are themes I think and write about a lot: collaboration, dissent, how groups can work together productively. I worry sometimes that Wikimedians think I’m hyper-critical and don’t see the strengths of our (argumentative, lively, sometimes ungenerous) culture. So to be super-clear: no! I very much value our culture, scrappiness and all. That doesn’t mean I don’t see its limitations, though, and I do think we should always be aiming to improve and make ourselves more effective. That’s what these essays are about, and that’s why I’m recommending them.

[1] I e-mailed Robert Cialdini once looking for advice about a particular problem I was having working well with some Wikimedia community members. To my surprise, he called me within just a few minutes, and we talked for more than an hour while I walked through an airport. I wouldn’t say he was able to fully solve my problem, but it was a helpful conversation and I was amazed by his generosity.

The Wikimedia Foundation Board of Trustees met in San Francisco a few weeks ago, and had a long and serious discussion about controversial content in the Wikimedia projects. (Why? Because we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission.)

Out of that agenda item, we found ourselves talking about what it looks like when change is handled well at Wikimedia, what good leadership looks like in our context, and what patterns we can see in work that’s been done to date.

I found that fascinating, so I’ve done some further thinking since the meeting. The purpose of this post is to document some good patterns of leadership and change-making that I’ve observed at Wikimedia.

A couple of quick caveats: for this post, I’ve picked three little case studies of successful change at Wikimedia. I’m defining successful change here as ‘change that stuck’ – not as ‘change that led to a desirable outcome.’ (I think all three outcomes were good, but that’s moot for the purposes of this post. What I’m aiming to do here is extract patterns of effective process.) Please note also that I picked these examples quickly, without a set of criteria – my goal was just to pick a few examples I’m familiar with, and could therefore easily analyze. It’s the patterns that matter, not so much the examples.

That said: here are three case studies of successful change at Wikimedia.

  • The Board’s statement on biographies of living people. Policies regarding biographies of living people (BLPs) had been a topic of concern among experienced Wikipedians for years, mainly because there is real potential for people to be damaged when the Wikipedia article about them is biased, vandalized or inaccurate, and because our experience shows that articles about non-famous people are particularly vulnerable to skew or error, since they aren’t read and edited by enough people. And that potential for damage –particularly to the non-famous– grows along with Wikipedia’s popularity. In April 2009, the Board of Trustees held a discussion about BLPs, and then issued a statement which essentially reflected best practices that had been developed by the Wikipedia community, and recommended their consistent adoption. The Board statement was taken seriously: it has been translated into 18 languages, discussed internally throughout the editing community, and cited and used as policies and practices evolve.

  • The strategy project of 2009-10. Almost 10 years after Wikipedia was founded, the Board and I felt it was time to step back and assess: what are we doing well, and where do we want to focus our efforts going forward? So in spring 2009, the Wikimedia Board of Trustees asked me to launch a collaborative, transparent, participatory strategy development project, designed to create a five-year plan for the Wikimedia movement. Over the next year, more than 1,000 people participated in the project, in more than 50 languages. The resultant plan is housed on the strategy wiki here, and a summary version will be published this winter. You can never really tell the quality of a strategy until it’s implemented (and sometimes not even then), but the project itself has accomplished what it set out to do.

  • The license migration of May 2009. When I joined Wikimedia this process was already underway, so I observed first-hand only the last half of it. But it was lovely to watch. Essentially: some very smart and experienced people in leadership positions at Wikimedia decided it made sense to switch from the GFDL to CC-BY-SA. But they didn’t themselves have the moral or legal right to make the switch – it needed to be made by the writers of the Wikimedia projects, who had originally released their work under the GFDL. So the people who wanted the switch launched a long campaign to 1) negotiate a license migration process that Richard Stallman (creator of the GFDL and a hero of the free software movement) would be able to support, and 2) explain to the Wikimedia community why they thought the license migration made sense. Then the Wikimedia Board endorsed the migration and held a referendum. It passed with very little opposition, and the switch was made.

Here are nine patterns I think we can extract from those examples:

  1. The person/people leading the change didn’t wait for it to happen naturally – they stepped up and took responsibility for making it happen. The strategy project grew out of a conversation between then-Board Chair Michael Snow and me, because we felt that Wikimedia needed a coherent plan. The BLP statement was started by me and the Board, because we felt that as Wikipedia grew more popular, consistent policy in this area was becoming essential. The license migration was started by Jimmy Wales, Erik Moeller and others because they wanted it to be much easier for people to reuse Wikimedia content. In all these instances, someone identified a change they thought should be made, and designed and executed a process aimed at creating that change.
  2. No single person made the change alone – a group of people worked together to make it happen. More than a thousand people worked on the strategy project. Probably hundreds have contributed (over several years) to tightening up BLP policies and practices. I’m guessing dozens of people contributed to the license migration. The lesson here is that in our context, lasting change can’t be produced by a single person.
  3. Early in the process, somebody put serious energy towards achieving a global/meta understanding of the issue, from many different perspectives. It might be worth pointing out that this is not something we normally do: in order to do amazing work, Random Editor X doesn’t have any need to understand the global whole; he or she can work quietly, excellently, pretty much alone. But in order to make change that involves multiple constituencies, the person doing it needs to understand the perspectives of everyone implicated by that change.
  4. The process was carefully designed to ask the right people the right questions at the right time. The license migration was an exemplar here: The people designing the process quite rightly understood that there was no point in asking editors’ opinions about something many of them probably didn’t understand. On the other hand, the change couldn’t be made without the approval of editors. So, an education campaign was designed that gave editors access to information about the proposed migration from multiple sources and perspectives, prior to the vote.
  5. A person or a group of people dedicated lots of hours towards figuring out what should happen, and making it happen. In each case here, lots of people did lots of real work: researching, synthesizing, analyzing, facilitating, imagining, anticipating, planning, communicating.
  6. The work was done mostly in public and was made as visible as possible, in an attempt to bolster trust and understanding among non-participants. This is fundamental. We knew for example that the strategy project couldn’t succeed if it happened behind closed doors. Again and again throughout the process, Eugene Eric Kim resisted people’s attempts to move the work to private spaces, because he knew it was critical for acceptance that the work be observable.
  7. Some discussion happened in private, inside a small group of people who trust each other and can work easily together. That’s uncomfortable to say, because transparency and openness are core values for us and anything that contradicts them feels wrong. But it’s true: people need safe spaces to kick around notions and test their own assumptions. I know for example that at the beginning of the Board’s BLP conversations, I had all kinds of ideas about ‘the problem of BLPs’ that turned out to be flat-out wrong. I needed to feel free to air my bad ideas, and get them poked at and refuted by people I could trust, before I could start to make any progress thinking about the issue. Similarly, the Board exchanged more than 300 e-mails about controversial content inside its private mailing list, before it felt comfortable enough to frame the issue up in a resolution that would be published. That private kicking around needs to happen so that people can test and accelerate and evolve their own thinking.
  8. People put their own credibility on the line, endorsing the change and trying to persuade others to believe in it. In a decentralized movement, there’s a strong gravitational pull towards the status quo, and whenever anyone tries to make change, they’re in effect saying to hundreds or thousands of people “Hey! Look over here! Something needs to happen, and I know what it is.” That’s a risky thing to do, because they might be perceived in a bunch of negative ways – as naive or overreacting, as wrong or stupid or presumptuous, or even as insincere – pretending to want to help, but really motivated by inappropriate personal self-interest. Putting yourself on the line for something you believe in, in the face of suspicion or apathy, is brave. And it’s critical.
  9. Most people involved –either as participants or observers– wanted more than anything else to advance the Wikimedia mission, and they trusted that the others involved wanted the same thing. This is critical too. I have sometimes despaired at the strength of our default to the status quo: it is very, very hard to get things done in our context. But I am always reassured by the intelligence of Wikimedia community members, and by their dedication to our shared mission. I believe that if everyone’s aligned in wanting to achieve the mission, that’s our essential foundation for making good decisions.

Like I said earlier — these are just examples I’ve seen or been involved in personally. I’d be very interested to hear other examples of successful change at Wikimedia, plus observations & thinking about patterns we can extract from them.

I spent this past weekend with Wikimedia trustee Phoebe Ayers at the Quaker Center in Ben Lomond, California, attending a workshop called Business Among Friends: Clerking as a Spiritual Discipline.

Neither Phoebe nor I are Quakers, but we’re curious about them. I first read about the similarities between Quaker and Wikimedian decision-making practices in Joseph Reagle‘s excellent new analysis of the Wikipedia community Good Faith Collaboration – and since then, I’ve read a dozen or so Quaker books and pamphlets. I’ve been especially interested in the practice of “clerking.”

The job of the Quaker clerk is to facilitate Quaker meetings – to create the agenda, set the tone, traffic-cop the discussion, listen, help resolve conflict, and understand and document agreement. It’s a role that reminds me a lot of leadership (both formal & informal) at the Wikimedia Foundation, and so I’ve been curious to learn more about it. The purpose of this post is to share some impressions and identify a few Quaker practices that I think Wikimedia could usefully adopt.

Disclaimer! Experienced Quakers will probably find that my grasp of some practices is shaky, and I may have mischaracterized things. People who attended the workshop may remember things differently from me. And, I am going to use words like “consensus” in the non-Quaker sense, so that they’ll make sense for readers who aren’t familiar with Quakerism. My apologies for errors and misunderstandings.

Based on what I’d read, I expected the Quakers to be mostly middle-aged or older, mostly white, and really, really friendly. They were exactly that.

But I was surprised to discover also some unexpected commonalities with Wikimedians. Both speak in acronyms (WP:NPOV, meet M&O, FCL and FAP). Both are really proud of their work, and yet tend towards self-criticism rather than self-promotion. Both talk a lot, and are precise and articulate in the way they use language (the Quakers I met spoke in complex sentences, studded with caveats and parentheticals). Both resist speaking on behalf of their group. And both have a strong individualistic streak, and describe themselves as skeptical about leadership and authority.

(To that last point: On Saturday night, Quaker adults and teenagers played a game called Big Wind Blows, which is kind of like musical chairs. Everyone’s in a circle and the person in the centre, who doesn’t have a seat, calls out “Big wind blows for everybody who [has X characteristic].” Everybody without the characteristic stays in their chair; everybody with it runs around looking for a new seat. On Saturday night, the first person in the centre said something like “Big wind blows for everyone with brown hair.” Second was “everyone who’s wearing blue jeans.” Third was “everyone who’s gone to jail for a matter of conscience.” Four Quakers in the group had chosen jail rather than, say, serving in the military or paying taxes. And doing that was considered ‘normal’ enough to be fodder for a game.)

Tomorrow, once I’ve cleaned them up, I’ll post some detailed notes I took. For now though, I’ll elaborate on a few Quaker practices that I think we Wikimedians could learn from. Most of this will be applicable for face-to-face meetings (i.e., our board meetings, Wikimania, meet-ups), but there may be relevance here for on-wiki work too.

Everybody who’s part of the movement shares responsibility for helping it succeed. Nobody gets to sit on the sidelines and watch things fail.

The Quakers talk a lot about “clerking yourself,” which basically means taking personal responsibility for the group’s collective success. People are expected to behave in a disciplined fashion, including managing themselves emotionally. They’re expected to be open-minded, open to learning and changing their minds. They’re expected to pay attention and listen carefully to each other. They’re expected to avoid the temptation to get mad or show off, and to instead speak “with love rather than judgment.” They’re expected to restrain themselves from talking too much, from interrupting other people, and from repeating the same arguments again and again. Quakers are expected to be willing and able to calmly, thoughtfully, explore areas of disagreement. If they’re feeling shy or reticent or silenced, they’re expected to say that, so that other people can find ways to support them and ensure they’re heard. And if other people are behaving badly, everyone is expected to try to help them behave better.

All this, obviously, is aspirational. As someone at the workshop said, Quakers aren’t paragons, and they’re just as likely as anyone else to be childish and whiny and egotistical. But they’re expected to try really hard not to be.

Setting the right tone is critical for success.

All weekend, I was struck by the Quakers’ skill at establishing and maintaining a rich, healthy emotional tone.

The most obvious example of this is the Quakers’ use of silence. Quakers really value silence: it’s built into all of their religious meetings and their discussions, and during the weekend, we probably spent a combined total of two or three hours together in silence — sometimes for long stretches, and sometimes just for a few minutes. That does something really interesting: it makes everybody more judicious. You have time to reflect, to organize your thoughts, to calm down. You get to listen to other people, rather than using their speaking time to plan what you’ll say next. What you say is smarter and more thoughtful than it would’ve been otherwise.

That’s just one technique the Quakers use: there were lots of others. Elizabeth and Eric, who facilitated the workshop, modeled warmth and patience and respect. They thanked people, a lot. They acknowledged and welcomed the new people. They opened the meetings in a circle, with everyone holding hands.

It reminded me of something Sal Giambanco of Omidyar Network once told me – that he recommends non-profit boards kick off their meetings with a recitation of their mission statement. It’s the same kind of thing – rituals and practices designed to remind us that what we’re doing together is meaningful, so that we can approach it in a spirit of love and respect.

Sometimes you have to kick out difficult people. Maybe.

The people attending the workshop were all experienced Quakers. And it was clear, from the stories they told and the questions they asked, that Quaker meetings suffer from difficult people.

This reminded me of Wikimedia. Because it didn’t seem like difficult people were necessarily over-represented inside Quakerism. Rather, it seemed like a normal number of difficult people created stress and anxiety disproportionate to their actual numbers. Elizabeth says that many clerks have shared with her stories about a single problematic member in their meeting, who wants attention or influence and takes advantage of the consensus process to grandstand and delay or block action for months or even years. Quakers call these people ‘dissenting spirits’ or ‘chronic objectors,’ and characterize them as “needing to hold themselves out of alignment with the group.” Elizabeth describes them as people who, no matter how much trust is extended to them, are unable to develop trust in others. Their disruptive presence can drive away others, and sometimes even threaten the survival of the group.

Which sounded sadly familiar to me.

Here’s what I think happens. Where other groups might unhesitatingly excommunicate a person who repeatedly broke their rules, it seems to me that the Wikimedia projects and the Quakers both tend to agonize instead, presumably because both groups pride themselves on being highly inclusive and tolerant. (Remember I said the Quakers are strongly individualistic? I suspect that, like some Wikimedians, some Quakers have a history of getting kicked out of various groups, and so they have a lot of empathy for people having that kind of difficulty.)

But even the Quakers, it seems, have their limits. As Elizabeth wrote in her book on clerking, “A healthy meeting will provide spiritual nurture for the ‘difficult’ Friend, but will understand that protecting the safety of the meeting has priority. It will not confuse ‘being loving’ or ‘being Quakerly’ with tolerating the destructive behavior of an individual, but will understand that setting firm limits is loving.”

This was probably the most uncomfortable topic that got addressed during the workshop, and it was the only time I remember when Elizabeth and Eric seemed to disagree. It’s a tough topic, both for the Quakers and Wikimedians.

I want to thank Jacob Stone and Gretta Stone, directors of the Quaker Center in Ben Lomond, as well as Elizabeth Boardman (Davis Meeting) and Eric Moon (Berkeley Meeting), facilitators of the workshop. Everybody at the workshop was enormously welcoming to Phoebe and me: we are really grateful. Seriously: it was lovely.

I’ll publish more notes –rougher, longer– probably tomorrow.

I never thought much about the Quakers [1] until I read Joseph Reagle‘s excellent new book Good Faith Collaboration: The Culture of Wikipedia (forthcoming from MIT Press in September), in which Joseph references the Quaker consensus decisionmaking processes – and specifically, how Quakers resolve dissent.

Joseph cites the sociological study Beyond Majority Rule: Voteless Decisions in the Society of Friends – an exploration of Quaker decisionmaking by Jesuit priest Michael J. Sheeran, who had spent two years observing and interviewing Quakers for his Princeton PhD thesis, which afterwards was published by the Quakers and is now considered a definitive guide on the subject.

Consensus decisionmaking (CDM) is a really interesting topic for Wikimedians because we make most of our decisions by consensus, and we struggle every day with CDM’s inherent limitations. It’s slow and sometimes tedious, it’s messy and vulnerable to disruption, and –most problematically– it’s got a strong built-in bias towards the status quo. CDM creates weird perverse incentives – for example, it gives a lot of power to people who say no, which can make saying no attractive for people who want to be powerful. And it can act to empower people with strong views, regardless of their legitimacy or correctness.

Beyond Majority Rule was so fascinating that it’s sent me on a bit of a Quaker reading binge, and in the past month or so I’ve read about a dozen books and pamphlets on Quaker practices. I’ve been interested to see what values and practices the Quakers and Wikimedians share, and whether there are things the Quakers do that we might usefully adopt.

For the most part, Quaker practices likely aren’t particularly adaptable for mass collaboration, because they don’t scale easily.  They seem best-suited to smallish groups that are able to meet frequently, face-to-face.

But some Quaker practices, I think, are relevant to Wikimedia, and we are either already using versions of them or should consider doing so. The Quaker “clerk” role is very similar to our leadership roles such as board or committee chair. The Quaker decisionmaking process has strong similarities to how our Board of Trustees makes its decisions, and I think Quaker methods of reconciling dissent might be particularly useful for us. (Quakers have better-codified levels of dissent and paths to resolution than we do — I think we could adopt some of this.) And the Quaker schools’ delineation of roles and responsibilities among board, staff and community members could, I think, also be a good model for us.

I plan to write more about the Quakers in coming weeks. For now though, here’s a list of what I’ve been reading:

[1] Quakers have their roots in 17th century England. There are about 360,000 Quakers today, mainly in Africa, the Asia Pacific Region, the UK and North America. Most consider themselves Christians, although a few identify as agnostic, atheistic, or as members of non-Christian faith traditions such as Judaism or Islam. Quakers are probably best known for their belief that the word of God is still emergent rather than fully known, their silent and “unprogrammed” religious services which have no leaders, hymns or incantations, their centuries-old tradition of pacifism and social activism, and their consensus decision-making process.

Read more about the Quakers at Wikipedia.