Active Wikimedia editors in good standing are voting on a referendum measure that might put at least some of the media collective's famous disagreements over images to rest. The referendum asks Wikimedians to decide whether to implement a system for readers to conceal pictures that they would prefer not to view, via preference settings.

The object of this measure is to further what Wikimedia participants call the "principle of least astonishment," or least surprise, for users. But under the referendum proposal, these potentially upsetting pictures would not be deleted. They would simply require further clicking to view, an option that a Wikimedia report calls "shuttering."

Some images, such as those depicting genitals, sexual practices, or mass death and disfigurement, "will inevitably still have the power to disturb some viewers, especially if they are children, or if they are happened upon unintentionally," the referendum page notes. "The point of the opt-in personal image hiding feature is to help alleviate that surprise and dismay, by making the images unavailable for viewing without a second command."

My full support

The Wikimedia Foundation oversees a variety of image-rich, collectively edited projects, such as Wikipedia, Wiktionary, and Wikimedia Commons. But controversy occasionally breaks out over some of the graphic content on these sites. According to reports, about a year ago Wikimedia founder Jimmy Wales gave up some control over Wikimedia material, following his controversial attempt to delete pornographic images from various Wikimedia pages. In his own words, Wales eventually relinquished "virtually all permissions to actually do things from the 'Founder' flag."

"I even removed my ability to edit semi-protected pages!" he added, although: "(I've kept permissions related to 'viewing' things.)"

All this followed a Wikipedia cofounder's letter to the Federal Bureau of Investigation claiming that Wikimedia Commons "may be knowingly distributing child pornography."

"I don't know if there is any more," Larry Sanger added after forwarding the FBI various images he saw as examples, "but I wouldn't be surprised if there is—the content on the various Wikimedia projects, including Wikipedia and Wikimedia Commons and various others, are truly vast."

This commentary brought a fast response from the Wikimedia Foundation's official blog. "The Wikimedia Foundation obeys the law," it declared in late April of 2010. "In the weeks since Sanger's published allegations, the Wikimedia Foundation has not been contacted by the FBI or any other law-enforcement agency with regard to allegedly illegal content on any Wikimedia projects. Our community of volunteer editors takes action to remove illegal material when such material is brought to its attention."

But Wales also put out a statement disclosing that "Wikimedia Commons admins who wish to remove from the project all images that are of little or no educational value but which appeal solely to prurient interests have my full support. This includes immediate deletion of all pornographic images."

At the time we reported on this issue, a timeline story posted on Wikipedia acknowledged Wales' deletions and confirmed that some of his 'founder' privileges were removed.

Cat herding

None of this turbulence should surprise anybody at a participatory organization like Wikimedia. Scholarly Wikimedia watchers have been predicting the site's downfall for years, on the grounds that its open-ended editing process would continuously expose it to credibility-bashing miscreants. Debates over whether to deploy a flag-and-approve revision system (so as to avoid, say, biography pages that prematurely report a politician's death) have been hot topics around the commons for a while.

Two years ago, ISPs across the United Kingdom blocked Wikipedia following a decency group's complaint about the site's publication of an old Scorpions album cover, Virgin Killer. The cover displayed an image of an almost completely naked girl who looked to be under eighteen. Eventually the group rescinded the objection, since, after all, the picture is ubiquitous on the 'Net.

Despite the controversy, Wikimedia is still around, which is a good thing, and testimony to the foundation's ability to gently shepherd its vast herd of content-creating felines in the right direction when necessary. That hasn't been an easy process when it comes to controversial images. When last we checked, deciding which graphics counted as "inappropriate" sometimes consisted of editors arguing over a specific picture until everybody became exhausted and gave up.

Open and full access

The image referendum was launched at the unanimous request of the Wikimedia Foundation's Board of Trustees on May 29. The Board explained that it was acting on the basis of the 2010 Draft Study on Wikimedia Controversial Content, which defined intellectual openness and public service as the first and second principles of Wikimedia.

But "there are a very small number of cases—a minority of a minority—" the draft contended, "where that compatibility of principles breaks down, where increased intellectual openness threatens to reduce our public service, rather than increase it":

Maybe it's when we publish images considered sacred to one religious group or another. Maybe it's when we present unprotected content of sexual behaviour or violence to children without warning. Maybe it's when we present shock images on publicly-highlighted pages. In effect, the study we have been undertaking is made up of this minority of a minority—where the principles of openness and service grate against each other, and seem to need some lubrication. The question we must ask and answer is: Is that lubrication possible without destroying the very principles on which the enterprise is founded?

And so Part II of the Draft Study posited a definition of controversial images in categorical terms: "our surmise is that only three categories of images can currently pass this test—sexual images, violent images, and certain images considered sacred by one spiritual tradition or another."

Surprise and dismay

For the first two categories, the draft recommended that a feature be added that would allow users to "shutter" rather than delete questionable images:

[t]hat there be an option prominently visible on all WMF [Wikimedia Foundation] pages (like the Search Settings options on Google), available to registered and non-registered users alike, that, when selected, will place all images in Commons Categories defined as sexual (penises, vulvas, masturbation, etc.) or violent (images of massacres, lynchings, etc.) into either collapsible or other forms of shuttered viewing, wherever these images might appear on WMF sites. The rationales for this are several. Images of sexuality and violence are necessary components of Wikimedia projects for them to fulfill their mandates to be open, free and educational. However, these images—of genital areas and sexual practices on the one hand, or mass graves and mutilated corpses on the other—will inevitably still have the power to disturb some viewers, especially if they are children, or if they are happened upon unintentionally. The point of the "button" we're proposing is to help alleviate that surprise and dismay, by making the images unavailable for viewing without a second command.

For other categories deemed controversial, shuttering would be available only to registered users.
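As a rough illustration of that two-tier rule, here is a minimal sketch in TypeScript. Everything in it is hypothetical: the category labels, the ReaderKind type, and the shutterAvailable function are inventions for this example, not anything the draft specified or Wikimedia built.

// Hypothetical sketch of the draft's two-tier shuttering rule: images in
// sexual or violent categories can be shuttered by any reader, while other
// flagged categories can be shuttered only by registered users.

type ReaderKind = "registered" | "anonymous";

// Illustrative labels only; real Commons categories are far more granular.
const ALWAYS_SHUTTERABLE = new Set(["sexual", "violent"]);
const REGISTERED_ONLY = new Set(["sacred"]);

// Returns true if the shutter option should be offered for an image carrying
// the given category labels, for the given kind of reader.
function shutterAvailable(categories: string[], reader: ReaderKind): boolean {
  if (categories.some((c) => ALWAYS_SHUTTERABLE.has(c))) {
    return true; // available to registered and non-registered readers alike
  }
  if (reader === "registered") {
    return categories.some((c) => REGISTERED_ONLY.has(c));
  }
  return false;
}

// An anonymous reader would get the shutter option only for the first image.
console.log(shutterAvailable(["violent", "history"], "anonymous")); // true
console.log(shutterAvailable(["sacred"], "anonymous"));             // false
console.log(shutterAvailable(["sacred"], "registered"));            // true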

It should be noted that the draft went further than just these ideas. The authors acknowledged that their "Recommendation 4" would probably "engender a fair bit of reaction." That proposal urged Commons editors to review images of nudity, "where breasts, genital areas (pubis, vulva, penis) and/or buttocks are clearly visible, and the intent of the image, to a reasonable person, is merely to arouse, not educate, with a view to deleting such images from Commons."

But the Board steered clear of this suggestion, focusing instead on recommendations 7 and 9, which emphasize shuttering. Specifically:

We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view, either when first viewing the image or ahead of time through preference settings. We affirm that no image should be permanently removed because of this feature, only hidden; that the language used in the interface and development of this feature be as neutral and inclusive as possible; that the principle of least astonishment for the reader is applied; and that the feature be visible, clear and usable on all Wikimedia projects for both logged-in and logged-out readers.
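To make that recommendation concrete, here is a browser-side sketch, again in TypeScript, of how hiding-by-preference and one-click reveal might behave. It assumes a made-up data-categories attribute on images and a made-up local-storage key; it is an illustration of the described behavior, not how the feature was actually specified or implemented.

// Illustrative sketch of a reader-side "personal image hiding" preference.
// Assumes images are annotated like <img data-categories="sexual,violent">,
// an invented convention; nothing is deleted, images are only collapsed in
// the reader's browser and can be revealed again with one click.

const PREF_KEY = "hiddenImageCategories"; // hypothetical storage key

function loadHiddenCategories(): Set<string> {
  try {
    return new Set<string>(JSON.parse(window.localStorage.getItem(PREF_KEY) ?? "[]"));
  } catch {
    return new Set<string>();
  }
}

// Called from a (hypothetical) preference-settings panel when the reader
// chooses ahead of time which kinds of images to hide.
export function saveHiddenCategories(categories: Set<string>): void {
  window.localStorage.setItem(PREF_KEY, JSON.stringify([...categories]));
}

// Collapse an image behind a button; clicking the button restores the
// original image, so hiding is always reversible.
function collapse(img: HTMLImageElement): void {
  const button = document.createElement("button");
  button.textContent = "Show hidden image";
  button.addEventListener("click", () => button.replaceWith(img));
  img.replaceWith(button);
}

// Apply the reader's saved preference to every annotated image on the page.
export function applyImageHiding(): void {
  const hidden = loadHiddenCategories();
  document
    .querySelectorAll<HTMLImageElement>("img[data-categories]")
    .forEach((img) => {
      const categories = (img.dataset.categories ?? "").split(",");
      if (categories.some((c) => hidden.has(c.trim()))) {
        collapse(img);
      }
    });
}

Because everything in this sketch happens on the client, the images themselves stay untouched on the servers, in line with the Board's insistence that nothing be permanently removed, only hidden.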

Leave me alone

A small historical side note: this "principle of least astonishment" is reminiscent of another principle, the one enunciated in the Supreme Court's FCC v. Pacifica Foundation decision in 1978. In that ruling, which upheld a Federal Communications Commission sanction against a radio station that had broadcast George Carlin's famous "Seven Dirty Words" monologue, the high court rejected arguments that one could duck exposure to indecency simply by avoiding the broadcast in question.

"Patently offensive, indecent material presented over the airwaves confronts the citizen, not only in public, but also in the privacy of the home, where the individual's right to be left alone plainly outweighs the First Amendment rights of an intruder," Justice John Paul Stevens argued.

Because the broadcast audience is constantly tuning in and out, prior warnings cannot completely protect the listener or viewer from unexpected program content. To say that one may avoid further offense by turning off the radio when he hears indecent language is like saying that the remedy for an assault is to run away after the first blow. One may hang up on an indecent phone call, but that option does not give the caller a constitutional immunity or avoid a harm that has already taken place.

But the similarities between Stevens' "right to be left alone" and Wikimedia's "least astonishment" principle shouldn't obscure the very different context here. Wikimedia is a nongovernmental entity considering a voluntary remedy to a perceived problem—not a government agency imposing a language code on broadcasters.

How the voting works

Wikimedians are being asked to vote on a scale of zero to ten (five equals neutral; ten equals "strongly in favor") on the following aspects of the proposal:

for the Wikimedia projects to offer this feature to readers.

that the feature be usable by both logged-in and logged-out readers.

that hiding be reversible: readers should be supported if they decide to change their minds.

that individuals be able to report or flag images that they see as controversial, that have not yet been categorized as such.

that the feature allow readers to quickly and easily choose which types of images they want to hide (e.g., 5-10 categories), so that people could choose for example to hide sexual imagery but not violent imagery.

that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial).

Referendum voting ends on August 30. Wikimedia editors who have not been blocked on more than one project and have made at least ten edits before August 1 may vote from a registered account. Developers, staff, contractors, and board members can vote, too. The results will be announced on September 1.