Less than 10% of Wikipedia contributors identify as female — and it shows. Women worldwide are working to change that.

The 2017 political and media landscape has heightened our sense of responsibility to parse out biases in the information we consume. Now more than ever, when I read news or commentary, I ask myself: who wrote this piece? Who paid the writer? What are their values? What do they hope to gain from publishing this?

Wikipedia, despite its somewhat polarizing reputation, makes it easier to answer those questions. Its scale dilutes the bias of any one author, and most Wikipedia articles have numerous authors — sometimes hundreds, if the article is popular or controversial. When I read a Wikipedia article, I feel empowered to come to my own conclusions, rather than constantly questioning if I should agree with the conclusions of one reporter or scholar. The platform is also nonprofit and transparent; I can easily see what sources informed an article, how it evolved, and who contributed to it.

I’ve felt invested in Wikipedia and its importance for years, so I finally created an editor’s account in 2016. Poking around the site’s discussion boards and guidelines, I quickly noticed it shared surface-level similarities with other nerdy forum-based communities on the internet, like Reddit. And while I didn’t notice as much overt sexism as on Reddit, everything from the slang to the usernames to the heated competition for barnstars (think Wikipedia Boy Scout badges) felt extremely, well, dude-ish.

Soon, I learned that statistics supported my anecdotal observations. Less than 10% of Wikipedia contributors identify as female, and that gender disparity in the community impacts the content. Not only are women covered less on Wikipedia than men, but articles about them are more likely to mention their gender and relationship status, link to pages about men, and describe their work in gendered terms. (In 2013, for instance, the New York Times reported that a user had removed all female novelists from the list of American novelists and placed them on a new one called “American woman novelists.”)

In response to this type of bias, and because the art world suffers from similar issues, four friends in 2014 formed Art+Feminism, a collective that aims “to create meaningful changes to the body of knowledge available about feminism and the arts on Wikipedia.” In under three years, it’s become a worldwide movement — every March since 2014 (to generally coincide with International Women’s Day), A+F members have gathered at more than 280 events on six continents to create and edit thousands of Wikipedia pages.

On Saturday, March 11, I arrived at New York’s Museum of Modern Art with a colleague for our first edit-a-thon. The day’s itinerary was flexible and fluid. You could spend your whole day in trainings, panels, and breakout sessions; you could spend your whole day editing (alone or with help from the event staff); or you could do a mix of the two.

In addition to A+F, organizations that led programming included Brooklyn’s Interference Archive, a library of historical materials related to social and political activism; AfroCrowd, a group seeking to increase the number of people of African descent in the Wikipedia community; and Professional Organization for Women in the Arts (POWarts), which assisted with learning to edit. There were also a handful of representatives from Wikipedia and the Wikimedia Foundation, including Alexandra Wang and Winifred Olliff.

I started my day with a panel moderated by Kimberly Drew (aka @museummammy), featuring Joanne McNeil and Zara Rahman. The conversation touched on the importance of nonprofit information, how to responsibly break your own echo chambers, and internet presence vs. anonymity. When discussing Wikipedia’s role in the “post-truth” era, Rahman said, “It’s not truth that’s broken, it’s credibility. [We look to certain institutions and think] ‘You should be the one to tell me the truth, but I don’t know anymore.’” As our trust in institutions wanes, the importance of easily accessible, wide-ranging information sources increases.

After that, we quickly got down to business with the “Learn how to edit Wikipedia” session. I had made minor revisions to Wikipedia before, but the training taught me how to use additional tools and explained aspects of the process that previously confused me. The training also reviewed Wikipedia’s “core content policies,” which Wikipedia uses to determine if content is appropriate. The policies can be summarized as, “Wikipedia content is intended to be factual, notable, verifiable with cited external sources, and neutrally presented.”

But a larger conversation loomed. Some audience members shared anecdotes about how Wikipedia admins leveraged these policies against them when writing about women or people of color. A slide about Wikipedia’s notability guidelines asked, “What if notability guidelines reproduce structural sexism and racism? How can we address and amend this?” Those are fraught, important questions that significantly impact Wikipedia’s potential as an inclusive platform and the work we aimed to do that day.

After training ended, I sat down to edit. I quickly realized that the articles I’d had in mind (for soul singer Sharon Jones and rapper Gangsta Boo) had both been significantly updated since I last read them, so I brainstormed other options. I looked up Jubilee, one of my favorite local DJs, to see if she had a page; I found a “stub,” meaning someone had created an article for her, but no content had been added. Jubilee has over a decade of experience in electronic music, and her work has been covered in numerous reputable publications, so her article provided a perfect starting place.

I wrote and submitted my article for Jubilee, and made a few updates to existing articles for Dawn Richard (a musician) and Brit Bennett (a writer). I breathed a sigh of relief when my work wasn’t flagged or deleted by an administrator for failing to meet the aforementioned criteria. My colleague, however, was not so fortunate. She attempted to publish an article about Lovie Olivia, a black visual artist suggested by AfroCrowd, which provided sample task lists of black artists whose articles needed creating or editing. But an admin flagged it for speedy deletion and cited criterion CSD A7: The article “does not indicate why its subject is important or significant” — an application of the previously discussed notability guidelines. (The deletion was later reversed after additional edits.)

In the last session of the day, a “notability working group” provided a formal opportunity to discuss this issue head on. In this meeting, community members who felt stifled by the notability guidelines shared their experiences with Wikimedia representatives who help shape the implementation and evolution of Wikipedia’s principles. The room felt tense but optimistic.

Some participants described the Wikipedia community as more hierarchical than Wikimedia staff admitted, and highlighted ways in which strict interpretations of Wikipedia’s criteria can discriminate against content from and about marginalized groups. For example, one woman described how the verifiability criteria and list of “acceptable” sources preclude histories from African cultures because they are oral, not written — let alone covered by Western scholars or journalists. A Wikimedia representative responded by describing pilot projects that worked to mitigate this problem, including use of audio as a source.

The conversation illustrated that any discussion about Wikipedia, its rules, and inclusivity cannot be disentangled from much bigger, stickier questions: who gets to decide what knowledge is worth documenting and sharing, what sources are legitimate, and whose perspectives are valid? And if these decisions are made by a community dominated by white men, how inclusive can Wikipedia be without changes to its governance and policies?

To that point, a Wikipedia representative reminded us that Wikipedia does not seek to be an indiscriminate collection of information; the core content policies exist to narrow its scope, even if they were not intentionally designed to discriminate against marginalized groups. But, realistically, as information availability continues to explode, Wikipedia’s influence as it manages that information will only increase, as will the impact of its decisions about what information is acceptable.

Fortunately, Wikipedia representatives’ responses to these concerns illustrated a crucial difference between Wikipedia and other internet platforms: Wikipedia acknowledges that as an eternal work in progress, its rules are subject to debate and interpretation. One Wikipedia rep admitted that when handling disputes, Wikipedia often contradicts itself and previous decisions. While such inconsistency could lead to unfairness, that flexibility also presents an opportunity for change: women and people of color may face barriers to moving up the Wikipedia hierarchy, but once they do, they have wide-ranging powers to shape the direction of the Wikipedia community and content.

This stance also demonstrates Wikipedia’s humility: it acknowledges that it won’t get everything right at first. Wikipedia, like any community, needs governance to ensure its members work toward Wikipedia’s stated goal. But, unlike other technology platforms — and perhaps unlike our government — it acknowledges that new information, common sense, and diverse perspectives should influence its policies. Thus, the Wikipedia community, if managed appropriately, might be able to correct the discriminatory application of its policies over time.

I left MoMA with a deeper understanding of the barriers that face editors who fight Wikipedia’s gender and racial divides. They wage this battle with every article they edit and every policy they question. But more importantly, I felt inspired by the potential of their—and now my—work. If Wikipedia becomes a platform that documents collective human knowledge without the fogged lens of systemic biases, it would be nothing short of revolutionary.