Wikipedia is one of the defining websites of the Web 2.0 movement, and has come to embody the so-called wisdom of crowds like no other. So it's fascinating that this most human of endeavors is considering turning to automation in a bid to improve the service.

It is rolling out automated software that can detect whether a page has been vandalised or has simply been the victim of an honest mistake by a respected user.

Automated editors

The software, which has been developed by the Wikimedia Foundation, aims to fill the gap created by the dwindling number of active contributors to the site, whose ranks have declined by 40% over the past few years.

Much of this is believed to be due to the way users are managed, and the high standards expected of newcomers.

The hope is that the new tool will help to offset some of these challenges. The system, known as the Objective Revision Evaluation Service (ORES), is trained to assess the quality of new changes to the site, and uses what it has learned to judge whether a change is damaging or was made in good faith.
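To make the idea concrete, here is a heavily simplified sketch of how a trained model can turn an edit into a damage probability. The feature names, weights, and word list below are purely illustrative assumptions, not ORES's actual model, which is trained on large sets of hand-labelled revisions.

```python
import math

def extract_features(old_text, new_text):
    """Reduce an edit to a few illustrative numeric signals."""
    added = len(new_text) - len(old_text)
    upper_ratio = sum(c.isupper() for c in new_text) / max(len(new_text), 1)
    insults = sum(w in new_text.lower() for w in ("idiot", "stupid"))
    return [added, upper_ratio, insults]

def damage_score(features, weights=(-0.001, 3.0, 2.0), bias=-1.5):
    """Logistic model: squash a weighted feature sum into a 0..1 probability."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1 / (1 + math.exp(-z))

# An edit that shouts in capitals and adds an insult scores high...
vandal = damage_score(extract_features(
    "A fine article.", "A fine article. YOU ARE AN IDIOT"))
# ...while a quiet factual addition scores low.
honest = damage_score(extract_features(
    "A fine article.", "A fine article. Founded in 2001."))
```

In practice the score is only advisory: it ranks recent edits for human reviewers rather than reverting anything by itself.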

The tool will show editors the recent edits to a page and make undoing them a simple, one-click process. The hope is that the new tool will both secure the quality of content on the site and avoid scaring off new users from participating.

Guiding users

ORES will guide editors towards the most damaging alterations automatically, by separating mistakes that are clearly made by well-meaning new users from vindictive edits.
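That separation can be pictured as a simple triage over two model scores, one for "damaging" and one for "good faith". The thresholds and queue names here are hypothetical; real ORES clients choose their own cutoffs.

```python
def triage(damaging, goodfaith):
    """Route an edit based on two model scores in [0, 1].

    Thresholds and queue names are illustrative assumptions,
    not part of ORES itself.
    """
    if damaging > 0.8 and goodfaith < 0.2:
        return "revert-queue"   # likely vandalism: surface for quick undo
    if damaging > 0.8:
        return "mentor-queue"   # damaging but good-faith: send a friendly note
    return "no-action"          # probably fine: leave the edit alone

triage(0.95, 0.05)  # → "revert-queue"
triage(0.90, 0.90)  # → "mentor-queue"
```

Splitting the two scores is what lets the tool treat a clumsy newcomer differently from a vandal, instead of reverting both with the same hostility.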

“I suspect the aggressive behavior of Wikipedians doing quality control is because they’re making judgments really fast and they’re not encouraged to have a human interaction with the person,” the developers say. “This enables a tool to say, ‘If you’re going to revert this, maybe you should be careful and send the person who made the edit a message.’”

The new tool is already available on the English, Turkish, Farsi and Portuguese versions of Wikipedia, with more languages hoped to follow soon.

Suffice to say, changing a community as strong as the one at the core of Wikipedia is often easier said than done, but hopefully the new tool will prove sophisticated enough to win the approval of editors who justifiably see Wikipedia as their baby.

It is also a change that isn’t being forced on editors, so the development team are hopeful that it will be adopted.

“In some ways it’s weird to introduce AI and machine learning to a massive social thing, but I don’t see what we’re doing as any different to making other software changes to the site,” they say. “Every change we make affects behavior.”