Collaborative production, where people have to coordinate with one another to get anything done, is considerably harder than simple sharing, but the results can be more profound. New tools allow large groups to collaborate, by taking advantage of nonfinancial motivations and by allowing for wildly differing levels of contribution.

Clay Shirky – Here Comes Everybody: How change happens when people come together

Many organisations today are embracing wikis and other Web 2.0 / Enterprise 2.0 technologies in order to improve collaboration, innovation and knowledge management. I suspect that many of these organisations are pursuing these technologies with little fundamental understanding of the human dynamics involved in their use.

In his book Here Comes Everybody: How change happens when people come together, Clay Shirky provides some compelling insight into the behavioural aspects of wikis. I have tried to summarise the essence of his insights as follows:

Wikis are user editable websites based on the premise that groups of people who want to collaborate also tend to trust each other. Consequently, small groups should be able to work on a shared writing effort without needing formal management or process.

Wikis allow readers to add, alter, or delete content on pages. Whenever a user edits anything on a given webpage, the wiki records the change and saves the previous version. Every wiki page is therefore the sum total of accumulated changes, with all earlier edits stored as historical documentation. This simplicity means that not only can new content be quickly added, but existing content can be quickly deleted – or restored.
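This revision model is simple enough to sketch in a few lines of Python. The class and method names below are my own invention for illustration, not the design of any real wiki engine:

```python
# Illustrative sketch of a wiki page's revision model: every edit is
# appended to the history, so earlier versions are never lost and any
# change (including a deletion) can be quickly reversed.
from dataclasses import dataclass, field

@dataclass
class WikiPage:
    title: str
    history: list = field(default_factory=list)  # all saved versions, oldest first

    @property
    def content(self) -> str:
        """The current text is simply the most recent saved revision."""
        return self.history[-1] if self.history else ""

    def edit(self, new_text: str) -> None:
        """Save a new version; the previous versions are kept, not overwritten."""
        self.history.append(new_text)

    def revert(self) -> None:
        """Undo the most recent edit by dropping back to the prior revision."""
        if len(self.history) > 1:
            self.history.pop()

page = WikiPage("Coral reefs")
page.edit("Coral reefs are underwater ecosystems.")
page.edit("")    # vandalism: someone blanks the page
page.revert()    # a single step restores the previous version
```

The point of the sketch is that restoring content is exactly as cheap as destroying it, which is the asymmetry the rest of this summary depends on.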

Clay Shirky uses Wikipedia as a case study of wikis at a global scale: it is perhaps the most famous example of ‘distributed collaboration’ and has become one of the most visited websites in the world.

Wikipedia articles are a process, not a product, and as a result are never truly finished. Once an article exists, it starts to get readers. Soon a self-selecting group of those readers decide to become contributors. Some of them add new text, some edit the existing article, some add references to other articles or external sources, and some fix typos and grammatical errors. All contributions can be incremental and not all edits are improvements. For a Wikipedia article to improve, the good edits simply have to outweigh the bad ones. Wikipedia assumes that new errors will be introduced less frequently than existing ones will be corrected. This assumption has proven correct; despite occasional vandalism, Wikipedia articles get better, on average, over time.

With wikis, a predictable pattern emerges over time: readers continue to read, some of them become contributors, the wiki continues to grow, and articles continue to improve. The process is more like creating a coral reef, the sum of millions of individual actions, than creating a car. And the key to creating those individual actions is to hand as much freedom as possible to the average user.

Given that everyone now has the tools to contribute equally, you might expect a huge increase in equality of participation – but you would be wrong. Social media contributions tend to follow a predictable ‘power law’ pattern where the most active participant is generally much more active than the person in the number two slot and far more active than the average. For example, fewer than 2% of Wikipedia users ever contribute, yet this is enough to create profound value for millions of users. Consequently, large social systems cannot be understood as a simple aggregation of the behaviour of some nonexistent ‘average’ user.

This power law relationship was first described by Vilfredo Pareto, an Italian economist who studied the distribution of wealth in the early 1900s. The same power law was the subject of Chris Anderson’s The Long Tail.

In such social systems (such as Wikipedia and Flickr) the most active participants tend to be much more active than the median participant, so active in fact that any measure of ‘average’ participation becomes meaningless. There is a steep decline from a few wildly active participants to a large group of barely active participants, and though the average is easy to calculate, it doesn’t tell you much about any given participant.

Such systems have some surprising characteristics – the first is that, by definition, most participants are ‘below average’ in their contributions/participation. The second is that as systems get larger, the imbalance between the few and the many gets larger, not smaller.

Consequently, you cannot understand Wikipedia (or indeed any large social system) by looking at any one user or even a small group and assuming they are representative of the whole. The most active few users account for a majority of the edits, even though they make up a minority, and often a tiny minority, of contributors.
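This imbalance is easy to see in a toy model. The sketch below assumes an idealised Zipf-style power law, in which the k-th most active contributor makes roughly 1/k as many edits as the most active one; the figures are illustrative, not real Wikipedia data:

```python
# Toy model of power-law participation: 100 contributors, where the
# k-th most active makes about 1/k as many edits as the most active.
edits = [1000 // rank for rank in range(1, 101)]  # rank 1 is most active

mean = sum(edits) / len(edits)
median = sorted(edits)[len(edits) // 2]

top_ten = sum(edits[:10])   # edits by the 10 most active contributors
everyone = sum(edits)       # edits by all 100 contributors

print(f"mean edits per contributor:    {mean:.1f}")
print(f"median edits per contributor:  {median}")
print(f"share of edits by the top 10%: {top_ten / everyone:.0%}")
```

Even in this small example the mean is more than double the median, and the top 10% of contributors account for well over half of all edits, which is why the ‘average user’ describes almost nobody.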

We typically have a hard time thinking about systems like Wikipedia that exhibit power law distributions – we’re used to being able to extract useful averages from small samples and to reason about the whole system based on those averages. When you encounter a system like Wikipedia where there is no representative user, the habits of mind that come from thinking about averages are not merely useless, they are harmful. With a system like Wikipedia, it is important to concentrate not on the individual users, but on the behaviour of the collective.

This obviously has implications for rewarding and recognising contributors to wikis.

Why do people contribute to a wiki in the first place? Making a mark on the work is a common human desire. This desire to make a meaningful contribution where we can is part of what drives Wikipedia’s spontaneous division of labour.

Another motivation is the desire to do a good thing. The genius of wikis is in part predicated on the ability to make nonfinancial motivations add up to something of global significance.

Wikis reward those who invest in improving them, which explains why both experts and amateurs are willing to contribute.

Wikis are a hybrid of tool and community. Wikipedia, and all wikis, grow if enough people care about them, and they die if they don’t. This isn’t a property of the software but of the community that uses it. Within Wikipedia there are many examples of contentious articles on subjects like abortion and Islam where complete deletions of the article’s content have been restored in less than two minutes.

As with every fusion of group and tool, the defence against vandalism is the result of novel technology (all edits and deletes can be quickly and easily reversed) and a novel social strategy. Wikis provide ways for groups to work together, and to defend the output of that work, but these capabilities are available only when most of the participants are committed to those outcomes.

Wikipedia is a living study in apparent contradictions – a chaotic process, with unpredictable and wildly uneven contributions made by non-expert contributors acting out of variable motivations, has created a global resource of tremendous daily value.

Wikipedia is the product not of collectivism but of unending argumentation. The articles grow not from harmonious thought but from constant scrutiny and correction. Wikipedia, unlike a commercial encyclopaedia, does not have to be efficient – it merely has to be effective. If enough people see an article, the chance that an error will be caught and fixed improves with time.
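That last point can be made quantitative with a simple (and admittedly idealised) model: if each reader independently notices a given error with some small probability p, the chance that the error survives n readers shrinks exponentially. Both the value of p and the reader counts below are assumptions chosen purely for illustration:

```python
# Idealised model of error survival: each reader independently catches
# a given error with probability p, so it survives n readers with
# probability (1 - p) ** n.
p = 0.01  # assumed per-reader chance of spotting and fixing the error

for n in (10, 100, 1000):
    survival = (1 - p) ** n
    print(f"after {n:>4} readers: {survival:.1%} chance the error remains")
```

Real readers are not independent and most never edit, so the decay is slower in practice, but the direction of the argument holds: enough eyeballs make any given error’s survival increasingly unlikely.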

Thank you Clay for your insight … Wikis are indeed simple but will undoubtedly have a profound effect on how we collaborate and share in future.