Back in the summer, we reported that Nature, perhaps the most prestigious scientific journal on the planet, was experimenting with open peer review (OPR). Instead of simply assigning manuscripts to a set of anonymous reviewers, Nature offered the authors the opportunity to also have their work posted on a publicly accessible site that provided a mechanism for making comments on the text. The editors would then combine the peer reviewers' recommendations with the public comments when making decisions regarding whether to publish the work or to require additional revisions.

Nature ran a trial of this system (details are available as a FAQ) from the beginning of June until the end of September, and followed up on it with surveys of authors and editors. Their resulting analysis makes it clear that, at least as structured, OPR went nowhere. Out of nearly 1,400 manuscripts chosen for review during that period, only five percent of the authors agreed to have theirs subjected to OPR. Those 71 manuscripts elicited a grand total of 92 comments; nearly half received no comments at all, and over half of the comments were directed at eight individual papers.

The editors handling the papers did not find the results all that appealing. Ideally, OPR would supplement an editor's knowledge by attracting comments from people with a detailed technical understanding of the field. Instead, most comments were judged to be either editorial suggestions or devoid of content (comments such as "nice work" were considered unhelpful).

The editorial staff at Nature appear to have done a good job of analyzing the results of their experiment. They suggest that the exceptionally poor response in the biosciences could stem in part from potential patenting issues connected with the research. The high level of competition in this field would also allow OPR to alert competitors to pending publications, providing them with the opportunity to push through a similar publication on short notice.

Glaringly absent from the discussion, however, was any consideration of the fact that Nature's OPR process stripped a critical element from normal peer reviewing: anonymity. Anyone posting a comment on a manuscript was required to provide a verifiable name and institution. Although this was undoubtedly helpful in allowing the editors to screen for potential crackpots, it may have inhibited many from leaving comments. After all, it's one thing to anonymously reveal misunderstandings and lack of knowledge to a single editor during peer review. It's a different matter entirely to do so in a forum that could be viewed by everyone working in your field.

Having participated in this process a number of times as a peer reviewer, I can say with certainty that providing a decent review requires a significant amount of effort. Finding the time to make that effort is challenging even when editors contact you directly and set deadlines for your responses. Changing the scientific culture so that researchers hunt down relevant publications on the web and provide a detailed and technical review without any incentive or deadline is a long-term project, and one that Nature's brief experiment did not address.

Further reading:

A general debate on the process of peer review at Nature.