A grass-roots group of biologists has started a website to keep track of the proliferation of experiments in academic peer review — including trials and platforms aimed at making the review process faster, cheaper, and more transparent and interactive.

“Peer review is deeply embedded into the scientific process, and it serves multiple purposes: giving and receiving feedback, error detection and correction, and filtering and curation. Yet, there’s little evidence that it functions optimally,” notes the site, called ReimagineReview.

At launch, it listed around two dozen peer-review projects or trials, filed under categories such as transparency; quality of review; bias in review; speed; and incentives and recognition for reviewing. The site’s creators aim to add more projects, and they invite scientists working on new ideas to add their experiments to the registry.

Let’s make peer review scientific

Among the trials listed on the site is a pilot project run last year by the journal eLife in which, once editors had decided to send a manuscript out for review, the journal committed to publishing the paper alongside its reviews. The site also includes long-established ventures such as the firm Publons, which aims to give authors credit for their reviews, and F1000, a publisher whose platforms post papers first and organize peer review afterwards.

“We needed a way of gathering information about who was experimenting in this space,” says Jessica Polka, the executive director of ASAPbio, which launched ReimagineReview. “There’s not necessarily a lot of awareness of these projects among researchers.” ASAPbio, which stands for ‘accelerating science and publication in biology’, was founded in 2015 to encourage life scientists to use preprints, but has since branched out to become a non-profit organization, headquartered in San Francisco, California, that aims to promote transparency and innovation in the life sciences.

The idea for the site grew out of a February 2018 meeting on peer review, held for scientists, funders and publishers at the Howard Hughes Medical Institute (HHMI) in Chevy Chase, Maryland. At that meeting, attendees generally agreed that it would be a good thing if journals made the text of peer-review reports public, says Polka. But they also wanted more studies on the pros and cons of revealing reviewers’ names or keeping them anonymous, and on innovative ideas in peer review in general.

“There was a lot of activity, but not a lot of ways to dig into what was working and what wasn’t,” says Katja Brose, a science programme officer at the Chan Zuckerberg Initiative (CZI) in Redwood City, California. CZI officials attended the 2018 meeting and funded the new registry, which was developed in partnership with the HHMI and London-based biomedical-research charity the Wellcome Trust. (CZI also funds the preprint site bioRxiv and protocols.io, an open-access site for sharing research methods.)

Peer review: Troubled from the start

“It’s a great resource for publishers who want to think about new innovations in peer review,” says Tony Ross-Hellauer, an information scientist at Graz University of Technology in Austria, who saw the site before its launch. He is working with ASAPbio on a different initiative, a website called TRANSPOSE that hopes to crowdsource a list of journal policies on open peer review and on whether journals will publish manuscripts that have previously been posted online as preprints.

At this early stage, many of the initiatives recorded on ReimagineReview are pilot projects run by publishers, or new platforms for peer review, rather than scientific experiments that will lead to clear conclusions about how to make the process better. “We take a relatively loose interpretation of ‘experimentation’,” Polka says. “It’s possible to learn from projects that are not necessarily conceived as controlled trials of peer review, but nevertheless could illuminate effective innovations”.