In the digital age, a growing number of researchers and publishers are using more than just citation counts to track the impact of their articles. In an essay in PLoS Biology, three authors from a major UK research-funding agency argue that alternative metrics — or altmetrics, such as social-media mentions — can help funders to measure the full reach of the research that they support. Some researchers have already used these metrics in their favour. On his lab blog, Fernando Maestre, an ecologist at King Juan Carlos University in Madrid, explained how he included altmetrics in a successful grant proposal earlier this year. But not everyone is convinced that the new metrics are good for science. John Gilleard, a veterinary parasitologist at the University of Calgary in Canada, voiced his doubts on Twitter.

The term altmetrics generally refers to any measure of a paper’s impact — including the number of times it is viewed or downloaded, along with mentions on Twitter, in blogs, news articles and elsewhere. A growing number of companies offer services that track these measures, such as Altmetric (which provides data to Nature for this column), Impactstory and Plum Analytics, and many journals display such information for their articles.

The PLoS Biology paper1 was written by representatives of the Wellcome Trust, which invests about £600 million (US$936 million) a year in UK and international biomedical research. They argue that funding agencies could increasingly use altmetrics to inform spending decisions. Compared with citations, the authors write, “altmetrics offer research funders greater intelligence regarding the use and reuse of research, both among traditional academic audiences and stakeholders outside of academia.”

The measures could also help young researchers, who have fewer citations than their more senior colleagues, the authors add. Still, they acknowledge that many questions about altmetrics remain. Co-author Adam Dinsmore wrote on the Wellcome Trust blog that now is the time to dig deeper into the “meaning and validity of altmetrics as proxies of research impact”.

Speaking to Nature, Maestre says that altmetrics aren’t a perfect measure of quality. Some papers, he says, undoubtedly gain a high profile on social media because they are quirky or ultra-fashionable rather than especially insightful. But, overall, he believes that “the best research gets noticed, and this is reflected in altmetrics”. He adds that researchers can boost their scores by blogging and tweeting about their research — within reasonable limits — and encouraging their university press office to do the same.

Some, however, are uncomfortable with the implication that researchers who attract a lot of online attention could have the upper hand when it comes to winning grants. In an interview, Gilleard said that he has no problem with scientists who want to promote their work. The worry, he says, is that some researchers could try to boost their altmetric scores by making every incremental advance sound like a groundbreaking achievement. “Scientists may start asking themselves if they need to do something that is more palatable for the general public,” he says.

Gilleard concedes that altmetrics can be useful, but urges researchers and funding agencies alike to look beyond the numbers. “It’s not the score that counts, it’s the information behind it.”