Do scientific papers written by well-known scholars get more attention than they otherwise would because of their authors’ high profiles? As reported in an MIT news release, a new study co-authored by an MIT economist finds that high-status authorship does increase how frequently papers are cited in the life sciences, though with some subtle twists in how this happens.

“We found that there was an effect of status,” says Pierre Azoulay, an associate professor at the MIT Sloan School of Management and co-author of a paper on the subject, published this month in the journal Management Science. But that effect, he adds, is not “overwhelming.”

The study reports that citations of papers increase by 12 percent, above the expected level, when their authors are awarded prestigious investigator status at the Howard Hughes Medical Institute (HHMI), a major private research organization. However, certain kinds of research papers are boosted more than others by the increased prestige that accompanies the HHMI award, Azoulay notes.

“We find much more of an effect on recent papers, published in a short window before the prize,” Azoulay says. Moreover, he adds, the greatest gains come for papers in new areas of research, and for papers published in lower-profile journals. Younger researchers who previously had lower profiles were also more likely to see a change.

“The effect was much more pronounced when there was more reason to be uncertain about the quality of the science or the scientist before the prize,” Azoulay observes.

Identifying the ‘Matthew Effect’

The paper, titled “Matthew: Effect or Fable?,” was co-authored by Azoulay, Toby Stuart of the University of California at Berkeley, and Yanbo Wang of Boston University. The title references the “Matthew Effect,” a term coined by sociologist Robert K. Merton to describe the possibility that the work of those with high status receives greater attention than equivalent work by those who are not as well known.

Positively identifying this phenomenon in scientific paper citations is difficult, however, because it is hard to separate the status of the author from the quality of the paper. It is possible, after all, that better-known researchers are simply producing higher-quality papers, which get more attention as a result.

But Azoulay, Stuart, and Wang have a way to address this issue: They look at papers first published before the authors became HHMI investigators, then examine the citation rates for those papers after the HHMI appointments occurred, compared to a baseline of similar papers whose authors did not receive HHMI appointments.

More specifically, each paper in the study is paired with what Azoulay calls a “fraternal twin,” that is, another paper published in the exact same journal, at the same time, with the same initial citation pattern. For good measure, the authors of the papers in this comparison group were all scientists who had received other early-career awards.
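The matching step described above can be sketched in code. This is a hypothetical illustration, not the authors’ actual procedure or dataset: the `match_twin` function, the field names, and the toy records below are all assumptions, showing only the general idea of pairing a treated paper with the same-journal, same-year paper whose pre-award citation trajectory is closest.

```python
def match_twin(treated, candidates):
    """Pick the control paper whose pre-award citation trajectory
    is closest to the treated paper's, restricted to papers from
    the same journal and year.

    Each paper is a dict with keys 'journal', 'year', and
    'pre_citations' (yearly citation counts before the award).
    """
    # Restrict to the same venue and publication year, as the study does.
    same_venue = [c for c in candidates
                  if c["journal"] == treated["journal"]
                  and c["year"] == treated["year"]]
    if not same_venue:
        return None

    # Squared distance between pre-award citation trajectories.
    def distance(c):
        return sum((a - b) ** 2
                   for a, b in zip(treated["pre_citations"], c["pre_citations"]))

    return min(same_venue, key=distance)

# Toy example: the treated paper and a small candidate pool.
treated = {"journal": "Cell", "year": 1995, "pre_citations": [4, 9, 15]}
pool = [
    {"journal": "Cell", "year": 1995, "pre_citations": [5, 8, 14]},
    {"journal": "Cell", "year": 1995, "pre_citations": [40, 60, 80]},
    {"journal": "Nature", "year": 1995, "pre_citations": [4, 9, 15]},
]
twin = match_twin(treated, pool)
print(twin["pre_citations"])
```

In this toy pool, the first `Cell` paper is selected: the `Nature` paper is excluded despite its identical trajectory, because the study matches within the exact same journal.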

In all, from 1984 through 2003, 443 scientists were named HHMI investigators. The current study examines 3,636 papers written by 424 of those scientists, comparing them to 3,636 papers in the control group.

“You couldn’t tell them [the pairs of papers] apart in terms of citation trajectories, up until the time of the prize,” Azoulay says.

Beyond the overall 12 percent increase in citations, the effect was nearly twice as great for papers published in lower-profile journals. Conversely, Azoulay points out, “If your paper was published in Cell or Nature or Science, the HHMI [award] doesn’t add a lot.”
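The kind of comparison behind a figure like the 12 percent boost can be illustrated with a small sketch. This is not the authors’ estimator; it is a simplified, hypothetical calculation on toy numbers, showing how a post-award citation gap between matched pairs could be averaged into a single percentage.

```python
def citation_boost(pairs):
    """Average relative gap between treated and control post-award
    citation counts, across matched pairs whose trajectories were
    indistinguishable before the award (a simplified illustration).
    """
    gaps = []
    for treated_post, control_post in pairs:
        if control_post > 0:  # skip pairs where the gap is undefined
            gaps.append((treated_post - control_post) / control_post)
    return sum(gaps) / len(gaps)

# Toy data, chosen to illustrate a roughly 12 percent gap:
# (treated paper's post-award citations, twin's post-award citations)
pairs = [(28, 25), (56, 50), (45, 40)]
print(round(citation_boost(pairs) * 100))  # 12
```

The real study estimates this gap with regression methods on thousands of matched pairs; the point here is only the logic of comparing each paper to its twin after the award.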

Toward the scientific study of scientists

Other researchers think the study adds value to the burgeoning data-based literature on the work of scientists. Benjamin Jones, a professor at Northwestern University’s Kellogg School of Management who has read the paper, says the study contains “compelling empirical evidence” and “strongly suggests that eminence itself matters” when it comes to recognition of published papers.

Moreover, Jones adds, it is conceivable that the careers of scientists “might diverge substantially on account of the Matthew Effect, rather than due to the quality of the work itself. This possibility, among others, is an interesting avenue for further research, motivated by Azoulay, Stuart, and Wang’s findings.”

As Azoulay acknowledges, scientists themselves are not always entirely comfortable with citation-based studies, since some feel that citation counts may not capture the quality of certain papers in the first place; worthy research can escape wide notice for extended periods of time.

Still, Azoulay and other scholars have used citation data to glean new insights and quantify observations about the scientific enterprise. For instance, drawing on his own proprietary database of more than 12,000 life scientists, Azoulay has found that bioscience advances are encouraged by longer-term grants with more freedom for researchers, and that physical proximity among scientists increases citation rates, among other things.

The study behind this month’s paper was funded in part by the National Science Foundation.