Journal rankings are a rigged game. The blacklisting of history of economic thought journals is neither a fluke nor a conspiracy: it exposes how citation rankings really work

As usual, in June, Clarivate Analytics—the company that owns the influential Web of Science bibliographic database—published the list of journals excluded from the Journal Citation Reports (JCR) for “anomalous citation patterns.”

This year’s announcement sent shockwaves through the small but tightly integrated international community of historians of economic thought: three of the discipline’s main journals were blacklisted—

Journal of the History of Economic Thought (JHET), published by the American Society of History of Economic Thought (Cambridge University Press),

European Journal of the History of Economic Thought (EJHET), published by the European Society for the History of Economic Thought (Routledge),

History of Economic Ideas (HEI), published by the Italian publisher Fabrizio Serra Editore.

Because of their exclusion, these three journals will not receive their Impact Factor (the process is explained in greater detail here). Technically speaking, the Impact Factor (IF) of a journal is calculated as the average number of citations received in a reference year (in this case 2017) by articles published in the journal in the previous two years (2015 and 2016). In practice, however, the Impact Factor functions as a proxy for the relative importance of a journal within its field.
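The two-year IF is simple arithmetic. As a minimal sketch (the journal and the numbers below are purely illustrative, not taken from the JCR):

```python
def impact_factor(citations_in_ref_year: int, citable_items: int) -> float:
    """Two-year Impact Factor: citations received in the reference year
    (here 2017) to articles published in the previous two years (2015-2016),
    divided by the number of citable items published in those two years."""
    return citations_in_ref_year / citable_items

# Hypothetical journal: 84 citations in 2017 to its 60 articles of 2015-2016.
print(impact_factor(84, 60))  # 1.4
```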

This case is relevant for at least three reasons:

It draws attention to the publishing and citation practices and strategies adopted by scientific communities;

It reveals that these practices are, in fact, sanctioned only in the most blatant cases, while widespread strategic behavior is tolerated, rewarding the largest scientific networks;

It shows how incentives created in one country can have consequences for a whole scientific community defined at the world level.

This is not an error by Clarivate, nor a rotten apple in an otherwise sound system: it is a system fundamentally built on the wrong incentives. At the same time, the case of the history of economic thought shows that attempts by small scientific communities to adapt to the system of citation rankings will ultimately fail. They should instead sit out the rigged game of journal rankings entirely.

The Definitions

What makes a citation pattern “anomalous” is, ultimately, a question of definitions. For Clarivate, there are two types of anomalous patterns:

the abnormal use of self-citations by a journal to pump up its Impact Factor;

citation stacking, that is, the repeated and unusual exchange of citations within a small group of journals over the IF citation window.

The three history of economic thought journals fell into the latter category: in particular, JHET and EJHET benefited (as “recipients”) from citations coming from HEI (the “donor”). This means that during 2017, HEI cited, at an anomalous rate, articles published in JHET and EJHET during the previous two years (2015 and 2016).

What counts as an anomaly remains to be explained: in the case of citation stacking, the evidence is based on the values of two indicators:

An “anomalous” percentage of citations from the journals involved among those included in the calculation of the (numerator of the) IF. In our case, according to Clarivate, HEI generated 56% and 64% of the citations entering into the calculation of the IF of EJHET and JHET, respectively.

An “abnormal” concentration of citations to articles published in the years used to calculate the IF. To this end, for each journal, Clarivate analyzes citations made in 2017 to articles published in each of the years 2014-2017. In our case, 86% and 88% of the citations donated by HEI in 2017 and received by EJHET and JHET, respectively, were to articles published in 2015 and 2016 (and only 14% and 12% to articles published in 2014 and 2017). In the following graph I have plotted this anomaly, calculated on Web of Science (WoS) data.
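The two indicators reduce to simple shares. A sketch, using hypothetical citation counts chosen only to reproduce the orders of magnitude Clarivate reported for JHET (the real underlying counts are not fully public):

```python
def donor_share(from_donor: int, total_if_citations: int) -> float:
    """Indicator 1: percentage of the recipient's IF-numerator citations
    that come from a single donor journal."""
    return 100 * from_donor / total_if_citations

def window_concentration(citations_by_year: dict, if_years: tuple) -> float:
    """Indicator 2: percentage of the donor's citations that target
    articles published in the two IF years (here 2015 and 2016)."""
    in_window = sum(citations_by_year[y] for y in if_years)
    return 100 * in_window / sum(citations_by_year.values())

donated = {2014: 3, 2015: 40, 2016: 48, 2017: 9}  # hypothetical counts
print(round(donor_share(88, 137), 1))               # 64.2 (cf. Clarivate's 64%)
print(window_concentration(donated, (2015, 2016)))  # 88.0 (cf. Clarivate's 88%)
```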

All the evidence indicates that an anomaly occurred. As a consequence, HEI (as donor) and EJHET and JHET (as recipients) were excluded from the 2017 edition of the Journal Citation Reports.

Tracing the Origins of the Citation Stacking

At this point we have to ask: do these thresholds mean that, below them, misbehavior does not exist, or merely that its impact is negligible? And if the latter, what conditions favor the diffusion of these practices, and what are their consequences?

Citation stacking is a well-known phenomenon in bibliometrics. There are many ways of pumping up a journal’s citation indicators, such as citation collusion between journals, as has been well documented.

The case of the history of economic thought journals is of particular interest because it does not involve the usual citation cartel, but a donor journal that took no direct advantage from the scheme.

It is not difficult to understand what happened: in 2017, HEI published an article, hereafter Lange et al.

Lange et al. is a survey of contributions to the history of economic thought published during 2015 and 2016 in other journals of the field. The article contains 212 bibliographical references, almost four times the average: the 21 HEI articles indexed by WoS in 2017 contained a total of 1,267 bibliographical references, an average of about 60 per article.

Lange et al. alone generates 170 valid citations in the WoS database (42 references are to “non-source” items, that is, materials not indexed in WoS). The following table shows the journals that benefited from the citations of Lange et al.:

EJHET and JHET, the two journals suspended together with HEI, are among the journals that benefited most from the citations of Lange et al. The citations generated by Lange et al. represent 63.1% of the total citations JHET received in 2017, and 53.5% for EJHET. These percentages are very close to the values of Clarivate’s first anomaly indicator; they also indicate that very few citations to articles published in EJHET and JHET in the previous two years came from other articles published in HEI in 2017.

History of Political Economy (HOPE) is the journal that benefited most from the citations of Lange et al., yet Clarivate did not include it on the blacklist. We will come back to that shortly.

Two Issues to Clarify

To get to the bottom of this, we must answer two questions:

The abstract of Lange et al. states clearly that it is the second survey article published by HEI: in 2016, HEI published an article by Giulia Bianchi surveying the history of economic thought articles published in 2014-2015. Why did Clarivate not detect any anomaly in 2016?

If HOPE is the journal that benefited most from the citations of Lange et al., why was it not blacklisted by Clarivate?

To answer the first question: Bianchi’s 2016 survey contained far fewer references (69) than Lange et al., generating a number of citations (56) that did not trip the two anomaly indicators calculated by Clarivate. Bianchi (2016) contributed 17 citations to the calculation of JHET’s IF and only 9 to EJHET’s; in this case too, the major beneficiary was HOPE, with 18 citations counting toward its IF (on this point, the information in the letter sent to Clarivate by the editors of EJHET and JHET is incorrect).

The answer to the second question is more complicated. From the data we have been able to reconstruct, of the 92 citations received by HOPE in 2017, 48 come from Lange et al., equal to 52%. We may conjecture that Clarivate considers 55% the critical threshold for the first indicator (EJHET was at 56% and JHET at 64%), since, as the following graph shows, on the second indicator there are no substantial differences between the blacklisted journals and HOPE.

Who Gains and Who Loses from Clarivate’s Decisions?

Up to this point, I have analyzed the facts behind Clarivate’s decision. Let me now begin to make some more general reflections.

The first is only apparently of a technical nature.

Journals guilty of citation stacking have been removed from the IF calculation. But, for the moment, all the “anomalous” citations remain in the WoS database. Therefore, all the articles and journals cited by Lange et al. continue to enjoy a citation advantage generated by a practice that Clarivate has decided to sanction.

In particular, thanks to the HEI article, HOPE’s IF rose from 0.595 in 2016 to 1.415 in 2017, five times its average for the years 2011-2015.

If the citations from the HEI article are removed from HOPE’s IF calculation, its IF would stand at 0.662, a value in line with the journal’s citation history.
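This back-of-envelope calculation can be reproduced from the figures above. Note two assumptions: the denominator (the number of citable items) is inferred here from the published IF rather than taken from the JCR, and the result differs slightly from the 0.662 cited above because only citations falling inside the two-year window actually count toward the IF:

```python
if_2017 = 1.415            # HOPE's published 2017 IF (from the JCR)
citations_total = 92       # citations HOPE received in 2017 (from the text)
citations_from_lange = 48  # of which from Lange et al.

# Inferred, not taken from the JCR: citable items published in 2015-2016.
citable_items = citations_total / if_2017   # ~65 items
adjusted_if = (citations_total - citations_from_lange) / citable_items
print(round(adjusted_if, 2))  # 0.68, close to the 0.662 cited above
```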

In the following graph I have plotted the IF trends of the journals for 2011-2016, together with my calculated 2017 IF values for HEI, EJHET and JHET. The HOPE figure for 2017 is the JCR’s. As you can see, the effect of citation stacking on HOPE is quite similar to what JHET and EJHET would have experienced had they not been sanctioned by Clarivate.

It is also possible to calculate the IF trend of the history of economic thought journals net of the effects of citation stacking. The following graph shows that, absent citation stacking, the IF values would have remained at the usual levels of recent years.

A Technical Incident That Is Easy to Solve?

The temptation to dismiss the case as a “technical accident” is very strong. Two arguments are offered in its support, but neither appears to me to be robust.

According to the first argument, advanced by the editors of EJHET and JHET in a joint letter to Clarivate, it is only the small size of the field that led Clarivate to erroneously judge “usual” behavior (the publication of a survey) as anomalous. It would thus be enough for Clarivate to revisit its decision, or to allow editors to make their case before their journal is blacklisted.

That the history of economic thought is a highly specialized disciplinary field with sui generis citation behavior is certainly true: I have discussed this here. And it is equally true that the small size of the field made it easy for Clarivate to detect the citation anomaly. But the anomaly remains an anomaly, and it can hardly be considered a technical error, as I have shown above.

The second argument concerns the alleged “anomaly” of the episode: HEI raised the IFs of other journals in its field through its survey without deriving any benefit itself. The managing editor of HEI, commenting on a post at The Scholarly Kitchen, wrote that JHET and EJHET had no role in the affair.

What happened to the history of economic thought journals is not, however, an anomaly in the panorama of scientific literature. It is true that the best-known form of citation stacking is generated by collusion between journals that agree to exchange citations. But there are many documented precedents similar to this case. I will limit myself to the most sensational one, which concerned Chaos, Solitons and Fractals and its then editor-in-chief Mohamed El Naschie, and which, thanks to citation stacking, propelled Egypt’s University of Alexandria to prominence in international rankings. In that case the citations to Chaos, Solitons and Fractals came largely from the International Journal of Nonlinear Sciences and Numerical Simulation (a story recounted here).

The Italian Institutional Landscape

The comments of international readers probably miss the institutional background of this affair. In Italy, in the social sciences and humanities, the massive research evaluation exercises (VQR), scholars’ careers (the national qualification for professorship: ASN) and even their eligibility to sit on doctoral boards depend on journal rankings developed by ANVUR, the ministerial evaluation agency.

These rankings are also based on bibliometric indicators calculated from WoS (VQR) and Scopus (VQR and ASN). It is therefore in the interest of a scientific community to channel citations to the journals of its field in order to push them up the rankings. In particular, for an individual’s career it is essential that certain journals be rated “A” in their research field.

Let me be clear: I am not arguing that HEI published the surveys with the sole purpose of pumping up citations to history of economic thought journals. I am arguing that the bibliographical references contained in the HEI surveys are precious for defending and strengthening the position of those journals in the rankings on which the fate of Italian scholars depends.

The citations generated by HEI allowed JHET, EJHET and HOPE to substantially improve their standing in the Scopus database, whose indicators are the ones used in the journal rankings currently adopted in Italy for the ASN (and Scopus is completely impermeable to the decisions of its competitor Clarivate).

Scopus indicators for the journals of the history of economic thought in 2013

Scopus indicators for the journals of the history of economic thought in 2017

The story of the history of economic thought journals shows how rules imposed in one country can dramatically change the background conditions under which scientific communication takes place, and how a flap of a butterfly’s wings in Italy (a rule defined by the evaluation agency) can cause a hurricane in the international system of scientific communication (the blacklisting of the main journals in the history of economic thought).

This anomaly was detected in the history of economic thought because of the small size of the field; but how many other journals are pumping up their citations to climb the rankings without being “discovered”?

The Bottom Line

In a discussion on social media, Yann Giraud argued that a “private company acts as a cop in science;” and Beatrice Cherrier wondered if WoS’s intention was not to “kill small (hence non profitable) fields.”

More prosaically, Clarivate must safeguard its flagship product: Web of Science. Therefore, it must ensure that citation indicators appear solid and difficult for editors and researchers to manipulate. Put more explicitly: Clarivate must keep the level of bibliometric doping within acceptable thresholds.

To this end, every year Clarivate publishes a blacklist of reprobates, so that everyone, editors especially, can calibrate their citation behavior and keep the level of doping below a predefined standard: do not increase the percentage of self-citations too much; do not stack citations above a certain threshold.

Every year, for various reasons, someone tests positive at the anti-doping control. The culprits and their punishment are announced in the public square: for a year the journal will have no IF. But the inflated data remain in the database.

Unless the logic of using journal rankings (bibliometric or otherwise) to evaluate the quality of research is abandoned, there is no way out: it is pointless to complain about the bad cop when it is the scientific community itself that handed the role of cop in science to Clarivate and Scopus.

The community of historians of economic thought has shown no sign of wanting to withdraw from the logic of journal rankings, except to complain about Clarivate’s rules when things go badly. To see this, it is enough to note that JHET’s Impact Factor is featured on the journal’s home page.

For Italy the situation is even more delicate, because it is the government evaluation agency that has given a small group of professors the task of acting as the cops of “state science,” also by using data from Clarivate and Scopus. The Italian scientific communities have responded with some buzz and chatter around the coffee machines, and above all with a hasty race to occupy the seats available in the great game of Italian evaluation.



Final Considerations

This story could be the occasion for even the small community of historians of economic thought to rethink its strategies of scientific communication.

When the European Science Foundation published a ranking of scientific journals in the humanities in 2008-2009, the history of science journals signed a very strongly worded document declaring their refusal to be indexed. The ESF has since refrained from publishing journal rankings.

The Declaration on Research Assessment (DORA) includes explicit commitments for journal publishers, but no history of economic thought journal has signed up to its principles.

There are dozens of open science experiments aimed at overcoming the rigidities of scientific communication imposed by the large international publishers and by the companies that produce citation indices. A small community could easily cooperate to take some of these paths, instead of barricading itself in the blockhouse to defend the citation indices of its journals in national and international rankings.

This event could also be an opportunity for the small scientific community of historians of economic thought to take a firm stand against evaluation practices based on the ranking of journals.

This article was originally published in Italian by ROARS