This is a free translation of a previous French publication, Faut-il signer l’accord d’Elsevier ? I have put it in the public domain: feel free to share.

France is on the verge of signing a national agreement with Elsevier. The forthcoming deal was announced a week ago by the main negotiator on the French side, Couperin, a consortium of public and research libraries. The announcement is short on details: the total price of the five-year national subscription has not been disclosed.

Yet a much more detailed account has been leaked. Considering the information to be of public interest, a librarian, Daniel Bourrion, published this comprehensive document… before being forced by his administration to withdraw it. Nothing really disappears on the Internet: the document soon reappeared on my blog and has since been widely disseminated. It appears the whole deal will cost 190 million euros — a disturbing amount at a time when French universities are in dire financial trouble.

Couperin claims the deal is quite favorable. While subscription prices have risen continuously for the past 30 years, they were actually lowered this time: « Over the five years, the mean annual price paid by all the parties concerned will be lower than the 2013 cost ».

Not only is the new agreement less expensive, it also includes several significant improvements. The number of partner institutions rises from 147 to over 600. Data mining is no longer left to the law of the jungle, but covered by a global legal framework.

All these points appear quite convincing. Yet they have not shaken my deep conviction: this agreement should not be signed.

Nurturing a speculative monopoly with public funds

Elsevier is no ordinary publisher. This international group, listed on three stock exchanges, earns colossal margins. In 2010, its profits exceeded one third of its revenue: 720 million out of 2 billion. It is the undisputed leader of scientific publishing: the combined revenue of its three main competitors (Springer, Wiley and Informa) equals half its own.

How can such success be explained? Beyond the assumed quality of Elsevier’s services, the company’s commercial strategy is effectively reinforced by a worldwide manipulation of scientific institutions. Elsevier’s Deputy Director of Universal Sustainable Research Access, David Tempest, recently disclosed a staggering justification for the confidentiality of Elsevier’s agreements:

indeed there are confidentiality clauses inherent in the system, in our Freedom Collections. The Freedom Collections do give a lot of choice and there is a lot of discount in there to the librarians. And the use, and the cost per use has been dropping dramatically, year on year. And so we have to ensure that, in order to have fair competition between different countries, that we have this level of confidentiality to make that work. Otherwise everybody would drive down, drive down, drive drive drive, and that would mean that…

This horrible system, in which institutions are free to obtain commercial information, drive down prices and opt for the most interesting offers, has a name: free trade. Elsevier’s monopoly depends entirely on the continued ignorance of its customers. In a well-balanced market, its business would be in serious trouble. And all of this comes at the taxpayers’ expense.

What is Elsevier selling?

Elsevier publications are indirectly subsidized by public funds. Scholars must publish, preferably in recognized journals, in order to obtain grants and career advancement. Writing and evaluating an academic paper requires a lot of time: this significant investment is ultimately paid for by scientific institutions, which frequently happen to be public institutions.

What does Elsevier contribute to all this? Not much. The effective production cost of publications has become marginal, if not nil, now that print editions have been replaced by electronic ones.

The world leader of scientific publishing looks more and more like a cultural industry: it does not trade in commodities, it trades in fetishes. The name of a prestigious journal attracts the same level of symbolic attention as a drawing of Mickey Mouse. It is deeply anchored in the daily life and representations of one or several scientific communities. Although a journal, once bought by Elsevier, has a quite different editorial and social structure from the one that established its preeminence, its aura remains. Metrics tend to perpetuate that aura: the more a publication is cited, the higher its impact factor rises; and the higher the impact factor, the more attention the publication draws from esteemed researchers.

Open access has not fundamentally altered this vicious mechanism. Following the example of Springer, Elsevier has begun to adopt the author-fees model: a researcher buys (generally through his or her institution) a right to publish; the work then becomes freely accessible online. This model reinforces the fetishist trend of the scientific publishing market. Elsevier no longer even has to pretend it sells a commodity: it explicitly claims to monetize symbolic recognition.

In an open access world, value has not only been displaced toward the journal-name star system. Elsevier has also invested heavily in additional services: bibliometric databases like Scopus or ScienceDirect, and an API allowing text and data mining of the whole Elsevier corpus. Neither service is especially good. Scopus has numerous shortcomings: updates are insufficient, bibliographic data are sometimes inconsistent, and only a fraction of scientific publications are taken into account (conference proceedings are completely overlooked). The new data-mining licence creates a highly unworkable framework for data analysis: use of the API is mandatory (robots are forbidden on Elsevier’s usual websites), and queries are limited to 10,000 per week and cannot exceed 200 characters. This rules out many projects: chemical names are often longer than 200 characters. Finally, the licence goes against one of the core principles of intellectual property: the informational public domain. Elsevier claims property rights on the information it publishes: every output of a data-mining project must be published under a non-commercial licence, whether it is an extensive quotation or a mere piece of information.
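To make the 200-character constraint concrete, here is a minimal sketch. The limit value comes from the leaked licence terms described above; the long string standing in for a systematic chemical name is a hypothetical placeholder, not a real compound.

```python
# The leaked licence caps each API query at 200 characters.
QUERY_CHAR_LIMIT = 200

def fits_query_limit(query: str, limit: int = QUERY_CHAR_LIMIT) -> bool:
    """Return True if the query would be accepted under the character cap."""
    return len(query) <= limit

# A short, trivial query fits easily:
print(fits_query_limit("aspirin"))  # True

# A 240-character placeholder, standing in for a long systematic
# (IUPAC-style) chemical name, cannot be submitted at all:
hypothetical_long_name = "methyl" * 40
print(fits_query_limit(hypothetical_long_name))  # False
```

Any query built around such a name would have to be truncated or abandoned, which is precisely why the cap is so restrictive for chemistry projects.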

What makes these additional services especially interesting is that they bring a new currency into the evergreen Elsevier economy: metadata. Consultation statistics and the forms required to obtain a right to read or a right to mine feed a gigantic log of scientific activity. From career evolutions to institutional structures, through the duration and topics of research projects and bibliographic queries, Elsevier knows it all. This extensive profiling can be sold to partners or used directly in Elsevier’s marketing strategy. The publisher is accomplishing a radical transformation that brings it closer to the new web industries, such as Facebook or Google.

This Elsevierian imperialism is not inevitable. The « fetishist » trends of scientific activity were quite acceptable 20 years ago. Days are only 24 hours long: it was clearly more advisable to limit one’s reading horizon to a few trustworthy journals. Digital networks have partly removed this limitation. The growing accessibility of publications and the development of annotation tools have enhanced our reading capabilities. And trustworthiness no longer depends solely on the prestige of a journal.

For instance, the research community surrounding Wikipedia (the « Wiki Studies ») has developed an innovative ecosystem allowing the horizontal exchange, diffusion and legitimation of publications. Works are evaluated and publicized through a collectively written newsletter, the Wikimedia Research Newsletter. They are archived and classified in two community-driven databases, WikiPapers and WikiLit. In such a context, the editorial frame has lost much of its aura. Preprints are more widely disseminated than official versions, as they can be effectively evaluated by the research community.

Do we still need it?

The fundamental shift happened sometime in 2011. According to a study funded by the European Union, since that year more than half of all academic publications have been freely available online.

This estimate is conservative. It only takes into account « official open access » (that is, publication done with the approval of all interested parties: publisher, authors…). Yet there is also a much more significant « dark open access »: researchers putting their earlier publications online without the publisher’s agreement, or outright illegal copies made without even the authors’ consent. Personally, I always manage to find 9 publications out of 10, and my research field is not particularly open-access friendly (mostly social sciences: history, media studies, economics…).

For recent publications, the Elsevier subscription appears quite unnecessary. Besides, the agreement is to last five years. Within the next few years, most Elsevier publications are likely to move into « official open access ». What is the use of paying 190 million for a set of freely available works?

For older publications, the issue is trickier, as open access is not retroactive. An efficient solution would require a modification of the French intellectual property code: creating a fair use. Publications that have already been made profitable could be copied for scientific and pedagogic ends (roughly the terms of US fair use). Scientific institutions could then digitize older Elsevier journals and create a new bibliometric database. 190 million would be much more than enough to fund the whole process, and it would be a one-shot investment: once Elsevier’s corpus is recreated, taxpayers will never have to fund it again.

France would probably not be the only European country to rebuke Elsevier. Germany and the United Kingdom have already initiated bolder open-access policies. A German law passed last year allows, under certain conditions, every researcher to disseminate his or her work 12 months after its initial publication, regardless of the rights of the publisher. The digitization of Elsevier’s corpus could even become a European project.

The importance of being open

The possibilities are numerous: these few ideas do not claim to be a definitive solution to the Elsevier problem. The negotiation between Couperin, the Ministry of Research and Elsevier does not seem to have really taken this wide set of alternatives into account. This is hardly surprising. As a popular French saying puts it: « there are more ideas in two heads than in one ». The negotiation was limited to a few heads, whereas it concerned a lot of people: researchers, librarians and, more widely, the French citizens who will pay the final bill.

An open negotiation would have been quite beneficial. The shortcomings of the deal would have been clearly exposed: the true value of Elsevier’s publications, the efficiency of the additional services, the problematic data-mining legal terms…

Instead, we are headed toward a truly strange situation. Hundreds of millions of euros will be invested in a deal with an opaque company delivering a devalued corpus and questionable services. Meanwhile, French research faces harsh austerity measures that are driving some universities to bankruptcy…