In an earlier opinion piece, I discussed how impact factor, scientific quality, and writing ability came together in a cycle that did not necessarily select for the best science. In this piece, we'll look at how some commercial publishing houses (Elsevier, in this particular case) are bringing disrepute to the scientific enterprise.

The practice of subscription bundling makes it more economical to buy access to all of a publisher's journals than to buy a substantial subset. We'll look at how bundling combines with the publication of pseudoscientific journals, and with pseudoscientists editing respectable journals, to create a situation where junk science shows up in real research institutes. That, in turn, has an impact on both the credibility of science and its cost.

Consider the example at hand: Elsevier publishes some 930-odd journals, most of them low impact, so if you had any results worth a damn, you would probably publish them elsewhere. Elsevier could charge a premium for journals containing very high quality science, but that doesn't appear to be what it does. Instead, its charges are such that, in at least one case, Elsevier journals constitute two percent of a library's subscription catalog but 20 percent of its subscription fees. This pricing policy is designed to encourage bundling.
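To see just how lopsided that two-percent/20-percent split is, here is a quick back-of-the-envelope calculation. Only the two percentages come from the figures above; the catalog size and budget are invented round numbers for illustration:

```python
# Illustrative numbers only: the 2%/20% split is from the article;
# the catalog size and budget below are invented round figures.
total_journals = 10_000        # hypothetical library catalog size
total_budget = 1_000_000       # hypothetical annual subscription budget, dollars

elsevier_journals = 0.02 * total_journals   # 2% of the titles
elsevier_fees = 0.20 * total_budget         # 20% of the fees

per_elsevier_title = elsevier_fees / elsevier_journals
per_other_title = (total_budget - elsevier_fees) / (total_journals - elsevier_journals)

print(per_elsevier_title)                   # 1000.0 dollars per Elsevier title
print(round(per_other_title, 2))            # 81.63 dollars per non-Elsevier title
print(round(per_elsevier_title / per_other_title, 2))  # 12.25x more expensive
```

Whatever absolute numbers you plug in, the ratio works out the same: an Elsevier title costs this hypothetical library over twelve times as much as the average title from everyone else.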

Bundling is Elsevier's practice of offering steep per-journal subscription discounts if you take the lot. The numbers quoted above are from Cornell's bundled subscription deal, so one can only imagine what the unbundled per-journal fees must be. Now, bundling wouldn't be so bad if Elsevier were flexible about it, but it doesn't seem to be. Don't have a mathematics department? Tough, you get Applied Numerical Mathematics anyway, or you pay per-journal.

This practice is not just designed to increase Elsevier's bottom line; it is also designed to make Elsevier appear larger than it actually is. That is because Elsevier publishes many specialized journals that very few people actually want. These journals are nearly worthless to Elsevier in every respect except two: they enable Elsevier to claim the largest single library of peer-reviewed journals, and they let it advertise that all these libraries subscribe to its journals (the Journal of Podunk Economics must be important if Harvard takes it).

This practice suggests to me that Elsevier primarily regards science as a vehicle for making money, with quality a distant second. For those of you thinking that this is a "well, duh" moment: the Nature Publishing Group places a much higher value on quality and still maintains profitability, so the two are not incompatible. Indeed, Nature has taken the opposite tack, cultivating the impression that publishing in its journals is the height of scientific achievement.

Elsevier's expensive bundled subscription policies have serious consequences for the amount of research performed at universities. The money to pay for subscriptions typically comes from the agencies that provide research grant money. When calculating the budget for a research grant, researchers must include overhead costs, which pay for things like janitorial services, computer network infrastructure, and journal subscriptions. Now, contrary to what many people think, scientists are usually very cost conscious: we want to give and get value for money. We accept that instruments may be very expensive because of the limited market. We accept that graduate students and post-docs must be paid. We accept that experimental work should contribute to the maintenance of a mechanical workshop, even when it doesn't directly use one.

Every dollar that goes into these overhead costs is a dollar that doesn't go into science.

In general, university overheads are high, and the library constitutes a significant proportion of that cost. Elsevier has, in effect, an umbilical cord attached to just about every granting agency on Earth. Forcing Elsevier to change its prices will not remove the umbilical cord, but could see the umbilical cord reduced in diameter.

The argument Elsevier would make is that it publishes valuable specialist journals, where small subscriber numbers warrant high prices. Several factors undercut this claim. Electronic distribution, now the main delivery vehicle for science journals, makes subscriber numbers far less relevant. In fact, fewer subscribers mean fewer downloads, which means lower hosting costs for those journals, especially since a single database and front end serve all Elsevier journals. Elsevier also uses the same format for nearly all of its journals, so typesetting costs are widely distributed. The only place it could claim increased cost is editorial staff, except that most journals are edited by academics on a volunteer basis. There doesn't seem to be a valid economic justification for Elsevier's pricing structure beyond the inherent value of its journals.

So, are they worth it? Let's take a look at some examples. Exhibit A must be the journal Homeopathy. Homeopathy is not science. The journal has negative scientific value because it does not distribute scientific knowledge but rather disseminates wishful thinking about reality. It is the very essence of anti-science. Yet here it is, a peer-reviewed "scientific" publication being foisted upon universities through subscription bundling: a wedge, if you will, of pseudoscientific thinking driven right into the heart of science.

Exhibit B is Chaos, Solitons &amp; Fractals, a real mathematical journal that was once a respectable vehicle for scientific communication. Now, however, its Editor-in-Chief is one M. S. El Naschie, who has managed to publish some 300 peer-reviewed papers in his own journal. By itself, this would be an abuse of position, but it's actually worse than that. El Naschie is apparently a numerologist. Yes, that's right, one of those idiots who spend their time looking for mystical significance in integers.

No matter where you look, numerical coincidences occur. But coincidences are not the subject of science; in fact, much of science involves demonstrating that data aren't the result of coincidence. El Naschie would never have a voice in an adequately edited scientific journal, and any journal edited badly enough to let numerology in would normally be shunned by the scientific community. It is only Elsevier's drive for profit, even at the expense of its own credibility, that lets this sort of situation occur.

It's not clear that Chaos, Solitons &amp; Fractals can be rescued. It may not even deserve to be; mathematicians are publishing in other journals now. Even if the journal were to recover, that wouldn't solve the more general problem posed by bundling. Only if Elsevier charged a reasonable subscription fee for each journal, one that reflected not just its cost but also its significance, would a journal like this be compelled to improve or fold.

This approach could solve the general problem of poor content and poor editorial choices: if the quality of a journal falls, or the journal fills up with pseudoscientific garbage, subscriptions get cancelled. For that to work, libraries will need to start analyzing usage patterns more carefully. Has anyone downloaded a paper from Chaos, Solitons &amp; Fractals since it turned into a journal of numerology? If the mathematics department at your local university knew about its content, would it still want it in the library? These questions should be subjected to regular review, but the bundling practice makes asking them pointless. Universities should be able to cancel these subscriptions without facing a huge increase in the fees for the rest.
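The usage review described above is trivial to automate once a library has download logs. Here is a minimal sketch, assuming a log keyed by journal title; the titles, counts, and cancellation threshold are all invented for illustration:

```python
# Hypothetical download counts per journal over the past year.
# All titles and numbers here are invented for illustration.
downloads = {
    "Physical Review Letters": 4200,
    "Chaos, Solitons & Fractals": 3,
    "Homeopathy": 0,
    "Applied Numerical Mathematics": 57,
}

# Assumed cutoff: any title with fewer downloads than this gets flagged
# for review. A real library would pick this based on cost per download.
CANCEL_THRESHOLD = 10

flagged = sorted(title for title, n in downloads.items() if n < CANCEL_THRESHOLD)
print(flagged)  # candidates for cancellation, if bundling allowed it
```

The analysis is a one-liner; the obstacle is not technical. Under a bundled deal, the flagged list is useless, because dropping those titles would trigger the per-journal pricing on everything else.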

It would be nice to think that Elsevier will listen to scientists, but I suspect that this will not happen until scientists get a little more strident. If you are a scientist, publish your work in society journals rather than Elsevier journals. Try to avoid citing work published in Elsevier journals. Elsevier lives by a combination of pricing and impact factor, and scientists have direct control over only one of these: impact factor. Librarians could start looking at Elsevier journal usage patterns; perhaps they can follow Cornell's example and subscribe to just a few Elsevier journals.

I don't often use Ars Technica as a podium, but Elsevier's practices cut to the very heart of science as a profession: they reduce the ability to perform research, and they reduce the credibility of the profession.