An October 4 paper in Science, based on a “sting” operation by John Bohannon, a science writer at Harvard University, reveals the pitfalls of open access scientific publishing. Bohannon has done to selected open access scientific journals what Alan Sokal did to the academic journal Social Text. He submitted a paper describing entirely fabricated results on the anticancer properties of a chemical extracted from a lichen to 304 open access journals and found that a majority actually accepted it: 157 journals accepted the paper, 98 rejected it, and of the remainder, 20 said the paper was still under review and 29 appeared to be derelict.
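The tallies reported above can be checked with a few lines of arithmetic (a sketch; the counts are simply those stated in the article):

```python
# Outcomes reported in Bohannon's Science article for the 304 submissions.
outcomes = {"accepted": 157, "rejected": 98, "in review": 20, "derelict": 29}

# The four categories should account for every journal targeted.
assert sum(outcomes.values()) == 304

# Acceptance rate among journals that actually reached a decision.
decided = outcomes["accepted"] + outcomes["rejected"]
rate = outcomes["accepted"] / decided
print(f"{rate:.0%} of journals that reached a decision accepted the paper")
```

Roughly three in five journals that completed review accepted the bogus paper, which is the sense in which “the majority” accepted it.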

What is news for us is that about one-third of the journals that accepted the paper (64, to be precise) were based in India!

Of the 79 journals based in India that were tested, 64 accepted the paper and 15 rejected it. Bohannon writes that even when the editors and bank accounts are based in India, the companies that ultimately profit from the business may be based in the U.S. or Europe. Some of the giants of scientific publishing (Wolters Kluwer, Elsevier and Sage) owned some of the journals that accepted the paper. When presented with the results of the “sting”, they took suitable action; Wolters Kluwer, for instance, has shut down The Journal of Natural Pharmaceuticals.

The next largest base was the U.S.: among the journals tested there, 29 accepted the article for publication and 26 rejected it.

The writer mentions that PLOS ONE was one journal that rejected the paper, as did Hindawi, a massive open access publisher based in Cairo. Pharmaceutical Technology and Drug Research, the National Journal of Community Medicine and Current Botany were among the Indian journals that rejected the paper.

Apart from exposing the poor quality of peer review at the journals that accepted the paper, the operation also revealed that some journals whose names suggested they were located in America or Europe were actually based in the developing world.

The experiment seemingly casts doubt on the credibility of open access publishing. That conclusion is not fully warranted, however, both because not all open access journals were included in the experiment and because journals that are not open access were not tested for comparison. Moreover, while a lot of thought went into selecting the sample of 304 journals, it is still not an exhaustive list.

It is debatable whether this sort of slip in the peer review process can be treated statistically at all, and with such a small sample at that. We must therefore hesitate to extrapolate the percentages and draw inferences about the quality of open access journals and the open access model as a whole.

That said, there is a great need for a good system of accreditation for journals, open access and otherwise, one that rests on more than popular perception and citation indices.