We live, we like to think, in a reasoning age, if not always a reasonable one. Over the past century we have seen spectacular advances in our understanding of the universe. We now have a fairly coherent, if incomplete, picture of how our planet came into being, its age and place in the cosmos, and how the physical world works. We, clever monkeys that we are, understand the processes that lead to earthquakes and volcanic eruptions, and the factors that influence climate and weather. We have seen the rise of molecular biology and major improvements in public health and medicine, giving billions of people longer, healthier lives.

Indeed, life expectancy is on the rise nearly everywhere. Infant mortality continues to plummet. Humanity has actually managed to eradicate one of the greatest scourges of its existence — smallpox — and we are well on the way to destroying another — polio. It is astonishing, this triumph of reason. As a species, we should be proud.

But of course it is not that simple. As the ideals and technological spin-offs of the Enlightenment make our world ever more unified, unreason continues to flourish. This is something that many thinkers find to be as puzzling as it is distasteful.

In December 2011, the Academia Europaea (a European academy of humanities, letters and sciences) organised a conference at Cambridge University, ‘Reason and Unreason in 21st-century Science’, to examine the nature and causes of unreason, and its possible cures. I took part in the talks and edited the subsequent transcript, which will be published later this spring. The experience gave me a fascinating insight into the exasperation that many scientists feel at the primitivism that is holding us back.

Let me give one example. The brilliant biotechnologist Ingo Potrykus, emeritus professor at the Federal Institute of Technology in Zurich, and his colleague Peter Beyer, professor of cell biology at the University of Freiburg in Germany, have developed a modified form of rice in which beta-carotene, the precursor of Vitamin A, is produced in the kernel, or the bit you eat (it is normally present only in the leaves, but of course we throw those away). Vitamin A deficiency is not a problem in the West. In the Third World, however, where people depend on rice as a staple and often eat little else, it affects something like 400 million people, irreversibly blinding around half a million children a year.

Isaac Newton believed in alchemy, which was pretty woo-woo even by the standards of the 17th century

Their ‘Golden Rice’ would solve this problem at a stroke. This GM variety is no more expensive to grow than normal strains, and it requires no special chemicals or tie-ins with big biotech firms. In fact, Potrykus told the conference it would be free for poor and subsistence farmers. It tastes the same as normal rice. And it has been available since 2000. In a sane world, it would have earned Potrykus and Beyer a Nobel Prize. Yet not a single child in Bangladesh, India, the Philippines or Cambodia has benefited from this new crop.

The reason is simple: relentless and well-funded campaigns against transgenic technology by (mostly European) NGOs and Green campaigners. Their efforts have led to bans on Golden Rice in the very countries where it could save millions of lives. These warriors against ‘Frankenfoods’ are, even if inadvertently, to blame for the blindness of maybe 3 million children. As Potrykus said at the conference: ‘If our society will not be able to “de-demonise” transgenic technology soon, history will hold it responsible for death and suffering of millions: people in the poor world, not in overfed and privileged Europe, the home of the anti-GMO hysteria.’

What lies at the root of this panic, and others like it? One factor that is often ignored by champions of reason is that science is hard, and getting harder. In the mid-19th century, the ideas of British naturalists such as Charles Darwin and Alfred Russel Wallace took hold in part because they were so simple and intuitive (and in part because Darwin was such a clear writer). In those days, it was just about possible for an educated layman to get a grip on the cutting edge of science, medicine and technology. The same feat would be laughably impossible today. The intellectual giants of the 19th century were probably the last humans alive able to know just about everything important that could be known. Today, it is a challenge to know everything about even a tiny subset of knowledge. There are professional scientists who know nothing more than laypeople (and often rather less) about the world outside their own narrow disciplines. It is hard to become a molecular biologist, or a doctor, or an engineer. Yet it is relatively easy to grasp the ‘precautionary principle’ — the belief that, in the absence of scientific proof that something is harmless, we must assume that it is harmful. But, as Lewis Wolpert, professor of cell and developmental biology at University College London, has pointed out, this addled creed would have led early humanity to ban both fire and the wheel.

So perhaps we shouldn’t be surprised at the proliferation of courses in alternative medicine that erupted like boils throughout Britain’s universities in the early 1990s. It might have less to do with human credulity than with the fact that squirting coffee up people’s bottoms or dangling crystals over their bosoms is easy, whereas acquiring the biochemistry and anatomy needed to be a proper doctor is very difficult.

That inestimable scourge of quackery, David Colquhoun, honorary fellow in pharmacology at University College London, has been waging a 10-year war against ‘magic medicine’ with some success. Most of the wackier courses, such as Spiritual Healing — which Colquhoun described in the Financial Times in 2009 as ‘tea and sympathy, accompanied by arm waving’ — and Angelic Reiki — which he said was ‘excellent for advanced fantasists’ — have now disappeared. Increasingly, it is only the more respectable backwaters of alternative medicine, such as acupuncture, that are still replenished by tuition fees and state funding. A collective embarrassment seems to have taken hold in the chancelleries of the new universities.

It might seem daft that a civilisation that has eradicated smallpox could still allocate public money to teach crystal therapy. But should the continuing existence of unreason still give us cause for despair? Even in the Golden Age of the Enlightenment, when we like to imagine learned men (and the occasional woman) meeting in the salons of Edinburgh, London and Paris to discuss Boyle’s Law and the democratic ideals of Thomas Jefferson, there was plenty of silliness. To take just the most prominent example: Isaac Newton, probably the cleverest man ever to have made a living out of his cleverness, invented physics — but also believed in alchemy, which was pretty woo-woo even by the standards of the 17th century.

In the same way, a great deal of intellectual sloppiness and downright lunacy accompanied the great rational flowering of the 19th century. Darwin himself was sensible on most matters, but you couldn’t say the same for his followers. His cousin Sir Francis Galton, a humane and brilliant man in many ways, believed that clever people should be paid to marry each other and have extra children ‘for the improvement of the race’. Many other eminent Victorian and early 20th-century thinkers cleaved to a crude ‘scientific’ racism. Alongside the development of evolutionary biology, nuclear physics and relativity came phrenology, spiritualism (Alfred Russel Wallace was a fan), occultism and a plague of quackery and snake oil that could have occupied Professor Colquhoun’s forebears for the rest of their days.

Science is not a well-maintained Swiss watch so much as a ramshackle, creaking machine held together with shims and bodges

Are things any better today? In some respects, perhaps they are worse. Scientists are distrusted in a way they were not 100 years ago. The whole scientific enterprise looks to many like some sort of sinister conspiracy, created by the industrial establishment to make money at the expense of our health and our planet. ‘Science’ (rather than greed, incompetence, laziness or simple expediency) gets blamed for the degradation of our environment, pollution and threats to species. In the internet age, conspiracy theorists prosper. Comments such as ‘the Moon landings were faked’, ‘medicine kills far more people than it saves’, or ‘vaccines do more harm than good’ gain a spurious truth through repetition. We live in the era of the instant, self-proclaimed expert. Furthermore, the media loves conspiracies. The idea that, for example, the HIV-AIDS link was either a gigantic mistake or some sort of pharmaceutical fraud was too good a story to miss, despite the fact that it was obviously untrue.

It is not enough to dismiss this kind of scepticism as irrational, insane or evil. In many cases, unreason emerges from a complex interplay of religious faith and dogma, well-meaning concern and an attachment to that dreaded precautionary principle. Add in intellectual inertia, some well-founded suspicion of certain scientific enterprises (the activities of some pharmaceutical companies, the historical secrecy of the nuclear industry, resistance to anti-pollution measures and so forth), not to mention simple misunderstanding, and you have a heady mix.

We must also accept that reason doesn’t always live up to its own standards. The motto of the Royal Society is Nullius in verba — ‘Take nobody’s word for it’. In reality, though, science is dominated (like any field) by the great and the good, grandees whose word is taken as read. The world of reason is itself riddled with feuds, egotism and, occasionally, downright fraud. Scientists and doctors are people, not machines. They are driven by the same forces that motivate professionals of any kind — which include money, sex, and the desire to be respected, liked and even feared, alongside the more noble impulses of curiosity, determination, professionalism and perfectionism. And so we must accept that corners can be cut, publications can be biased, and the peer-review system can be corrupted. To paraphrase Winston Churchill’s dictum on democracy, peer review is the worst system there is for evaluating scientific claims, except for all the others. Plenty of things look like science but are not; the American physicist and Nobel laureate Richard Feynman called this ‘cargo cult science’. Many of the psychological findings that make their way into the newspapers — together with ‘formulas’ for the perfect love-match, the perfect day, or the perfect sandwich — are no more scientific than angelic reiki. We must accept that science is not a well-maintained Swiss watch so much as a ramshackle, creaking machine held together with shims and bodges.

And so we come to religion, the oldest ‘unreason’ of all. The question of how to define the relationship between science and faith has occupied minds great and not-so-great for centuries. The answer, such as it is, is clearly a muddle. The late American evolutionary biologist Stephen Jay Gould’s concept of ‘non-overlapping magisteria’ is currently rather unfashionable, but it describes well the conflict, or lack of it, that exists between most forms of religious belief and science.

It is true that the kind of basic science that is taught in schools, for instance, does occasionally contradict widespread supernatural beliefs. In most cases, the severity of these conflicts has been overstated — as indeed it was in the past. The great shouting match between Bishop Samuel Wilberforce and the biologist Thomas Henry Huxley at a public debate in Oxford in 1860 was nothing of the sort, and the theologian and the man they called ‘Darwin’s bulldog’ remained friends afterwards. That seems a good model for how to handle such irreconcilable disagreements.

A separate question is the role of faith in science, and the scientific explanation (if there is one) for faith itself. Is some sort of mysticism, a predilection for unreason, hard-wired into the human brain? Very few cultures or societies have lacked religion. When supernatural belief systems are absent, secular religions such as Marxism-Leninism, Nazism or the peculiar personality cults of North Korea quickly emerge to take their place. And so it seems unlikely that unreason will ever disappear entirely.

However, it can be kept in check — and this is good news not only for the eradication of polio, Vitamin A deficiency and so forth, but for unexpected spin-offs as well. One of the peculiarities of history is the extraordinary decline in human violence — recently charted by the psychologist Steven Pinker in The Better Angels of Our Nature (2011) among others — that seems to have accompanied the explosion of health and wealth brought about by science. It is possible that reason and reasonableness go hand in hand.

Nevertheless, it seems likely that unreason, like the poor, will always be with us. We are not going to see a future of ‘brainy people sitting about in togas swapping theorems’, as the English science fiction writer Michael Moorcock put it in The Guardian in 2008. In amusement arcades you often find a game called ‘Whack-a-Mole’. It consists of a table with a dozen or so holes in it. When you put your money in, little plastic or wooden critters pop their heads out of the holes. The game is to thwack as many of them as possible, with a mallet, in the short interval before they go back underground. The more you hit, the more, it seems, they pop up.

The fight against unreason is something like this, too. Up pops witchcraft — thwack! — only to be replaced by alchemy. Here is institutionalised religious fundamentalism — but then, kerpow! — up pops Victorian mysticism. In the 20th century, ‘scientific’ racism, phrenology and eugenics were all given the mallet treatment, only to see homeopathy, reflexology, anti-vaccination and GM hysteria pop up to take their place. My guess is that the well of unreason will never run dry — indeed, I suppose it will always contain roughly the same amount of liquid. We just have to hope that this liquid becomes less toxic every time we pull up a new bucket.