Barely a day goes by without another warning of impending catastrophe. If it’s not another global financial crisis, nuclear war or global warming, then the next pandemic is surely going to get us and destroy everything we know and hold dear. Until recently, one standard response to such fears was to express faith in the ever-increasing power of technology to get us out of any number of tight spots.

Will technological progress keep us going, helping us cope with the challenges that our over-complex societies seem to generate as a side effect? I’m not so sure anymore. Some of those ageing science fiction fans of the baby boomer generation are getting antsy. “We were promised robots,” as Shorvon Bhattacharya wrote a few years back. Where are those robots now?

We’re blinded by incremental progress in electronic gadgets of marginal utility—new smartphones, larger monitors, and more powerful computers. Yet we drive vehicles with internal combustion engines, our electricity is mostly generated in thermal power plants fueled by coal, we are far from curing solid cancers, and we are slowly losing the race against multi-resistant bacteria. The average sci-fi writer of the ’50s, ’60s and ’70s would be very, very disappointed with the world of 2018. There is no colony on the moon, we don’t have fusion power, and there are no laser guns or sonic disruptors. No robots anywhere. We can’t even cure the common cold.

It is painfully obvious that technological progress between, say, 1968 and now cannot compare with that of the periods between 1818 and 1868, 1868 and 1918, or 1918 and 1968. This will very likely be widely recognized by the 50th anniversary of the first Moon landing in 2019, when lots of people will start to wonder why, 50 years after the greatest triumph of the Apollo programme, we don’t even have the capacity to repeat the technological feats of the late ’60s.

A particularly depressing aspect of this downturn in research productivity was recently analysed by Patrick Collison and Michael Nielsen in the pages of The Atlantic. They show that, despite an enormous increase in research activity over the last 50 years, there has been a marked decrease in return on investment: we are getting less bang for our buck. Major innovation is becoming increasingly scarce in the natural sciences, and most Nobel Prizes in the sciences are handed out for research done decades ago.

New essay from @patrickc and myself, arguing that science has suffered from greatly diminishing returns over the past century: https://t.co/VAr3B86L3S — michael_nielsen (@michael_nielsen) November 16, 2018

Something fundamental seems to have gone wrong. In fact, several issues are combining to slow down research and development. My argument is that the research enterprise is in serious trouble.

* * *

At its core, all civilization is about problem-solving. The more successful a society is at that, the more stable it is likely to be. When faced with new problems, we use methods that worked in the past. Traditionally, our problems were solved by creating new structures: committees, commissions of inquiry, new rules, laws, regulations and centralization. So why do we now commonly seem to make problems worse whenever that approach is used?

In my neck of the woods, the New South Wales public health system is a good example of a complex system beset by multiple problems. A decade ago, the Garland Report demonstrated the utter futility of the red-tape approach to problem-solving. We are investing in methods that generated positive returns in the past but are now useless, or if not useless, potentially harmful. Every year we spend more money on managers rather than on doctors or nurses, and even those doctors and nurses spend more and more time on non-productive tasks such as filling in paperwork. We now need seven or eight signatures to authorize advertising for a part-time admin assistant. Research governance has become a nightmare, and compliance costs are increasing steadily. This puts young colleagues off academic careers and has forced my unit to outsource research to Houston, San Francisco, Pretoria, Santiago de Chile, Prague and Las Palmas.

Investment in complexity that produces negative returns is a sure sign of a complex system that is ‘brittle’—that is, likely to fail catastrophically—as explained by Crawford Holling, one of the founders of the academic discipline of ‘social ecology.’

As if this weren’t bad enough, a number of societal factors clearly reinforce the effects of excessive complexity, and it is difficult not to see them as bound up with other shifts in cultural and moral norms. Western societies have become increasingly ‘feminine,’ with women achieving a much greater share of control over institutions and cultural activities. Naturally, this process has involved trade-offs. The last 200 years have seen a shift from what moral sociologists describe as the ‘honour’ or warrior culture pre-dating the 19th century, to the ‘dignity’ culture of the 20th, to the emergence of a ‘victimhood’ culture in the 21st.

Part of this shift is an increasing aversion to risk. This risk aversion is sometimes crudely described as the ‘Nanny State,’ but at its core it is defined by increasing regulation of all aspects of life, both public and private. Regulation, in principle, is about protecting the weak from the strong and reducing risk in general: the risk of drowning in a backyard pool, of poisoning from household chemicals, or of dying in a crash due to faulty car components. Clearly, many of these incremental steps are positive advances, and we have been making progress of this kind throughout the existence of our civilization. Yet increasing risk aversion also comes at a cost.

In Thinking, Fast and Slow, Daniel Kahneman provides an explanation of this fundamental property of the human mind. Risk aversion is the reluctance to trade any kind of risk for some other advantage. It is strong in most people, and possibly more so among women than among men. Of course, there have always been unusual individuals who approach risk rather than avoid it. People low in risk aversion drove the industrial revolution and have driven technological progress in general. Modern medicine is what it is today because researchers in the past took risks: sometimes for themselves, usually for their patients, and sometimes without telling them. If we were to apply today’s rules to the biomedical breakthroughs of the last 200 years, many of them would be considered unethical and illegal. We may condemn this history from our modern perspective, but that is how we got to where we are today.

If current rules and regulations had existed in the 19th century and the first half of the 20th, we would not have airplanes, air conditioning, antibiotics, automobiles, chlorine, the measles and smallpox vaccines, cardiac catheters, open heart surgery, radio, refrigeration or X-rays. In the past, risk aversion hampered only individuals, and if only one individual in a million was immune (that is, if one in a million did not share this risk aversion), that was enough for progress to occur. Now this risk aversion is firmly entrenched in legislation all over the world, and it is throttling innovation, leading to actions that Kahneman describes as “detrimental to the wealth of individuals, to the soundness of policy, and to the welfare of society.” The bottom line is that risk aversion is fundamentally bad for productivity, and especially for research and development.

At the same time as risk aversion is increasing, discrimination on the basis of ethnicity or gender seems to be becoming more and more acceptable in Western societies. Harvard is fighting a court case brought by Asian students accusing the Ivy League university of systematic discrimination on the basis of race. Melbourne University recently advertised for a Professor of Mathematics, stating that the School would “only consider applications from suitably qualified female candidates.” A senior physicist who presented empirical data on the hypothesis that female academics now tend to be appointed to senior positions with lower publication and citation counts, suggesting systemic anti-male discrimination, was recently suspended from his job at CERN in Switzerland. Clearly, discriminating against individuals for any reason other than their work capabilities is bad for productivity, in business as well as in the research enterprise.

Thirdly, the innovation incubators of our world, the universities and their counterparts in business, are increasingly subject to a societal climate hostile to free speech, viewpoint diversity and open inquiry. Karl Popper, the foremost philosopher of science, once stated that “the growth of knowledge depends entirely upon disagreement,” yet disagreement with a growing number of orthodoxies is becoming increasingly dangerous. The reaction to Professor Alessandro Strumia’s talk at CERN is an example of how the presentation of arcane empirical data can result in what is colloquially known as a ‘shit storm.’ James Damore’s firing from Google is another. Clearly, it is now easy to mortally offend people simply by reporting bibliometric research, something I would have considered absurdly, ludicrously unlikely in the past.

In fact, this trend is now affecting scholarly activity internationally. In his essay ‘The Institutionalisation of Social Justice,’ Uri Harris recently described several cases of activists affecting the publication or continued availability of research papers. Research that upsets feminist, LGBTIQ or ‘Black Lives Matter’ activists is high-risk for academics: witness the backlash to Theodore Hill’s paper on the “Greater Male Variability Hypothesis,” what happened to Lisa Littman’s research on Rapid Onset Gender Dysphoria, or Bruce Gilley’s publication on colonialism. It doesn’t matter how obscure the journal or how technical the paper: self-appointed guardians of morality feel entitled to suppress opinions or research findings they do not agree with. In 2018, social media can actually influence what we can do in our operating theatres. We end up practising medicine that is increasingly ‘Facebook-based’ rather than evidence-based.

Academic Activists Send a Published Paper Down the Memory Hole https://t.co/Xk5n4Yg6SN — Quillette (@Quillette) September 7, 2018

The end result of all these mutually reinforcing trends is a very substantial ongoing reduction in research productivity: less and less bang for more and more buck. And this comes at the very worst possible time for the continuing development of our civilization.

Tyler Cowen, an American economist, has examined technological progress as the main driver of economic growth and identified the increasing scarcity of true innovation as the main cause of ‘the Great Stagnation,’ the slowing of economic growth in developed countries since the early ’70s. He claims that we have picked the ‘low-hanging apples’ of the industrial revolution, and the catch-up growth of developing countries will slow soon enough as well, once they’ve picked all those low-hanging apples such as universal education, mass transportation, and gender equality. Cowen argues that we may have to get used to a prolonged period of slow growth; a time when the cake is simply not growing as much as it did in the past. That fundamental insight explains a lot about what’s been happening in Western societies over the last generation, including increasing conflict over the distribution of income. God only knows what will happen to our societies once people realize that the future may be rather poorer than the past trajectory of our civilization would lead us to expect. In fact, we’re probably seeing the effects of this shift already—identity politics may simply be a manifestation of increasingly aggressive conflict over resource distribution.

The societal factors affecting the research enterprise are becoming ever more damaging at the worst possible moment: just when we have to reach higher and higher to pluck the apples still remaining on Tyler Cowen’s Tree of Technology. We need more research and development, not less, because our societies have become so complex, so brittle, and so unlikely to be able to weather sudden changes in their conditions. Collapse is becoming more and more likely, as Thomas Homer-Dixon has documented in The Upside of Down. Whatever the next crisis turns out to be—global warming, a pandemic, nuclear war, a computer virus or another global financial crisis—we need less regulation, not more, and less complex, more decentralized systems to weather the coming storm. We also need less discrimination, not more. We need more free speech, not activists and activist media telling us what is an acceptable point of view, and what’s sexist, racist, transphobic, imperialist, misogynistic, fatphobic, ableist, or whatever the latest linguistic abomination may be.

At the very worst possible time in the existence of this civilization, at a time when we’re running out of low-hanging apples to pick, we’re drowning ourselves in red tape and a new authoritarian orthodoxy. That’s very bad news, not just for the research enterprise, but for all of humanity.

Hans Peter Dietz is Professor of Obstetrics and Gynaecology at the University of Sydney, Australia.

