While some extreme claims have been made about ancient Indian science, it is better if scientists and journalists do not dismiss all claims as ipso facto absurd. Newtonian science itself was overturned in just over a hundred years.

There has been a lot of uninformed blabbering in the media over the Indian Science Congress sessions recently and the fact that, for once, there were a few papers that looked at ancient science from India. This, naturally, attracted a lot of abuse from the usual suspects, who laughed at what they claimed was utter rubbish: how on earth could ancient Indians know about plastic surgery and flying machines, they asked rhetorically, and assured us this was all poppycock.

But in point of fact, the entire medical establishment even in the West does acknowledge that Sushruta was one of the fathers of surgery, that the medical instruments he invented (or their variants) are still in use today, and that his early techniques for plastic surgery (to repair sliced-off noses, a common punishment for heinous crimes) were remarkably advanced.

As far as flying machines are concerned, it requires a paradigm shift: perhaps not a heavier-than-air machine, but why not a hot-air balloon-type device, as suggested by a friend who’s a professor of aeronautical engineering? Based on their experience in conducting yagnas, the ancients no doubt discovered that hot air rises; it is not beyond the realm of possibility that animal-hide balloons could have been built. Even today, watertight buffalo-skin coracles are used to cross rivers (I had the heart-stopping experience of crossing the Kaveri in one of those at Bheemeswari, Karnataka); maybe a modest hot-air balloon could have been made with these.

Such a craft would not be very navigable; but if the prevailing monsoon winds were going in roughly the right direction, perhaps a Ravana could have drifted down south to Lanka by taking advantage of them. Yes, a little far-fetched, but no more so than a famed sarkari historian claiming on a BBC programme that a certain ‘Saint’ Thomas could have arrived in Kerala. He produced no evidence that Thomas actually did, which is precisely the difference between history and myth.

I was also entertained by another famous sarkari historian claiming that India was a superpower in myth-making. It was ironic considering that he has been one of the principal myth-makers about the Gandhis (both the original Mahatma and the Nehru-dynasty Gandhis). Besides, his core competence is as a historian of cricket: the myths and make-believe surrounding cricket are second to none!

These self-proclaimed historians demonstrate a truism about history: that it is constructed, informed by the prejudices and beliefs of the historian as well as of society at the time. Thus Max Mueller, emboldened by white people’s ability at the time to beat up natives everywhere, grandly proclaimed the Aryan Invasion fantasy, which he himself repudiated later. As Oscar Wilde once said, the only obligation we have to history is to rewrite it; just as Nehruvian historians erased inconvenient facts and individuals from their toxified version of Indian history, it is time for a new class of historians to de-toxify it.

Interestingly, this is true of science as well. Despite our belief that science consists of universal truths that are verifiable through evidence, the very process of creating science is highly culture-specific. Answering a question about whether ancient Indians discovered the so-called Pythagoras Theorem, the Fields Medal winner Manjul Bhargava replied (in “Did India Discover Pythagoras Theorem”, 9 January 2015, rediff.com) that you could answer the question in three ways: the Mesopotamians seem to have had an inkling of the principle, Indians formulated the theorem precisely, and the Chinese provided a formal proof. So who in fact discovered it? It depends on your definition of ‘discover’.

Proof is culture-specific. In the western tradition, a formal proof is a logically derived sequence of steps showing that a hypothesis is correct. In software, however, it may well be more important to demonstrate the practical result of an algorithm than to formally prove it. The formal proof of a program grows exponentially in size and complexity: a page of code may take a hundred pages of mathematics to prove correct, and that is rarely worth the time and effort, especially when systems run to hundreds of millions of lines of code. Does that mean unproven code is worthless? Not really: almost all the code you use daily lacks a formal proof. You take it on faith that it is more or less correct, that it works, and that your smartphone will not suddenly explode.
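The contrast can be made concrete with a toy sketch (my own illustration, not from any particular codebase): instead of formally proving Euclid’s algorithm correct, a programmer gains confidence empirically, by checking its defining properties on thousands of random inputs.

```python
import random

def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

# No formal proof here -- instead, empirical confidence:
# verify the properties that define a greatest common divisor
# on many randomly chosen inputs.
for _ in range(10_000):
    a = random.randint(1, 10**6)
    b = random.randint(1, 10**6)
    g = gcd(a, b)
    assert a % g == 0 and b % g == 0   # g divides both numbers
    assert gcd(a // g, b // g) == 1    # no larger common divisor remains
```

Ten thousand passing checks prove nothing in the mathematician’s sense, yet this is exactly the kind of evidence on which nearly all working software rests.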

The role of intuition in science and discovery is also discounted by many. Srinivasa Ramanujan, one of the greatest mathematicians of all time, credited his ishta-devata, the Goddess of Namakkal, for telling him his extraordinary theorems, which he wrote down without proof: and almost all of them turned out to be true. Saayana’s astonishing number for the speed of light is amazing, and so is the discovery of the benzene ring: these are based on intuitive breakthroughs, one of the elements of sheer genius.

Science too has its dogmas, as much as any West Asian religion does. In fact science suffers from several serious problems, and these can be illustrated by using two disciplines: one, medical science and the other, astrophysics, both crown jewels of the western canon. Let us first look at allopathy, or what is grandly called ‘modern medicine’ (a loaded term, implying that other systems of medicine are bogus, and so I shall avoid using that term here). One problem with allopathy is that while its practitioners claim it is ‘evidence-based’, and they put great store by their ‘double-blind’ systems of experimentation, it fundamentally boils down to statistics and correlation, rather than causation.

This is a serious lacuna: the hypotheses are proven mostly by statistical analysis, rather than by understanding the causal chain of events that take place, and thus predicting from first principles. A natural corollary of this approach is that it becomes necessary to believe in a mechanistic Cartesian frame of reference, wherein, to avoid combinatorial complexity, you have to break a system down into its smallest elements and then use them as your focus of study.
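The gap between correlation and causation is easy to demonstrate. In this toy simulation (an illustration of mine, not anything from the medical literature), two “symptoms” are both driven by a hidden common cause; a statistical analysis finds them strongly correlated even though neither causes the other.

```python
import random

random.seed(0)

# A hidden common cause drives both observed variables.
hidden = [random.gauss(0, 1) for _ in range(10_000)]
x = [h + random.gauss(0, 0.3) for h in hidden]   # "symptom A"
y = [h + random.gauss(0, 0.3) for h in hidden]   # "symptom B"

def corr(u, v):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    su = (sum((a - mu) ** 2 for a in u) / n) ** 0.5
    sv = (sum((b - mv) ** 2 for b in v) / n) ** 0.5
    return cov / (su * sv)

# The correlation is very high, yet A does not cause B, nor B cause A.
print(round(corr(x, y), 2))
```

A purely statistical study of A and B would happily “prove” a link; only a causal model that includes the hidden variable explains what is actually going on.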

Unfortunately, as you delve deeper and deeper into the system, you lose track of the whole. Thus, if you are looking at the microscopic entities alone, you cannot possibly understand the so-called ‘emergent’ behaviour of flocks or groups of ordinary entities. For instance, an individual ant is a boring creature, with limited cognitive skills. But put a colony of ants together, and their emergent behaviour can be amazingly intelligent, but it is not possible to identify where the intelligence comes from: the ‘wisdom of crowds’?

Similarly, allopathy errs in assuming that by delving deeper and deeper, it will be able to finally understand what’s going on. Not true in many cases. Ironically, if you analyse a human being in a Cartesian manner, he/she is just a bag of skin carrying about $83 worth of chemicals. But that hardly explains what life is. Allopathy knows a lot about cells, but it cannot explain the emergent behaviour of a group of cells that we loosely call life. That ‘modern medicine’ cannot explain what life is makes it a self-parody.

This reductionist approach also breeds fads: for instance, fat is bad; then fat is not so bad, but sugar is. At the moment, there is a new fad in allopathy: that the microbiome, the gut bacteria you carry around, is the root cause of diseases as varied as irritable bowel syndrome, allergies, diabetes and Alzheimer’s. In fact, there is a thriving trade now in fecal transplants, so that the ‘good’ bacteria in a healthy person’s gut can be introduced into a sick person’s gut (the genteel way is through surgery, but – not for the squeamish – you can get capsules you swallow which contain the healthy person’s fecal matter). It turns out there’s supposed to be a balance between the ‘good’ and the ‘bad’ bacteria – which sounds awfully like the ‘balance of humors’ in Greek medicine, and the balance of vata, pitta and kapha in Ayurveda.

I think it’s a fundamental problem with deconstruction: every time you delve one level deeper, you lose some knowledge of the interactions among the elements under study. An interesting contrast is computer science. There too, after some experimentation, monolithic, giant pieces of code were found to be unmanageable, and a radically different approach emerged with Unix: a sort of Lego, where tiny tools doing single tasks are put together in arbitrarily complex ways, the result being a variant of emergent behaviour, with the composed whole doing complicated things.

The difference in this ‘constructive’ approach is that there is an architecture, and the interactions and interfaces between the elements are well-defined. I was listening to a talk on KQED Forum by Vikram Chandra, novelist (“Red Earth and Pouring Rain”) and computer programmer, author of the recent “Geek Sublime: Writing Fiction, Coding Software”. He makes the point that there is beauty and elegance in the construction of code; that elegance is part of the emergent property of well-built software. This big picture is what a Cartesian deconstruction misses, which is why doctors should treat the individual rather than the symptom: I prefer the intuition and experience of old doctors to the ‘evidence-based’ data-crunching of young doctors.

At the other extreme comes the world of astrophysics. I was startled to read an article suggesting that, in a way, this most esoteric of the sciences is leading to a new appreciation for religious faith. The way I understood it, there are a few fundamental constants in the universe, and their values are ‘fine-tuned’ so that the universe can actually exist – if they were even a little off from their actual values, the universe as we know it could not exist. For instance, “the ratio of the electromagnetic force-constant to the gravitational force-constant must be… delicately balanced. Increase it by only one part in 10^40 and only small stars can exist; decrease it by the same amount and there will only be large stars”.

The question of why the constants are just so is philosophical, and two possible answers exist. One, that there is a plan, and a planner – this veers dangerously close to theism and to ‘intelligent design’. Two, that there are multiple universes – infinitely many, each with different values for these constants – and ours happens to be one whose values permit our existence. We don’t know yet.

This reminds me of the time a century ago when physicists were convinced that everything worth knowing was already known. Sadly for them, their entire Newtonian edifice came tumbling down in the realms of the super-small (quantum theory) and the super-large (relativity). Thus, what the 19th-century scientists firmly believed has turned out to be a laughing matter a hundred years thence. The certainties of today’s all-conquering science may well turn out to be the laughing-stock a hundred years from today, so a little humility is called for.

There is one more point, based on Arthur C Clarke’s pithy observation that any sufficiently advanced technology is indistinguishable from magic. It is possible that there was advanced technology in ancient days that we simply don’t understand any more. An example of such lost knowledge is the Iron Pillar in Delhi: we know by analysis what it is made of, but nobody quite knows how it was built, and so nobody is able to replicate it. It is similarly known that an advanced nano-carbon steel, wootz, was made by southern Indian blacksmiths, but we don’t know how.

Thus, while it is true that some extreme claims have been made about ancient Indian science by some enthusiastic people, it is better if scientists and journalists do not dismiss all claims as ipso facto absurd. Baby, bathwater, etc. – yes, science can give us important insights, but let not hubris overtake us. Let us be aware of how much we don’t actually know, or else we’ll end up like IBM’s chief Thomas Watson who confidently predicted in 1943 that the world would need a grand total of maybe five computers.