Another example was inspired by the discovery of “mirror neurons”. While most cells in the touch map in your brain respond only to your own skin being poked, about 10 per cent will respond to your friend's skin being poked – so long as you are watching. Your brain uses the visual input to create a virtual-reality simulation of your friend’s mind. Though you don’t actually feel the poke, you empathise with him. But if your hand is removed you actually feel his pain (this doesn't happen when your hand is intact, because signals from it veto the mirror neuron output). So the only thing separating my mind from my friend’s is my skin: remove it and I start to feel his sensations, dissolving the barrier between me and others. If you have a phantom arm and watch your friend's hand being massaged, you will feel the massage in your phantom hand. And guess what? This phantom massage relieves phantom pain and is starting to be exploited clinically.

The broad implication of all our discoveries put together is that the brain works very differently from a computer; it is not a serial, hierarchical, bucket-brigade-like machine – it has multiple autonomous hard-wired modules. In my view, the brain is an extraordinarily malleable (plastic) organ, with its connections – both within a module and across modules – being constantly altered in response to ever-changing environmental challenges. Not only are the modules in a state of dynamic equilibrium with the environment but also with each other (mirrors reducing pain), with the skin and bones (as in reflex sympathetic dystrophy) and indeed with other brains – via mirror neurons. The brain resembles a termite mound or coral reef colony more than a computer. Daniel Dennett has made the same termite mound analogy.

This type of "functional" paralysis and excruciating pain is also sometimes seen following stroke, and the mirror procedure is especially effective there.

Another very interesting mystery you study is synesthesia. You have stated that the overlap between the halos of associations surrounding two words is the basis of metaphor; it exists in all of us but is larger and stronger in synesthetes as a result of a cross-activation gene. Is the ability of a poet to look at the world and discover metaphors then another form of synesthesia of a ‘higher’ kind? How has the study of synesthesia helped you in your approach to the neuro-biological dimensions of literary aesthetics?

About 2 per cent of people see ordinary numbers printed in black as being tinged strongly with colours – eg 3 might be red, 5 yellow, 7 indigo – and the pairings differ between synesthetes but remain stable throughout life. The condition is inherited. We – and a few other groups – showed for the first time in a century that the effect is genuine and may involve 'leakage' or cross-activation between brain centres involved in colour and those involved in numbers, which are right next to each other.

We suggested that in the foetus all brain modules are ‘hyper-connected’ with each other; ‘pruning’ genes and inhibitory transmitters ordinarily remove these excess connections, leaving behind the segregated modules that characterise the adult brain. But if the gene mutates, the result is that modules that normally don’t talk to each other begin to do so. For example, the number module is right next to the colour module V4 – so numbers stimulate number-detecting neurons, which cross-activate colour-detecting neurons, and numbers look coloured. If the same defective pruning or defective inhibition genes are expressed more widely throughout the brain – and if high-level concepts are also represented in far-flung brain regions – the result would be a tendency to link seemingly unrelated concepts – what we call ‘metaphor'. And that explains why synesthesia is seven times more common in poets, artists and creative scientists than in the general population. The hidden agenda of the otherwise useless synesthesia gene is to make some people creative. This smacks of group selection – but that is not what I mean.

And that brings me to your famous eight laws of aesthetics. Can you please speak about them in more detail?

Well, the question is whether, in spite of the staggering diversity of artistic styles in the world, there are some universal principles or 'laws' of aesthetics. Sitting in a temple precinct I came up with eight or nine. A simple example is our aesthetic preference for symmetry – whether you are a child playing with a kaleidoscope or a great Mughal emperor building a mausoleum to immortalise his wife. This evolved because in the natural world symmetrical things are usually living – prey, predator or mate – and symmetry serves as an ‘early alert system’ that makes you orient and pay attention to something biologically relevant. Obviously, paying attention is the bare minimum requirement for art, though hardly sufficient.

A less obvious example is ‘peak shift'. Suppose you teach a bird to choose a rectangle, but not a square, to get a food reward. Then, oddly, if you give the bird a choice between the rectangle it was taught and a longer, skinnier one, it picks the latter! This isn't stupid – the bird's brain has learned to go not for a particular rectangle but for ‘rectangularity’ as a rule. It is a long story, but a Chola artist uses this principle; he mathematically subtracts the average male shape from the average female shape and amplifies the difference to create a ‘super woman' – the epitome of feminine perfection. There is of course more to perfection than figure – however alluring. Through subtle exaggerations of posture (tribhanga) and mudras or gestures, the artist conveys such ineffable qualities as poise, grace and charm – all that makes a woman special. Or if you want a caricature of Modi, you take the average of a hundred male faces, subtract that from Modi's and amplify the difference.
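The caricature recipe just described – take an average, subtract it from the target, amplify the difference – is simple arithmetic. Here is a minimal sketch; the feature names and numbers are hypothetical, chosen purely to illustrate the computation:

```python
# Peak-shift caricature sketch: exaggerate a target's deviation from the mean.
# The feature vectors are made up (e.g. [jaw width, nose length, brow height]).

def caricature(target, mean, k):
    """Return mean + k * (target - mean); k > 1 exaggerates every deviation."""
    return [m + k * (t - m) for t, m in zip(target, mean)]

average_face = [10.0, 5.0, 3.0]   # average of many faces (invented numbers)
target_face  = [12.0, 4.0, 3.5]   # the face to caricature

# k = 1 reproduces the target exactly; k = 2 doubles every deviation.
print(caricature(target_face, average_face, 1))  # [12.0, 4.0, 3.5]
print(caricature(target_face, average_face, 2))  # [14.0, 3.0, 4.0]
```

The same formula with k slightly above 1 gives the Chola sculptor's subtle idealisation; a cartoonist's caricature simply uses a larger k.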

For abstract art, I invoke the related "gull chick principle". A seagull chick finds and pecks at its mother’s beak to make her regurgitate half-digested food into its gaping mouth. “Beak is mom”. The beak is a long yellow shape with a red spot near the end. The ethologist Nikolaas Tinbergen found that an oblong stick with a red spot at one end will fool the chick’s brain – it is just as effective – because the neurons' requirements are not precise; a rusty key will still open a lock. But the remarkable discovery was that a long thin stick with three red stripes excites the chicks even more than a real beak. Because of the way the neurons are wired up, this odd object hyperactivates them even though it doesn't resemble a beak. So if gulls had an art gallery they would hang up the long stick, worship it and bid millions at auctions without understanding why, because it doesn't resemble anything. And that is exactly what happens when a human art collector or curator encounters a great work of art – a Picasso or a Chola bronze – that titillates the brain more optimally than real objects do.

Last year was the 1,000th anniversary of the Indian aesthetician Abhinava Gupta. The Indian literary tradition does not have the concept of a ‘critic’ but it calls the cultivated reader Sahrdaya, meaning 'one of the same heart'. Abhinava Gupta defines the Sahrdaya as “one whose mirror of the heart (manamukura) is cleansed of impurities and has developed the ability to become one with the poet." Does this resonate with modern theories of neuroaesthetics? With your eight principles, have you opened a new window for us to carry forward the tradition of Abhinava Gupta into forging what C P Snow might call a third culture?

Yes, he started it all. There is, of course, Sage Bharata’s Natya Shastra from the 3rd century BCE. But let me add another key insight. It concerns the difference between what is called high art and kitsch (for example, the art that hangs in hotel lobbies and shopping plazas). I’d argue the distinction is neither cultural nor based on democratic vote; in fact, more people like kitsch until they have been exposed often enough to the real deal. But if it is not a vote, what is it that makes non-kitsch “superior”? My criterion is the fact that you can graduate from kitsch and move forward to high art, but you can't slide backward from high art to kitsch once you have tasted it. To me this provides clues to unravel the mysteries of art. I shall argue that no matter how many laws of aesthetics you discover and however far our understanding of art advances, unless you can objectively specify the difference between kitsch and non-kitsch, you haven't really understood art.

Are the neural correlates of art experience and religious experience interrelated? Are our religious experiences rooted in our evolutionary biology?

This hasn't been studied, but is probably true. Subtle nuances of emotion can be evoked by visual art as well as by music. The Darbari Kanada scale probably evolved from "peak shifts" and the gull chick principle applied to the separation cries of pleading infants. The angst of separation from the parent morphed – through the gull chick effect – into the existential angst of separation from God ("Oh, why have you brought me into this world – this vale of tears – and left me alone?") and, in the descent, "But I know all will end well". Or compare Abheri (world sorrow) with Bhairavi (plaintive personal sadness) and Shubhapantuvarali (penance). There may have been a transitional, intonational, right-hemisphere ‘language’ without syntax for communicating subtleties of emotion. A public-school-educated Englishman can say “really?” in 11 ways. Or the same sentence, “John pushed past Susan to look at the plane through the window", can convey at least five different nuances of meaning depending on which word you stress – eg “JOHN pushed past Susan to look at the plane” versus “John pushed past SUSAN to look at the plane”, etc.

How is it that what you do is not a form of neurobiological reductionism?

Explaining something in terms of interactions of constituent parts doesn't explain it away. If I put electrodes in your lover's brain while you make love and show that her nucleus accumbens and septum are active – that doesn't make her sensation any less "real"; on the contrary it is a proof she isn't faking it.

It has been said you prefer very simple techniques to fancy, expensive technology. Is there a reason?

Poverty forces you to be "ingenious" and resourceful early in your career; besides, the history of science teaches us the importance of simplicity. The minute you start using fancy technology, there are so many steps from the raw data to the conclusion that there is plenty of scope for unintended massaging of the data. Methodology is important, but your research should be concept-driven, not methodology-driven. Lastly, using sophisticated techniques (especially if computers are involved) lulls you into a false sense of having done something “scientific". The use of hi-tech is – to quote Peter Medawar – unfortunately seen as a sign of intellectual manhood.

The word meme was first introduced by Dr Richard Dawkins in 1976. It soon spawned a related, controversial field, ‘memetics’, which has since been abandoned by serious social scientists. Now, however, with the Internet, memes have become a household word.

The meme is no more than a pun – it rhymes with gene. But it is fundamentally different. As Mendel showed, genes are inherited in a quantised manner (independent assortment) and they don't undergo change as they spread. Memes are passed on in a Lamarckian way – through imitation and pedagogy – both horizontally (to peers) and vertically (down generations). The spread of genes is lawful and takes several generations; the spread of memes isn’t. The polar bear took a thousand generations to evolve a fur coat by natural selection of genes; a human can watch his mother slay and skin a polar bear and imitate the behaviour, creating a meme whose avalanche-like propagation takes just a single generation. But its spread is unlawful, chaotic and unpredictable – there are no precise laws of sociology like the laws of Mendelian genetics.
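The difference in tempo described above can be caricatured with a toy model. All the parameters below are invented, purely to contrast a gene's generation-by-generation rise with a meme's avalanche-like spread within a single generation:

```python
# Toy contrast (invented parameters): a gene's frequency rises a little per
# generation under selection, while a meme can double its adopters per round
# of imitation within one generation.

def gene_spread(freq, s, generations):
    """Simple logistic-style allele-frequency change with selection strength s."""
    for _ in range(generations):
        freq = freq + s * freq * (1 - freq)
    return freq

def meme_spread(adopters, population, rounds):
    """Each adopter recruits one imitator per round, capped at the population."""
    for _ in range(rounds):
        adopters = min(2 * adopters, population)
    return adopters

# A weakly selected gene barely moves in 10 generations...
print(round(gene_spread(0.01, 0.05, 10), 4))
# ...while a meme starting with one adopter saturates 1,000 people in 10 rounds.
print(meme_spread(1, 1000, 10))  # 1000
```

The gene model is at least lawful and smooth; real meme spread, as noted above, is chaotic and unpredictable, which is precisely what this deterministic sketch cannot capture.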

From that insightful diversion into technology, let us return to your research now. You talked about your new area of research interest – Calendar agnosia. Can you tell our readers what it is?

Francis Galton in the nineteenth century asked hundreds of people to imagine or visualise the annual calendar in front of them. Most of us conjure up a vague, fuzzy, rectangular grid parallel to the face. But about one in 50 people has an inherited propensity to literally see (hallucinate) a crystal-clear calendar with a strange shape unique to him – for example, a giant L-shape, or a hula-hoop circling the chest with December on the left and July on the right.

The months are clearly marked clockwise in right-handers and anti-clockwise in left-handers. No one believed such a syndrome existed or knew what caused it. My group – including my graduate students David Brang, Chaipat Chunharas, Zeve Marcus and Ed Hubbard – revived interest in it for the first time in over 100 years (as did Seana Coulson, Stan Dehaene and Brian Butterworth). People had struggled for decades to find out whether the subject literally sees the calendar visually or merely sees it metaphorically in his mind’s eye – as when you visualise a vague image of your butler while reading this. Various brain-imaging studies were done with no clear result. We simply asked the subject to visualise the calendar and rattle off alternate month names backward – for example, October, August, June, etc. Whereas normal people take 40 seconds, these people visualise the calendar and simply read off the months backward, taking just 15 seconds.
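The task given to the subjects is easy to specify precisely. A minimal sketch, assuming the sequence starts at October as in the example above (the starting month is otherwise arbitrary):

```python
# Generate the "alternate months backward" sequence used in the test.
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def alternate_backward(start="October"):
    """Every second month, counting backward from `start` (inclusive)."""
    i = MONTHS.index(start)
    return MONTHS[i::-2]

print(alternate_backward())  # ['October', 'August', 'June', 'April', 'February']
```

A normal subject has to compute this sequence item by item; the calendar synesthete, on this account, simply reads it off a hallucinated display – hence the threefold difference in speed.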

What had remained unsolved for decades was shown in 30 minutes. But we need to see additional subjects before we can be sure. There is a small chance that the result was a statistical fluke – there is variability across subjects in their ability to recite backwards, and our two calendar synesthetes may have been very good at it (compared with eight normals) simply by chance.

We take our mental calendar for granted but unconsciously use it to navigate through life, planning for the future while being anchored in the present. The left angular gyrus is a tiny structure involved in a disproportionate number of uniquely human skills – arithmetic, naming fingers, knowing right from left, reading, writing, etc. It occurred to me that the idea of a sequence – even a sequence in time – is mapped onto the brain spatially, because the brain never had time to evolve a look-up table for numbers and sequences and finds it convenient to represent time as space. I was struck by the fact that a band of fibres called the ILF (inferior longitudinal fasciculus) connects the angular gyrus with another brain structure called the hippocampus – the brain's GPS system, which contains place neurons and grid neurons that signal location in space and time. So we proposed that the angular gyrus–ILF–hippocampus system is the brain's own calendar. We predicted that damage to the angular gyrus or ILF should lead to a new syndrome or sign, which we call "calendar agnosia" – difficulty with sequencing in general but especially with calendars. Recently, we found that dyslexic children with minor dysfunction in the angular gyrus did indeed show calendar agnosia!

Here we view time as a cycle or spiral, whereas the Western perception of time is linear. Are there cultural elements in the brain's construction of the calendar?

Of course; it is crisp and well-defined in Germans and vague in Italians. Another point: when one of our subjects turned her head rightward, the hula-hoop calendar remained where it was, in front of her chest. But the left side of the calendar became fuzzy and indistinct, and memories from those months became equally fuzzy; access to the memories was being blocked by the direction of gaze. It was as though she were looking at a real physical calendar dangling in front of her – a striking example of what is called ‘embodied cognition’.

If you had not become a neuroscientist what would you have become?

An archeologist – though I undoubtedly have an over-romanticised version in mind.

So have you ever felt like an archeologist or paleontologist when you delve deep into the mysteries of the brain?

Yes absolutely! The brain is a palimpsest of its evolutionary history – it is full of fossil treasures we can excavate just as an archeologist excavates physical evidence – pottery and weapons – stratified in archeological digs.

So talking about your interest in archeology: despite its rich archeological sites, we find in India an utter disregard for taking care of its cultural legacy in a scientific way. What do you think should be the remedy?

Schools should teach our children the romance and excitement of Indian archeology; I bet many more of them have heard of Troy – or, worse yet, the six wives of Henry the Eighth (thanks to British colonial rule and Macaulay’s agenda) – than have heard that S R Rao has actually found the ancient, supposedly mythical city of Dwaraka over which Krishna ruled. Unfortunately, because of lack of funding and its potential for misuse by the religious right, few people in India know about it or about the tremendous potential for new excavations. But I hesitate to say this; I might be accused of going saffron. Yet we must not confuse taking pride in the antiquity of our civilisation with religious fundamentalism. After all, here in the West biblical archeology is a flourishing field – it is neither allied to nor negates Christian fundamentalism.

There is no mystery city buried off the shore from Mahabalipuram, although the tsunami exposed two new temples – glimpses of which in ancient times might have given rise to speculations about ‘seven pagodas’. But even though there are probably no underwater temples, some serious marine archeology needs to be done.

How about an example?

There have been preliminary underwater explorations off Poompuhar hinting at the existence of an ancient ‘city’, but again it was dismissed as myth – as Troy once was.

Unfortunately, we have to put up with extremes. On the one hand we have Anglophile Indophobes, who don't see any possibility of elements of our epics being true. At the other extreme, many mystically inclined people unquestioningly accept the mythology of India; they even speak of our ancestors flying luxury planes and using weapons like atom bombs. The real debates – for example, whether Dwaraka and Poompuhar existed – can only be resolved by more excavation; we need private entrepreneurs to step in with funding. The same holds for the bridge between our southern coast and Sri Lanka – the topic is so politically charged that serious scholars avoid it. But the fact that pumice exists in mines nearby suggested to me that the myth might be true – pumice floats on water and could in theory be used to create a floating bridge. So the jury is out. But let me add that I know nothing about Indian geology and could be wrong.

Very respected scholars like B B Lal and Badrinarayanan (directors general of archeology and geology in India, respectively) have endorsed the idea – cautiously. My worry, though, is the general lack of interest and curiosity about such matters in India. When I mention it to prominent people during visits to India, I usually elicit a bemused chuckle of embarrassed amusement rather than wonder and a passion to know the truth. (This is in marked contrast to their curiosity about a former chief minister's private life.)

Perhaps a better example is Kishkinda – the mythical birthplace of Hanuman – which is near Hampi. Stories about Hanuman, Vali and Sugreeva abound in Hampi, and these could perhaps be matched systematically against the locations and distances described in the legend. If they emerged as local myths and fairy tales from far-flung locations, there would be no incentive for the different accounts to be mutually consistent except in broad outline. One could also see whether the different versions of the stories as told in far-flung hamlets tally – for example, does the distance between, say, Chitrakoot and Kishkinda remain constant? Sri Lanka, too, has been inadequately explored despite the many references to it in the Ramayana.

Returning to your career, most people associate you with neurology, but the first decade of your research was on human vision, was it not? Francis Crick of DNA fame once described your experiments as both simple and ingenious. And Nobel laureate, David Hubel, has praised your research, calling it “bold, irreverent, original and ingenious – people who are not specialists will be impressed, but so will people who, like me, have spent a lifetime studying the brain”. So why did you switch fields?

Because vision research was becoming overcrowded – it's as simple as that.

Has any of your research been deemed controversial?

Does Einstein’s ‘debate’ with Bohr lead us to conclude that their work was controversial? Any scientist with a career spanning many decades is almost certain to have done some things that at least one or two colleagues regard as unproven. In my own arena, people like Noam Chomsky are sometimes regarded as controversial, though no one doubts the solidity of their early work or their genius. Even the theories of Nobel laureates like Francis Crick are sometimes criticised (eg his theory of consciousness). And let us not forget the critics of HIV causing AIDS or of global warming – the latter is deemed controversial by our new president.

The general rule is that if you have made some solid contributions and established your stature and credibility, people are more forgiving of occasional speculative forays. My work on stereopsis, motion perception, shading, blind spots, synesthesia, phantom limbs, etc, has stood the test of time and partially influenced the development of these fields. However, my speculations on the role of the mirror neuron system in human evolution and autism remain unproven. We also need to bear in mind that sometimes an idea is useful for a while because it stimulates new inquiry, even if it turns out to be wrong in detail. As Sherlock Holmes said to Watson, “I pay careful attention to your writings, my dear Watson, because seeing the many flaws in your arguments helps me discover the true solution".

Such might be the case with some new work I have been doing with my former grad student Laura Case. We have been studying an extraordinary condition we call AGI, or alternating gender identity. A person with this condition might be anatomically female but mentally switch her internal identity to male on a weekly basis, experiencing phantom body parts and a desire for cross-dressing. We have done studies showing that they are not merely ‘role playing’ and that the switch might involve a shift from one hemisphere of the brain to the other. Even in all of us, sexuality may fluctuate instead of being fixed; there is an Ardha Nareeswara in all of us. But additional work involving imaging and brain stimulation is needed – at present the evidence is suggestive but not compelling. We postulate that zapping one hemisphere with a paralysing jolt of TMS (transcranial magnetic stimulation) will cause a trans person to switch sex.