Meet Julie and Mark, two siblings who are vacationing together in France. One night after dinner and a few bottles of wine, they decide to have sex. Julie is on the pill and Mark uses a condom so there is virtually no chance that Julie will become pregnant. They enjoy it very much but decide to never tell anyone or do it again. In the end, having sex brought them together and they are closer than ever.

Did Julie and Mark do anything wrong?

If incest isn’t your thing, your gut reaction is probably yes – what Julie and Mark did is wrong. But the point of Julie and Mark’s story, which was created by University of Virginia professor of social psychology Jonathan Haidt, is to illustrate how easy it is to feel that something is wrong and how difficult it is to justify why it is wrong. This is what happens when Haidt tells the Julie and Mark story to his undergrads. Some say that incest causes birth defects, or that Julie and Mark will cause pain and awkwardness to friends and family, but birth control and secrecy ensured that none of these problems would occur. Students who press the issue eventually run out of reasons and fall back on the notion of it “just being wrong.” Haidt’s point is that “the emotional brain generates the verdict. It determines what is right and what is wrong… The rational brain, on the other hand, explains the verdict. It provides reason, but those reasons all come after the fact.”

So the question is: when it comes to our moral sentiments and deliberations, what system is in charge, the rational one or the emotional one?

The reason-emotion debate runs throughout the field of moral psychology. On one hand, cognitive science clearly shows that emotion is essential to our rationality; on the other hand, psychologists argue over whether reason really is the “slave of the passions,” as David Hume suggested. Haidt tends to take the latter position (and this is what the incest debate illustrates), but psychologists such as Paul Bloom and Steven Pinker believe that reason can persuade our emotions; this, they argue, is why we have moral progress.

Neuroscience is weighing in too. It demonstrates that we use different parts of the brain when we think deliberately versus when we go with our guts. As one author explains, “subjects who choose [rationally] rely on the regions of the brain known as the dorsolateral prefrontal cortex and the posterior parietal cortex, which are known to be important for deliberative reasoning. On the other hand, people who decide [with their guts] rely more on regions of the limbic cortex, which are more closely tied to emotion.”

So which system sets the agenda, the intuitive one or the rational one? Should I go with my gut as Gladwell advertises? Or would that lead me into predictably irrational mistakes as Ariely warns? Should I listen to my unconscious as Gerd Gigerenzer and Timothy Wilson suggest? Or, as the Invisible Gorilla folks advise, should I take note of how intuitions deceive us? And finally, will we ever know if anything is objectively wrong with incest?

Moral psychology is young, as are the relevant neuroscience and evolutionary psychology studies, so I hesitate to draw any conclusions here. So what about more general moral feelings? Are they nature, nurture, or somewhere in between? Thanks to several recent studies we now have some answers.

One experiment, which I briefly mentioned a couple of months ago, comes from Paul Bloom, Kiley Hamlin and Karen Wynn. Bloom summarizes it as follows:

In one of our first studies of moral evaluation, we decided… to use… a three-dimensional display in which real geometrical objects, manipulated like puppets, acted out the helping/hindering situations: a yellow square would help the circle up the hill; a red triangle would push it down. After showing the babies the scene, the experimenter placed the helper and the hinderer on a tray and brought them to the child. In this instance, we opted to record… which character they reached for, on the theory that what a baby reaches for is a reliable indicator of what a baby wants. In the end, we found that 6- and 10-month-old infants overwhelmingly preferred the helpful individual to the hindering individual.

Does this mean that we are born with a moral code? No, but it does suggest that we have a sense of compassion and favor those who are altruistic from very early on.

Another experiment comes from Marco Schmidt and Jessica Sommerville. Schmidt and Sommerville showed 15-month-old babies two videos, one in which an experimenter distributes an equal share of crackers to two recipients and another in which the experimenter distributes an unequal share of crackers (she also did the same procedure with milk). Then they measured how long the babies looked at the crackers and milk while they were distributed. According to the “violation of expectancy” paradigm, babies pay more attention to something when it surprises them. This is exactly what they found: babies spent more time looking when one recipient got more food than the other.

What does this suggest? According to the researchers, “the infants expected an equal and fair distribution of food, and they were surprised to see one person given more crackers or milk than the other.” This doesn’t mean that the babies felt something was morally wrong, but it does mean that they noticed something wasn’t equal or fair.

Schmidt and Sommerville followed up the experiment with another. In the second, they offered the babies two toys, a LEGO block and a LEGO doll. They labeled whichever toy each baby chose as its preferred toy. Then an experimenter asked the baby to hand over the preferred toy. They found that about one-third of the babies gave away their preferred toy, another third gave away the toy that wasn’t preferred, and the last third didn’t share at all. They also found that 92 percent of the babies who shared their preferred toy had spent considerably more time looking when the food was unequally distributed, while 86 percent of the babies who shared their less-preferred toy were more surprised when there was an equal distribution of food. In other words, the altruistic sharers (those who gave the preferred toy away) noticed more when the crackers and milk weren’t distributed equally, while the selfish sharers (those who gave the less-preferred toy away) showed the opposite pattern.

Taken together, Bloom’s and Schmidt and Sommerville’s work supports the idea that our moral instincts form early on. But these two studies are just a tiny sampling. It is still difficult to say with certainty whether we are born with a moral instinct, and just as difficult to say what that moral instinct entails.

Back to incest.

To be sure, evolutionary psychology easily explains why we morally reject incest – obviously, reproducing with our siblings would be counterproductive – but many other topics, such as why we act altruistically, why we show compassion towards strangers, and why we give to charity, remain fairly mysterious. Fortunately, moral psychology is making great progress. It is an exciting new field and I look forward to more findings like the ones outlined here. In addition, I hope that one day in the near future psychologists will come to a consensus regarding the emotion-reason debate.

Schmidt, M., & Sommerville, J. (2011). Fairness expectations and altruistic sharing in 15-month-old human infants. PLoS ONE, 6(10). DOI: 10.1371/journal.pone.0023223