If you were a reporter instructed by your editor to hack into a grieving parent's phone, would you do it? If you were a Syrian soldier ordered to fire on unarmed protesters, would you obey? What if you were asked by a white-coated scientist to deliver lethal electric shocks as part of an experiment?

Your answer to all of these questions will undoubtedly be "no" – or at least, "I hope not". Certainly when Stanley Milgram put the last question to 110 Americans — psychiatrists, students and middle-class adults — all of them insisted that they would defy anyone in authority who asked them to do such a terrible thing.

But Milgram was not satisfied with this answer. Fifty years ago, in what are still being celebrated as the most famous psychology studies of all time, he examined what would happen when people were confronted with this scenario in real life.

The volunteers were asked to play the role of "teacher" in a learning experiment. Would they go along with an experimenter's instructions and deliver increasingly harsh electric shocks, up to 450 volts, when the "learner" made a mistake?

Of course, the shocks were not real and the "learner" was an actor employed and carefully schooled by Milgram to play his part. But the participants didn't know that.

As the results of his first pilot studies came in, Milgram was astonished to find that participants regularly followed the experimenter all the way to the bitter end. At first he dismissed the results, thinking that perhaps they were peculiar to the Yale students who had taken part. But when he repeated the studies with a cross-section of American adults he obtained similar results.

In the so-called baseline condition, where the teacher is in a different room to the learner and only hears his reactions through the wall, 65% are fully obedient. Milgram had identified a phenomenon that would shock the world.

Milgram's findings suggest that our instinctive answer to the question of whether we would obey the destructive orders of an authority is wrong. They seemed to show that evil acts are not the preserve of a few psychopaths among us. Instead, in the wrong circumstances, any of us is capable of inflicting terrible harm on our fellow human beings.

As a Jew, Milgram had the Holocaust clearly in mind, but his argument might equally be applied to events that were unfolding even as he developed his ideas, such as the My Lai massacre during the Vietnam War.

But why? What makes – or rather allows – ordinary people to do such extraordinary harm? Milgram himself proposed that this occurs when people (like the participants in his studies) enter an "agentic state" in which they focus entirely on how well they fulfil their obligations to authority. Here they become so fixated on obeying orders that they are blind to the harm that they are doing.

Our recent work suggests a very different explanation.

Even Milgram's most fervent admirers are sceptical of his explanation. For one thing, the "baseline study", though the most famous, was only one of more than 20 variants that Milgram conducted. And across these studies the percentage of people who went all the way to 450 volts varied from 0% to almost 100%. The idea that people enter an agentic state – and hence obey orders – whenever they are confronted with authority cannot explain this variation.

For another thing, if you examine exactly what the experimenter said to urge participants to continue administering shocks, it becomes apparent that the more he issued orders ("you have no other choice, you must go on" as opposed to giving justifications based on the scientific value of the research) the less likely it was that people would carry on administering the "shocks".

Contrary to all the received wisdom, it seems that whatever was going on in these so-called obedience studies, people were not blindly following orders.

The current consensus is that Milgram identified a phenomenon of supreme importance, which half a century later – and in the light of massacres in Bosnia, Rwanda, the Sudan, Syria and Libya – is sadly as relevant as ever.

But this phenomenon still remains to be properly explained. This impasse is unsurprising when one considers that the agentic state approach distorts the essential character of the "obedience" paradigm. That is, the approach places a one-sided focus on the relationship between participant and experimenter. But the power and tension of the paradigm lies in the fact that the participant is caught between two different voices: the experimenter urging "go on", and the learner appealing "stop, let me out of here". The question, then, is which voice will the participant be led by?

If we reconceptualise the obedience paradigm in terms of leadership, however, we can draw on recent advances that suggest we follow leaders because we see them as representative of an identity that we share. In the Milgram paradigm the critical question is therefore whether participants identify with the experimenter as an authority who represents a scientific endeavour in which both are involved, or whether they identify with the learner as a fellow member of the general public.

This approach is supported by a recent re-analysis we have conducted, which we will present at a British Psychological Society conference in Cambridge next week. It shows that the proportion of people who go on to 450 volts in the different variants of Milgram's paradigm is extremely well predicted by the degree to which participants identify with the science and the scientist as opposed to the public and the learner.

When a given variant encourages participants to identify with the experimenter but not the learner (for example, because the latter is in another room) obedience is high; but when the variant promotes identification with the learner rather than the experimenter (for example, because the experimenter is not a scientist, or if two experimenters argue) obedience is very low.

From this perspective, people do not deliver electric shocks because they are ignorant of the effects but because they believe in the nobility of the scientific enterprise. For now, this can be no more than a provisional conclusion. But it points to a possibility that is even more disturbing than Milgram's original account.

People don't inflict harm because they are unaware of doing wrong but because they believe what they are doing is right. We should be wary not of zombies, but of zealous followers of an ignoble cause.

Stephen Reicher is professor of psychology at the University of St Andrews; Alex Haslam is professor of psychology at the University of Exeter. They have edited a special issue of The Psychologist to mark the 50th anniversary of Milgram's "obedience" studies. Their book The New Psychology of Leadership (with Michael Platow) is published by Psychology Press.