In August of 1971, social psychologist Philip Zimbardo performed an infamous experiment at Stanford University, one whose results still send a shudder down the spine because of what they reveal about the dark side of human nature. In The Lucifer Effect: Understanding How Good People Turn Evil (Random House, $27.95), Zimbardo recalls the Stanford Prison Experiment in cinematic detail. We watch as nice, middle-class young men turn sadistic; the experiment is terminated prematurely due to its character-imploding power. These events shaped the rest of Zimbardo’s career, focusing him on the psychology of evil, including violence, torture, and terrorism. In 2004 he served as an expert witness for the defense in one of the Abu Ghraib court-martial hearings. Zimbardo gives a detailed analysis of the events at Abu Ghraib in this new book, drawing on social psychology research, the military’s investigative reports, his own interviews, and hundreds of photos never released to the general public. Like Russian writer Aleksandr Solzhenitsyn, a former prisoner in Stalin’s gulag, he argues that “the line between good and evil is in the center of every human heart.”

In May 2004, we all saw vivid images of young American men and women engaged in unimaginable forms of torture against civilians they were supposed to be guarding. The tormentors and the tormented were captured in an extensive display of digitally documented depravity that the soldiers themselves had made during their violent escapades. The images are of punching, slapping, and kicking detainees; jumping on their feet; forcibly arranging naked, hooded prisoners in piles and pyramids; forcing male prisoners to masturbate or simulate fellatio; dragging a prisoner around with a leash tied to his neck; and using unmuzzled attack dogs to frighten prisoners.

I was shocked, but I was not surprised. The media and the “person in the street” around the globe asked how such evil deeds could be perpetrated by these seven men and women, whom military leaders had labeled as “rogue soldiers” and “a few bad apples.” Instead, I wondered what circumstances in that prison cell block could have tipped the balance and led even good soldiers to do such bad things.

The reason that I was shocked but not surprised by the images and stories of prisoner abuse in the Abu Ghraib “little shop of horrors” was that, three decades earlier, I had witnessed eerily similar scenes as they unfolded in a project that I directed: naked, shackled prisoners with bags over their heads, guards stepping on prisoners’ backs as they did push-ups, guards sexually humiliating prisoners, and prisoners suffering from extreme stress. Some images from my experiment are practically interchangeable with those from Iraq.

Not only had I seen such events, I had been responsible for creating the conditions that allowed such abuses to flourish. As the project’s principal investigator, I designed the experiment that randomly assigned normal, healthy, intelligent college students to enact the roles of either guards or prisoners in a realistically simulated prison setting where they were to live and work for several weeks. My student research associates and I wanted to understand the dynamics operating in the psychology of imprisonment.

How do ordinary people adapt to such an institutional setting? How do the power differentials between guards and prisoners play out in their daily interactions? If you put good people in a bad place, do the people triumph or does the place corrupt them? Would the violence that is endemic to most real prisons be absent in a prison filled with good middle-class boys?

The enduring interest in the Stanford Prison Experiment over many decades comes, I think, from the experiment’s startling revelation of “transformation of character”—of good people suddenly becoming perpetrators of evil as guards or pathologically passive as prisoners in response to situational forces acting on them.

Situational forces mount in power with the introduction of uniforms, costumes, and masks, all disguises of one’s usual appearance that promote anonymity and reduce personal accountability. When people feel anonymous in a situation, as if no one is aware of their true identity (and thus that no one probably cares), they can more easily be induced to behave in antisocial ways.

When all members of a group are in a deindividuated state, their mental functioning changes: they live in an expanded-present moment that makes past and future distant and irrelevant. Feelings dominate reason, and action dominates reflection. The usual cognitive and motivational processes that steer behavior in socially desirable paths no longer guide people. It becomes as easy to make war as to make love, without considering the consequences.

At Abu Ghraib, MP Chip Frederick recalls, “It was clear that there was no accountability.” It became the norm for guards to stop wearing their full military uniforms while on duty. All around them, most visitors and the civilian interrogators came and went unnamed. No one in charge was readily identifiable, and the seemingly endless mass of prisoners, wearing orange jumpsuits or totally naked, were also indistinguishable from one another. It was as extreme a setting for creating deindividuation as I can imagine.

Dehumanization of prisoners occurred by virtue of their sheer numbers, enforced nakedness, and uniform appearance, as well as by the guards’ inability to understand their language. One night-shift MP, Ken Davis, later reported how dehumanization had been bred into their thinking: “As soon as we’d have prisoners come in, sandbags instantly on their head. They would flexi-cuff ’em; throw ’em down to the ground; some would be stripped. It was told to all of us, they’re nothing but dogs. . . . You start looking at these people as less than human, and you start doing things to ’em that you would never dream of.”

The Stanford Prison Experiment relied on deindividuating silver reflecting sunglasses for the guards, along with standard military-style uniforms. The power the guards assumed each time they donned these uniforms was matched by the powerlessness the prisoners felt in their wrinkled smocks. Obviously, Abu Ghraib Prison was a far more lethal environment than our relatively benign prison at Stanford. However, in both cases, the worst abuses occurred during the night shift, when guards felt that the authorities noticed them least. It is reminiscent of Golding’s Lord of the Flies, where supervising grown-ups were absent as the masked marauders created havoc.

We want to believe in the essential, unchanging goodness of people, in their power to resist external pressures. The Stanford Prison Experiment is a clarion call to abandon simplistic notions of the Good Self dominating Bad Situations. We are best able to avoid, challenge, and change negative situational forces only by recognizing their potential to “infect us” as they have others who were similarly situated. This lesson should have been taught repeatedly by the behavioral transformation of Nazi concentration camp guards, and by the genocide and atrocities committed in Bosnia, Kosovo, Rwanda, Burundi, and Sudan’s Darfur region.

Any deed that any human being has ever committed, however horrible, is possible for any of us—under the right circumstances. That knowledge does not excuse evil; it democratizes it, sharing its blame among ordinary actors rather than declaring it the province of deviants and despots—of Them but not Us. The primary lesson of the Stanford Prison Experiment is that situations can lead us to behave in ways we would not, could not, predict possible in advance.