Rethinking the Classic ‘Obedience’ Studies

Stanley Milgram’s 1961 obedience experiments and the 1971 Stanford Prison Experiment are legendary. But new research adds fresh wrinkles to our understanding of allegiance and evil.

They are among the most famous of all psychological studies, and together they paint a dark portrait of human nature. Widely disseminated in the media, they spread the belief that people are prone to blindly follow authority figures—and will quickly become cruel and abusive when placed in positions of power.

It’s hard to overstate the impact of Stanley Milgram’s obedience experiments of 1961, or the Stanford Prison Experiment of 1971. Yet in recent years, the conclusions derived from those studies have been, if not debunked, radically reinterpreted.

A new perspective—one that views human nature in a more nuanced light—is offered by psychologists Alex Haslam of the University of Queensland, Australia, and Stephen Reicher of the University of St. Andrews in Scotland.

In an essay published in the open-access journal PLoS Biology, they argue that people will indeed comply with the questionable demands of authority figures—but only if they strongly identify with those figures and buy into the rightness of the cause they represent.

In other words, we’re not unthinking automatons. Nor are we monsters waiting for permission to unleash our dark sides. However, we are more susceptible to psychological manipulation than we may realize.

In Milgram’s study, members of the general public were placed in the role of “teacher” and told that a “learner” was in a nearby room. Each time the “learner” failed to correctly recall a word as part of a memory experiment, the “teacher” was told to administer an electric shock.

As the “learner” kept making mistakes, the “teacher” was ordered to give him stronger and stronger jolts of electricity. If a participant hesitated, the experimenter—an authority figure wearing a white coat—instructed him to continue.

Remarkably, most people did so: 65 percent of participants continued to give stronger and stronger shocks until the experiment ended with the “learner” apparently unconscious. (The torture was entirely fictional; no actual shocks were administered.)

To a world still reeling from the question of why so many Germans obeyed orders and carried out Nazi atrocities, here was a clear answer: We are predisposed to obey authority figures.

The Stanford Prison Experiment, conducted a decade later, was equally unnerving. Students were randomly assigned to assume the role of either prisoner or guard in a “prison” set up in the university’s psychology department. As Haslam and Reicher note, “such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just six days.”

Lead researcher Philip Zimbardo, who assumed the role of “prison superintendent” with a level of zeal he later found frightening, concluded that brutality was “a natural consequence of being in the uniform of a guard and asserting the power inherent in that role.”

So is all this proof of the “banality of evil,” to use political theorist Hannah Arendt’s memorable phrase? Not really, argue Haslam and Reicher. They point to their own work on the BBC Prison Study, which mimicked the seminal Stanford study.

They found that participants “did not conform automatically to their assigned role” as prisoner or guard. Rather, there was a period of resistance, which ultimately gave way to a “draconian” new hierarchy. Before becoming brutal, the participants needed time to assume their new identities, and internalize their role in the system.

Once they did so, “the hallmark of the tyrannical regime was not conformity, but creative leadership and engaged followership within a group of true believers,” they write. “This analysis mirrors recent conclusions about the Nazi tyranny.”

It also sounds familiar to anyone who has studied the rise of semi-autonomous terror cells in recent decades. Suicide bombers don’t give up their lives out of unthinking obedience to some religious or political figure; rather, they have gradually melded their identity with that of the group they belong to, and the cause it represents.

Similarly, the researchers argue, a close look at Milgram’s study suggests it really isn’t about blind obedience at all. Transcripts of the sessions show the participants were often torn by the instruction to administer stronger shocks. Direct orders to do so were far less effective than entreaties that they needed to continue for the sake of the study.

These reluctant tormentors kept “torturing” in response to appeals that they were doing important scientific work—work that would ultimately benefit mankind. Looked at in this way, it wasn’t some inherent evil or conformism that drove them forward, but rather a misplaced sense of idealism.

This interpretation is still quite unsettling, of course. If people have fully bought into a certain worldview and believe they are acting on the side of right, this conviction “makes them work energetically and creatively to ensure its success,” Haslam and Reicher write.

So in the researchers’ view, the lesson of these two still-important studies isn’t about conformity or even cruelty per se. Rather, they reveal a dangerous two-step process, in which authority figures “advocate oppression of others,” and underlings, due in part to their own psychological makeup and personal histories, “identify with those authorities … who promote vicious acts as virtuous.”

So we may not be inherently evil, but it appears many of us can be enticed into believing that a heinous act is, in fact, good and necessary. Perhaps the real lesson of these startling experiments is the importance of learning how to think critically.

The most effective antidote to evil may be rigorous skepticism.
