In October 1963, the Journal of Abnormal and Social Psychology published an article, blandly titled “Behavioral Study of Obedience,” by a 30-year-old Yale professor named Stanley Milgram. The young author had never before published in an academic journal, and it was clear from his prose that he was hoping to make an early splash. He had conducted an experiment that he claimed shed light on one of humanity’s basic features: our tendency to obey orders, even ones that conflict with our morals, so long as they are issued by an authority figure. By his fourth sentence he was already referencing Nazi death camps and their “daily quotas of corpses,” implying that the Holocaust was something his nine-page paper would help the world understand.
Almost five decades later, the “Milgram experiments”—with their famous simulations of powerful electric shocks—are among the best-known studies of the 20th century. No introductory psych course or textbook can get away with skipping them. The experiments have inspired plays, a William Shatner movie, episodes of reality TV, a storyline on Law & Order, special issues of academic journals, at least one novel, and, by my count, two pop songs. The lyrics to Peter Gabriel’s 1986 track “We Do What We’re Told (Milgram’s 37)” neatly sum up Milgram’s findings, at least as they have been absorbed by popular culture: “We do what we’re told / We do what we’re told / We do what we’re told / Told to do.”
In the 1970s, Milgram suggested that his findings helped explain American military atrocities in Vietnam. More recently, they’ve found new life as a staple of debates about torture at Abu Ghraib and Guantánamo. In business-school ethics courses, the experiments are cited to warn budding middle managers about the perils of blindly following orders from up the food chain. To a remarkable degree, Milgram’s early research has come to serve as a kind of all-purpose illumination for discussions about the human heart of darkness.
But how much does Milgram’s work really illuminate? In 2004, Gina Perry, an Australian journalist and psychologist, was conducting research for an article on Milgram when she learned about a Yale archive containing hundreds of audiotapes documenting his obedience research. Intrigued, she flew to New Haven and started listening to the tapes. Over time she became more interested in Milgram’s subjects—and what they revealed about his experiments—than in the scientist himself.
In 1961 and 1962, almost a thousand people showed up at Milgram’s lab for different variations on the shock-machine experiments. Afterward, some were shaken, convinced that Milgram had unveiled a sinister weakness lurking within them; others resumed their lives as if nothing had happened. In her new book, Behind the Shock Machine, Perry describes how, over the course of four years, she tracked down several of Milgram’s still-living participants and their relatives, along with his collaborators and assistants and their families.
The more conversations she had, and the more time she spent in the Yale archives, the more she came to believe that Milgram’s subjects had been doubly misled: first by the experimental conditions in the lab, and second by the notion that Milgram had proven anything at all.
IN 2011, THE FILMMAKER Eli Roth, best known for the mainstream torture-porn hit Hostel, recreated a version of the Milgram experiments for a Discovery Channel special called How Evil Are You? The year prior, a major French television channel broadcast Le Jeu de la Mort (The Game of Death), which combined an approximation of Milgram’s methods with the trappings of a game show, including a studio audience that egged participants on with cries for punishment. In 2009, the journal American Psychologist ran an article by Jerry M. Burger, a psychologist who claimed to have reproduced Milgram’s results. As it happened, Burger had been put up to his act of scientific replication by ABC News, which funded his research and aired footage of his experiments during “The Science of Evil,” a 2007 episode of its Basic Instincts series pegged to the atrocities at Abu Ghraib.
The perennial fascination with Milgram’s findings, underlined by these mass-market recreations engineered for maximum entertainment value, has had a way of obscuring what actually took place in Milgram’s lab. In the original setup described in his 1963 paper, participants would arrive at Milgram’s basement laboratory thinking they’d signed up for a study of memory and learning. One participant (called the “teacher”) watched as the experimenter strapped another participant (the “learner”) into a restraining chair and attached an electrode to his wrist. The teacher was then led to an adjacent room. On the experimenter’s instructions, he used an intercom to guide the learner through a simple memorization task. The teacher was instructed to punish every mistake the learner made by pressing a button that delivered an electric shock, each shock stronger than the one before.
The learner was actually an actor in league with Milgram, and the shock machine was a fake—but Milgram’s subjects were told no such thing. Sixty-five percent followed their instructions to the very end, administering shocks to the learner again and again, upping the intensity until they’d reached 450 volts, the highest level available. They kept going after the learner pounded on the wall, as if in protest, and even after he stopped responding altogether, as if unconscious. In one variation, the learner screamed for mercy and complained of a weak heart before finally falling silent.
Milgram’s experiments quickly touched off a debate over the ethics of his method. While deception in experiments was a mainstay of psychological research up through the 1960s, most deception-based studies didn’t end up on the front page of The New York Times, and none had involved tricking people into thinking they’d harmed—maybe even killed—a fellow human. Milgram noted that all his experiments ended with a “dehoaxing” session, but this wasn’t sufficient for his critics. In 1973 the American Psychological Association passed new research guidelines effectively banning the type of deception on which his obedience work had relied.
This did not, however, stop Obedience to Authority, Milgram’s 1974 book about the experiments, from becoming a bestseller, nor did it stop universities from inviting him to speak, students from reading his papers, and journalists from representing his conclusions in the most sensational terms possible. The controversy over the ethics of Milgram’s methods became part of his experiments’ mythology—a story of ruthless means yielding profound scientific ends.
And yet, in the Yale archives, Perry discovered that even the debate over Milgram’s procedure had been premised on false information. There had been dehoaxing sessions, yes—but only in a narrow sense of the word. When we hear about dehoaxing, Perry argues, we reasonably assume this means the participants were told that the shocks weren’t real, that the learner’s screams had been pre-recorded, and so on. But this wasn’t the case: Milgram’s debriefing sessions were used primarily to calm down agitated participants (there were many). Three-fourths of subjects left the lab without being debriefed at all. Many went home confused or upset; over 50 years later, some are still unnerved. A significant portion learned the truth only in letters they received months afterward. Others appear never to have been told.
If all Milgram had done was fudge his account of the dehoaxing process, his findings could still be completely valid. But Perry also caught Milgram cooking his data. In his articles, Milgram stressed the uniformity of his procedures, hoping to appear as scientific as possible. By his account, each time a subject protested or expressed doubt about continuing, the experimenter would deliver a set sequence of four counter-prompts. If, after the fourth prompt (“You have no other choice, teacher; you must go on”), the subject still refused to continue, the experiment would be called to a halt, and the subject counted as “disobedient.” But on the audiotapes in the Yale archives, Perry heard Milgram’s experimenter improvising, roaming further and further off script, coaxing or, depending on your point of view, coercing participants into continuing. These inconsistencies meant that the line between obedience and disobedience shifted from subject to subject and from variation to variation—and that the famous 65 percent compliance rate had less to do with human nature than with arbitrary semantic distinctions.
The wrinkles in Milgram’s research kept revealing themselves. Perhaps most damningly, after Perry tracked down one of Milgram’s research analysts, she found reason to believe that most of his subjects had actually seen through the deception. They knew, in other words, that they were taking part in a low-stakes charade.
Gradually, Perry came to doubt the experiments at a fundamental level. Even if Milgram’s data were solid, it is unclear what, if anything, they prove about obedience. If 65 percent of Milgram’s subjects did administer the highest shock, why did the other 35 percent refuse? Why might a person obey one order but not another? How do people and institutions come to exercise authority in the first place? Perhaps most importantly: How are we to conceptualize the relationship between, for example, a Yale laboratory and a Nazi death camp? Or, in the case of Vietnam, between a one-hour experiment and a multiyear, multifaceted war? On these questions, the Milgram experiments—however suggestive they may appear at first blush—are absolutely useless.
It is likely that no one understood this better than Milgram himself. In his notes and letters, Perry finds ample evidence that, privately, he had significant doubts about his work. He’d also dreamed of, and even dabbled in, other careers—filmmaker, writer of literary fiction—that indulged his taste for creative spectacles. Despite his reservations, he was a canny marketer of his own findings, well aware of how to package science in a way that would set tongues wagging. While Milgram was conducting his experiments, the American news media was fixated on the trial of the Nazi logistician Adolf Eichmann, then underway in Jerusalem. Ahead of the release of Obedience to Authority, Milgram drafted potential taglines for the book’s cover: “Is your neighbor a potential Eichmann?” “Where’s Adolf Eichmann? Check your mirror, friend.”
Today the Milgram experiments are as famous as ever. According to studies by historians of psychology, social-psychology textbooks still give significant space to Milgram’s conclusions but devote fewer pages than ever to serious criticism of his work. We have heard Milgram’s version enough. What we need now is what Perry has to offer: a proper dehoaxing.