Hollywood’s penchant for making (and awarding) films about itself is nothing new. But if you watch director Michael Almereyda’s latest, Experimenter, you’ll find a particularly creative angle for telling a story about the cinema. The biopic, which primarily portrays controversial social psychologist Stanley Milgram’s rise to prominence, draws parallels between filmmaking and social science. As Manohla Dargis wrote in her review for the New York Times, it is “less a straight biography than a diverting gloss on human behavior, historical memory and cinema itself.”
But perhaps even more surprising than the shiny new packaging for an old metaphor is just how timely the comparison is. Comparing a controversial scientist to a film director, Experimenter comes at a moment when science is beginning to feel it might have something in common with Hollywood.
As Almereyda’s film sharply underscores, Stanley Milgram actually had quite a bit in common with the entertainment-industry set. The psychologist, who died in 1984, is perhaps best known for his experiment on obedience to authority figures, which took place at Yale University in 1961. But, as Peter C. Baker wrote in a review of Gina Perry’s book Behind the Shock Machine in our September/October 2013 issue, that fame resulted just as much from Milgram’s flair for the dramatic as it did from the experiment itself. Based on research into the Milgram archives at Yale University and interviews with his subjects and their relatives, Perry’s book concludes that the social psychologist cooked his data and exaggerated his findings. Reading his notes and letters, Perry concludes that even Milgram himself doubted the experiment’s takeaway message—that 65 percent of the American population would deliver perhaps-fatal shocks to strangers under orders by a higher authority. And yet Milgram, an amateur literary writer and film director, thrived on spectacle and controversy. “Despite his reservations,” Baker wrote, “he was a canny marketer of his own findings, well aware of how to package science in a way that would set tongues wagging.”
Experimenter highlights this artistic ambition in specific details—Peter Sarsgaard’s Milgram explains his aesthetic choices for the experiment as if in a DVD commentary, and cites Candid Camera as an influence on his work—but also in more symbolic ways. Milgram frequently breaks the fourth wall, as if cognizant that a camera is trained on him. (As Dargis writes, this is perhaps because, like a film director, he watches his experiments unfold behind a two-way mirror.) And he uses the metaphor himself: When one of his Harvard students accuses him of crafting a “deception” rather than an experiment in his Yale study, Milgram responds: “I like to think of it as illusion, not deception. Illusion has a revelatory dimension, as in a play. Illusion can set the stage for revelation, to reveal certain difficult-to-get-at truths.” If that sounds like total bullshit intended to cover up bad research design, it’s bullshit that will sound familiar to anyone versed in the semantics of entertainment dealmaking.
Hollywood, of course, has always fancied its sets and theaters to be miniature laboratories for accessing buried truths, and its method actors beautiful and bright psychotherapists. But Almereyda’s film captures a moment when the feeling’s getting a tad more mutual: Researchers are beginning to recognize that drama might be an important and useful tool for understanding the fundamental underpinnings of an audience’s reaction—human empathy and cognition.
Thalia Goldstein, an assistant professor of psychology at Pace University, is a leading advocate for the psychological study of actors. In 2011, she argued that cognitive scientists should examine acting because, in acting’s latest iteration as a work of realism, the craft reflects a “human invention” that “draws on a host of other cognitive capacities.” Goldstein’s own research bears this out. She finds that acting training may improve empathy and theory of mind—both social cognitive abilities—in children and adolescents. Goldstein’s work gives some credence to the idea that role-playing allows one to access a deeper understanding of others—which, in turn, might be why trained actors seem to reveal something about the human condition. But Goldstein takes a longer view: Role-playing beyond childhood, she says, “may be a route by which humans come to develop enhanced empathy and gain greater insight into others’ beliefs and emotions.” Without further research, this evolutionary advantage is purely speculative. But, at least in a contemporary sense, thespians do seem to be advancing human understanding of one another.
It only makes sense that actors, who are lauded for going to great lengths to realistically emulate human behavior, might succeed in revealing something about it. But what about the technical professionals—the screenwriters, directors, and editors?
According to neuroscientist Uri Hasson, it’s this behind-the-scenes work that is actually most notable for its insights into human behavior. In 2004, Hasson published a neuroscientific approach to film studies that measured brain activity on functional MRI scans and analyzed similarities with inter-subject correlation analysis (ISC). Hasson found that directing, editing, and genre all determined the control that a film exerted over the viewing experience. Films with more involved directing and tighter editing evoked a greater shared response than your typical home video, and it was the genres relying most heavily on these techniques—horror, thrillers, dramas—that most effectively shaped their viewers’ responses. ISC, for instance, was higher for Alfred Hitchcock Presents and The Good, the Bad and the Ugly than it was for Curb Your Enthusiasm. The research seems to indicate that cinematic “auteurs” like Hitchcock and Sergio Leone—directors who supposedly exert greater “authorship” over their films and realize highly specific aesthetic visions—have a better grasp on human behavior than Larry David.
Hasson argues that directors like Hitchcock and Leone do the work of cognitive experts—or, at least, unintentionally harness responses that intrigue the experts. “Our data suggest that achieving a tight control over viewers’ brains during a movie requires, in most cases, intentional construction of the film’s sequence through aesthetic means,” he wrote. Though these powers appear to result primarily from neurotic directors, Hasson’s findings have nevertheless produced a small cottage industry of highly specific film marketers. In 2011, Fast Company heralded “the rise of neurocinema,” or the advent of neuromarketing companies that claim to use findings like Hasson’s to ground film development in empirics, eliminate waste, and maximize profit.
None of this is particularly surprising. As early as the 1960s, Alfred Hitchcock was claiming that “creation is based on an exact science of audience reactions.” In 2015, though, a claim of kinship with science like the one Experimenter makes feels somewhat less narcissistic. Scientists still don’t think movies are based on an exact science, but they are beginning to perceive that the study of their art brings us closer to one.
Since We Last Spoke examines the latest policy and research updates to past Pacific Standard news coverage.