In 1759, long before he wrote The Wealth of Nations, Scottish philosopher Adam Smith published The Theory of Moral Sentiments, in which he outlined the ways in which one person feels what he called “sympathy” for another’s suffering.
In his oft-quoted introduction, he wrote, “By the imagination we place ourselves in his situation, we conceive ourselves enduring all the same torments, we enter as it were into his body, and become in some measure the same person with him, and thence form some idea of his sensations, and even feel something which, though weaker in degree, is not altogether unlike them.”
Two hundred fifty years later, neuroscientists and social psychologists think Smith was on to something. He described perfectly what we would today call empathy (although that word wasn’t coined until the early 20th century) — the ability to put oneself in another’s shoes and feel something of what he or she feels.
It could be argued that empathy is one of the core traits that make us distinctively human, the emotional response giving rise to compassion and our impulse to help one another. Yet until recently, surprisingly little was known about where in the brain empathy arises or how it operates.
New research suggests that people use multiple brain networks to replicate another person’s experience, some of which operate almost automatically, while others employ conscious, higher-order reasoning.
A team at Vanderbilt University recently showed, for example, that a perceiver's ability to adopt a different spatial perspective and imagine him- or herself in another person's body was related to the perceiver's self-reported capacity for empathy.
Another study at Columbia University identified several brain regions that are enlisted when we listen to another person recount an emotion-laden story. Ingeniously, the researchers were also able to measure how accurate the perceivers were in gauging the target’s emotions.
Empathy research belonged in the realm of social psychology until the 1990s, when it was discovered that macaques — and presumably other primates, including people — have “mirror neurons” in their brains that fire automatically when one animal observes another’s actions. Neuroscientists quickly hypothesized that mirror neurons might account for our ability to feel empathically.
Columbia researcher Jamil Zaki says the mirror neuron network probably helps explain what might be called affective empathy — the simple ability to feel what another person is feeling. These neurons seem to link to other brain networks, so if, for example, you see someone stub his toe, you're likely to activate the areas in your own brain that process pain.
“There is a whole other network that underlies what I would call cognitive empathy, which is your ability to make a decision about what you think someone else is feeling, based on the situation that they’re in,” Zaki says. This is a more complex process that draws on life experience and contextual information, he says.
There are two main theories about how people generate this kind of empathy. One, called Simulation Theory, suggests that people draw on their own emotional experiences in order to figure out what someone else is feeling. The other contender, cleverly named Theory Theory, argues that people reason from generalized judgments about human behavior as they take the measure of another person’s emotional experience.
Zaki thinks there’s a third possibility. “It doesn’t make much sense that it’s either/or,” he says. “Probably what’s happening in the real world, where things are more complex, is that you’re bringing both of these processes on line.”
He co-authored a paper titled “The neural bases of empathic accuracy,” published in May in the Proceedings of the National Academy of Sciences, which reported on a functional magnetic resonance imaging (fMRI) study of subjects’ brains while they watched videos of people discussing emotionally intense experiences.
The study started with 14 “targets” who talked about these experiences on camera. The targets then rated their emotional response while watching a playback of the videos. In a second phase, 21 “perceivers” watched — and rated — the videos while lying in the fMRI machine. Later, their ratings were matched against the targets’ self-ratings. On the whole, the perceivers showed an impressive degree of accuracy, Zaki says.
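The paper's exact scoring procedure isn't spelled out here, but a natural way to quantify "matching" is to correlate each perceiver's moment-by-moment ratings with the target's self-ratings of the same video. The sketch below, in Python, shows that idea; the function name, rating scale and data are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def empathic_accuracy(target_ratings, perceiver_ratings):
    """One plausible accuracy score: the Pearson correlation between a
    target's continuous self-ratings and a perceiver's ratings of the
    same video, sampled at matching time points. (Illustrative only;
    the published study's scoring may differ.)"""
    target = np.asarray(target_ratings, dtype=float)
    perceiver = np.asarray(perceiver_ratings, dtype=float)
    if target.shape != perceiver.shape:
        raise ValueError("rating series must be the same length")
    # np.corrcoef returns the 2x2 correlation matrix; [0, 1] is r.
    return np.corrcoef(target, perceiver)[0, 1]

# Hypothetical example: affect ratings at fixed intervals during a video.
target = [3, 4, 6, 8, 7, 5, 4]
perceiver = [2, 4, 5, 7, 8, 5, 3]
print(f"empathic accuracy (r) = {empathic_accuracy(target, perceiver):.2f}")
```

A high correlation means the perceiver tracked the rises and falls of the target's emotion, even if the two used different regions of the rating scale.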
The fMRI data, meanwhile, revealed that the ability to correctly "read" another person's emotions (which the researchers called "empathic accuracy") was linked to activity in portions of the medial prefrontal cortex and the superior temporal sulcus, brain areas thought to be involved in integrating semantic and contextual information about social interactions. But there was also activity in the mirror neuron network, particularly in the right inferior parietal lobule and the bilateral dorsal premotor cortex.
“This finding is consistent with the idea that, while perceiving complex emotional displays, accuracy is predicted by a combination of sustained attention to targets’ verbal and nonverbal affect cues (including posture and facial expressions that may be processed in the mirror neuron system), and inferences about targets’ states based on integration of these cues,” the report said.
Meanwhile, at Vanderbilt, researcher Katharine N. Thakkar noted that in humans, mirror neurons seem to cluster in the front of the brain but also in the parietal cortex toward the back of the brain.
"It's interesting, because this parietal cortex has also been implicated in basic spatial processes, like mentally rotating things," Thakkar says. "Our language is so rich with spatial metaphors, like when we talk about seeing something from someone else's perspective. We just wanted to do a basic correlational study, to examine whether there is some kind of relationship between how quickly people can step into someone else's shoes and see their visual perspective — and how good they report they are at empathizing."
In Thakkar's paper, "Exploring Empathic Space: Correlates of Perspective Transformation Ability and Biases in Spatial Attention," published online in PLoS ONE in June, she reported asking 40 subjects to look at a photo of a man, facing either toward or away from the viewer, with one hand raised. Researchers measured how quickly and accurately study subjects identified whether the man's right or left hand was raised.
"When this image is facing away from you on the screen, and you see the back of him, it's easier to decide which hand is his, because you're in the same perspective," Thakkar explains. "But in that front-facing position, it does seem to be kind of a cognitively demanding task, because people's error rates go up. It takes them much longer to respond when the figure is facing them because they do have to perform this kind of mental imagined transformation."
The subjects also completed a standard psychological test that measured their empathic traits, and when those results were matched with the perspective-taking task, something interesting emerged.
“We kind of expected that the faster you could adopt the visual-spatial perspective of someone else, the more empathy you would report,” Thakkar says. “In fact, we found the opposite. So in women, the longer it took them to perform these perspective transformations, the more empathic they reported themselves to be.” There was no comparable effect among the male subjects, she notes.
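The analysis behind a finding like this is a straightforward correlation computed separately by sex. Below is a minimal Python sketch of that kind of test; the reaction times, empathy scores and group sizes are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: per-subject mean reaction time (ms) on the
# front-facing, transformation-demanding trials, a self-reported
# empathy score, and sex. All values are invented.
rt = np.array([820, 910, 760, 1005, 880, 950, 700, 990])
empathy = np.array([58, 66, 52, 71, 60, 68, 49, 70])
is_female = np.array([True, True, False, True, False, True, False, True])

for label, mask in (("women", is_female), ("men", ~is_female)):
    # pearsonr returns the correlation coefficient and a p-value.
    r, p = pearsonr(rt[mask], empathy[mask])
    print(f"{label}: r = {r:.2f}, p = {p:.3f}")
```

A positive r in the women's group, as Thakkar describes, would mean that slower perspective transformations went along with higher self-reported empathy.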
The study wasn’t designed to explain the differences, but it’s possible that the highly empathic women took more time because they powerfully engaged their imaginations to solve the task, Thakkar says.
"I think the study provides a kind of starting point to investigate these two domains and their relationship with each other," she says. "How do visual-spatial processing and the way we interact with the physical world affect how we interact with the mental states of others? It definitely needs more neuroimaging and more rigorous behavioral testing to work out."