Debates over race and public policy in the United States are a bit odd from a social-psychological standpoint. They tend to focus on explicit racism, on who said what and whether it was offensive, rather than on the trickiest, hardest-to-dislodge aspect of racism: implicit bias.
Researchers have known for years that racism is far more complicated than how a white person answers an explicit question about black people. It also shapes judgment and decision making at a subconscious level, even among many otherwise egalitarian people, influencing whom they decide to trust and how they react to dangerous-seeming situations.
One new study, led by New York University psychology researchers Jeni Kubota and Elizabeth Phelps, offers some valuable new insights into how implicit bias operates in negotiation settings—but also highlights the complicated ways that bias resists behavioral interventions.
The study is based on the Ultimatum Game, something of a behavioral-economics greatest hit. In its standard version, the “proposer” offers the “player” a split of a $10 pot; the player chooses whether to accept, and if he or she declines, neither of them gets anything.
Neoclassical economics suggests that players, being rational, will accept any amount of money greater than zero, because the decision is between some money and no money. But a surprising percentage of players reject low offers, a sign that affront at an unfair-seeming offer can short-circuit the steely-eyed rationality of homo economicus.
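To make the setup concrete, here is a minimal sketch of the game in Python. The function names and the 30 percent “fairness” threshold are illustrative assumptions, not parameters from the study; the point is simply the contrast between the neoclassical prediction (accept anything above zero) and the observed tendency to reject lowball offers.

```python
# Illustrative sketch of the Ultimatum Game. The names and the 30 percent
# fairness threshold are assumptions for demonstration, not study parameters.

def rational_responder(offer: float) -> bool:
    """Neoclassical prediction: accept any positive amount."""
    return offer > 0


def fairness_responder(offer: float, pot: float = 10.0,
                       min_fair_share: float = 0.3) -> bool:
    """Behavioral pattern: reject offers that feel too unfair,
    even though rejection leaves both players with nothing."""
    return offer / pot >= min_fair_share


def play_round(offer: float, responder) -> tuple:
    """Return (proposer_payoff, player_payoff) for a $10 pot."""
    pot = 10.0
    if responder(offer):
        return pot - offer, offer
    return 0.0, 0.0


if __name__ == "__main__":
    for offer in (1.0, 2.0, 5.0):
        print(f"offer ${offer:.2f}: "
              f"rational -> {play_round(offer, rational_responder)}, "
              f"fairness -> {play_round(offer, fairness_responder)}")
```

With a $2 offer, the rational responder takes the money while the fairness-minded one leaves both sides with nothing; the study’s race effect shows up in how often real players make that second choice.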
Kubota and her colleagues introduced race into their version of the game, hypothesizing that ingrained stereotypes associating black people with “aggression and hostility” would make it harder for black proposers to get other players to accept their offers than for white proposers.
That is, in fact, what they found: a statistically significant difference in the rates at which non-black players accepted offers from black versus non-black proposers.
“Even if the offer is the same, [players are] still going to reject more often for black [proposers] than whites,” Kubota said. “So if a black player offers $2 and a white player offers $2, they’re going to, on average, accept a proposal from the white player but not the black player.”
Kubota said that in looking at the results, the team couldn’t ascribe these differences simply to in-group/out-group effects. “It wasn’t just about being an out-group member,” she said. “It really was specific to black players in our game. So it seems to be something specific to what they were associating the black player with.”
It’s an interesting result, and it contributes to a growing body of research on the ways in which implicit bias operates. But all of this work raises a question: What, exactly, are we supposed to do about it from a public-policy perspective?
Even within the narrow area of interracial negotiations, after all, there are certainly pressing public-policy issues. Take the debate about the Affordable Care Act, for example. Given what we know about bias, it’s not much of a stretch to suggest that it has influenced perceptions of Obamacare among white Americans: that many of them view the program in a racialized way, either because the “offer” is coming from a black president or because of a conception, widespread in some circles, that the benefit flows only to poor, “lazy” minorities. (In that case, some of the effects would be explicit rather than implicit.)
These attitudes have consequences: There are certainly legitimate arguments to be made against Obamacare, but the race of the law’s proposer, or of its recipients, is not one of them. Moreover, the law’s effectiveness in improving health care in America will depend on people actually signing up for it rather than refusing to do so because of those racialized suspicions.
“Of course, it’s hard to extract these really tightly controlled experimental situations” to the real world, Kubota said. But “it certainly seems that at least a part of that rejection in negotiation situations can be explained by whether the person is of a similar group to you or not.”
So implicit bias could be affecting attitudes toward Obamacare, as well as plenty of other public-policy issues. That’s good to know. But where does it lead us?
With some social science research, after all, there’s a short and well-paved path from research finding to intervention. If, as studies suggest, color-coding works better than calorie counts at encouraging people to make healthy eating decisions, there’s no mystery about what policy makers should do: push for more color-coding.
Research into implicit bias is different.
“Translating what we know about implicit prejudice into public-policy prescriptions is tricky to say the least,” wrote Curtis Hardin, a psychologist at Brooklyn College, in an email. “Frankly, it’s a lot easier to exploit prejudice for political purposes than to subvert it.”
The problem is that implicit prejudice is, well, implicit. As Kubota put it, “If people don’t have access to it, how do we intervene?” That’s part of the reason this particular subgenre of behavioral intervention is taking a bit longer to develop than some of its more straightforward cousins.
“Right now, a lot of intervention work on implicit bias is new,” she said. “Though we can make some claims about what’s been found in the last 10 years, a lot of this research is just starting.”
Hardin noted, however, that implicit-bias research already offers some practical takeaways.
“First,” he wrote, “prejudice is NOT just about individual bad apples. It lives and thrives in even the best intentioned. Solutions will have to take this into account.” Peer pressure is another important factor, he wrote, since “implicit prejudice is readily influenced by pretty straight-forward social influence processes. For example, being around people you respect or like who you think value black people reduces and sometimes reverses anti-black implicit prejudice.”
Kubota said that hasty decisions are more likely to suffer from implicit bias. “Giving people more time helps,” she said. “Quick, speedy decisions lend themselves more to the influence of implicit bias. When people are given more time there are processes in the brain that come online: self-control processes that can downregulate the effects of implicit bias on behavior.” (For more on this, check out some of Daniel Kahneman’s work.)
No, these ideas aren’t as easy to implement as a red dot on a bacon cheeseburger. But they do mark an important start.
Kubota, for one, said she was hopeful that social scientists’ anti-bias arsenal would continue to grow. “There’s people at all levels of analysis working on this problem right now.”