How Political Polarization Breeds Ignorance

There’s new evidence that we trust the expertise of fellow political partisans, even when there is reason not to.

You’re faced with a challenge in a realm that requires some expertise to navigate. Who do you turn to for help? Someone who clearly knows what they’re doing?

Nope. It’s more likely you turn to someone who shares your political ideology.

That’s the finding of a newly published study: we trust the instincts of fellow partisans even when there is reason to doubt them, and even in realms far removed from politics.

“Our findings have implications for the spread of false news, for political polarization, and for social divisions,” senior author Tali Sharot of University College London said in announcing the results. “If we are aware of a person’s political leanings, for example on social media, we will be more likely to accept their take on a myriad of issues without scrutiny.”

In the journal Cognition, a research team led by her colleague Joseph Marks describes an online experiment with adult participants. Over 20 trials, participants viewed 204 colored shapes and used trial and error to figure out which would be classified as “blaps,” as opposed to “not blaps.”

Between some of the trials, they were presented with statements regarding specific social and political issues, and asked to indicate their level of agreement with each.

At the end of a trial, participants were presented the scores of four fellow players (which were, in fact, algorithms), as well as their answers on the political questions. They then chose one of the imaginary confederates to help them with the shape-identification test.

“Participants preferred to hear from the politically like-minded source that performed randomly on the blap task over the source that was accurate on the blap task but dissimilar politically,” the researchers report. A similarly structured follow-up experiment, featuring 101 people, replicated that key result.

“When we examined participants’ impressions of the co-players, we found they overestimated how good the politically like-minded were at the shape-categorization task,” Sharot explained. “This misperception drove the participants to seek advice from the politically like-minded.”

Which was, and is, a bad idea. “People are biased to believe that others who share their political opinions are better at tasks that have nothing to do with politics, even when they have the information they need to make an accurate assessment about who is the expert in the room,” the researchers conclude. This illusion apparently inspires us “to seek and use information from politically like-minded others.”

The researchers, who also include Harvard University’s Cass Sunstein, suspect these findings reflect the well-known “halo effect,” in which positive feelings about one aspect of a person or thing lead us to perceive other impressive qualities.

“If people generally believe that politically like-minded people are particularly worth consulting,” the researchers write, “they might extend that belief to contexts in which the belief does not make much sense.”

The dangers of this mental shortcut are self-evident. “Suppose someone with congenial political convictions spreads a rumor about a coming collapse in the stock market, (or) a new product that supposedly cures cancer,” the researchers warn. “Even if the rumor is false—and even if those who hear it have reason to believe it is false—they may well find it credible, and perhaps spread it.”

Many of us have bemoaned the devaluation of expertise. Now we see what’s replacing it: blind confidence in the judgment of our fellow partisans. When tempted to follow that path, we’d all do well to recall a pithy slogan popularized by Ronald Reagan: Trust, but verify.
