Polarization Isn’t That Hard to Come By

Researchers show that, even when we agree on the logic of an argument, disagreements over a single statement can lead us in radically different directions.

By Nathan Collins

A protester holds a sign during an Iraq War protest in 2005, in Chicago, Illinois. (Photo by Peter Thompson/Getty Images)

It’s easy to dismiss those who disagree with us as ignorant dimwits. But that outright rejection is misguided: Smart, well-informed people sometimes vehemently disagree on issues like climate change and evolution. Now, researchers argue that social interactions combined with our aversion to cognitive dissonance could explain such disagreements—and perhaps help us understand why political debates have become so polarized.

The key issue, University of California–Santa Barbara’s Noah Friedkin and his colleagues write in Science, is that changing a belief is not as simple as one person getting another to accept the truth of a single statement. If that were all it took, then data and logical arguments would probably be enough to get everyone to agree.

But, of course, getting a creationist to accept evolution, or vice versa, takes a lot more than changing a single belief. Instead, it requires modifying someone’s beliefs about an interconnected set of statements—that is, a belief system.

To get a flavor of what that means, Friedkin and his colleagues took a look at the arguments George W. Bush’s administration put forth to justify the war in Iraq, which involved three statements: There were weapons of mass destruction in Iraq; Iraq was a threat to the region and the world; and a preemptive invasion would be just. Those were connected by a logical structure, at least according to the administration at the time: The first statement implies the second, and the second implies the third.

Crucially, it’s possible for everyone to agree on that logic, yet disagree on the statements. For example, you can believe that if there’d been weapons of mass destruction in Iraq, then Iraq would have been a threat and a preemptive war would have been justified—all while believing there weren’t any such weapons hiding in the desert.
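To see that concretely, here’s a minimal propositional-logic check, written as a Python sketch; the three variable names are our shorthand for the statements, not notation from the paper. It enumerates every true/false assignment and keeps only those consistent with the two implications:

```python
from itertools import product

def implies(p, q):
    """Material implication: 'p implies q' fails only when p is true and q is false."""
    return (not p) or q

# Keep every assignment of the three statements that satisfies
# "WMD -> threat" and "threat -> just war".
consistent = [
    (wmd, threat, just)
    for wmd, threat, just in product([True, False], repeat=3)
    if implies(wmd, threat) and implies(threat, just)
]
for row in consistent:
    print(row)
# (False, False, False) is among the survivors: you can accept the
# administration's logic wholesale while denying every one of its claims.
```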

In fact, that’s all it takes to produce polarization. In computer simulations, Friedkin and his colleagues show that, if the members of a social group generally bought both the administration’s logic and the claim that there were weapons of mass destruction in Iraq, in time they’d come to agree the invasion was just, regardless of their initial beliefs about that particular statement. Another group that accepted the logic but was more skeptical of the weapons claim would come to believe just the opposite. The logical structure is key: without it, the authors find, everyone converges to a state of high uncertainty, neither believing nor disbelieving that Iraq was a threat or that the war was just, no matter what they think about the presence of weapons of mass destruction.
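The paper’s formal model is richer than this, but a toy version conveys the mechanism. The sketch below is our own Friedkin-Johnsen-style illustration in Python, not the authors’ published code: each round, agents average their group’s beliefs, re-impose a logic matrix that pulls each statement’s certainty toward the statements it depends on, and stay partly anchored to their initial opinions. The weights, group sizes, and logic-matrix entries are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # agents per group; statements: WMD, threat, just war

# Row-stochastic social-influence matrix: everyone weighs everyone's opinion.
W = rng.random((n, n))
W /= W.sum(axis=1, keepdims=True)

# Hypothetical logic-constraint matrix (row-stochastic). Row k says how
# statement k's certainty is recomputed from the statements it depends on.
C = np.array([
    [1.0, 0.0, 0.0],   # "WMD exist" depends on nothing else
    [0.6, 0.4, 0.0],   # "Iraq is a threat" leans on the WMD belief
    [0.0, 0.6, 0.4],   # "invasion is just" leans on the threat belief
])

def simulate(x0, steps=100, stubbornness=0.2):
    """Each round: socially average beliefs (W), re-impose the logic
    constraints (C), and keep some anchoring to initial opinions (x0).
    Beliefs are certainties in [0, 1]; 0.5 is maximal uncertainty."""
    x = x0.copy()
    for _ in range(steps):
        x = (1 - stubbornness) * (W @ x @ C.T) + stubbornness * x0
    return x

# Group 1 is mostly convinced WMD exist; Group 2 is mostly skeptical.
# Both start maximally uncertain (0.5) about the other two statements.
believers = np.column_stack([rng.uniform(0.7, 1.0, n),
                             np.full(n, 0.5), np.full(n, 0.5)])
skeptics = np.column_stack([rng.uniform(0.0, 0.3, n),
                            np.full(n, 0.5), np.full(n, 0.5)])

print("believers ->", simulate(believers).mean(axis=0).round(2))
print("skeptics  ->", simulate(skeptics).mean(axis=0).round(2))
```

With these made-up weights, the believer group’s average certainty that the war was just settles well above the 0.5 uncertainty point and the skeptics’ settles well below it, even though both groups apply the identical logic matrix. Replacing C with the identity matrix removes the logical coupling, and in this setup the threat and just-war beliefs never move off 0.5, echoing the no-structure result described above.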

Still, polarization is just one fairly intuitive possibility. The model also predicts that a small group of “intransigent skeptics” could sway public opinion on Iraq at large, and that small groups, such as a team of policy experts, could arrive at a strong consensus even when their members start with a difference of opinion and don’t share the same logical arguments. In other words, the model predicts groupthink.

In an accompanying viewpoint, Carter Butts, a professor of sociology, statistics, and electrical engineering and computer science at the University of California–Irvine, argues that the model could pave the way for a better understanding of why people familiar with science nonetheless reject it, as well as of the perplexing phenomenon known as pluralistic ignorance: an entire group of people can sometimes share the same belief while each member remains convinced that no one else agrees.
