Justifying What You Know Can’t Be True

Researchers looking at al-Qaeda and Saddam Hussein explore why people often steadfastly believe something even when they’ve been shown it ain’t so.

President Obama has had a hard time dislodging misperceptions about his health care proposal — those stubborn beliefs in death panels and free care for illegal aliens, neither of which actually exists in the legislation. Recent research about the way people defend their faith in false information, though, suggests that calling out the inaccuracies may not be all that effective in converting the suspicious.

Sociologists at the University of North Carolina and Northwestern University examined an earlier case of deep commitment to the inaccurate: the belief, among many conservatives who voted for George W. Bush in 2004, that Saddam Hussein was at least partly responsible for the attacks on 9/11.

Of 49 people included in the study who believed in such a connection, only one shed the certainty when presented with prevailing evidence that it wasn’t true.

The rest came up with an array of justifications for ignoring, discounting or simply disagreeing with contrary evidence — even when it came from President Bush himself.

“I was surprised at the diversity of it, what I kind of charitably call the creativity of it,” said Steve Hoffman, one of the study’s authors and now a visiting assistant professor at the State University of New York, Buffalo.

The voters weren’t dupes of an elaborate misinformation campaign, the researchers concluded; rather, they were actively engaged in reasoning that the belief they already held was true.

This type of “motivated reasoning” — pursuing information that confirms what we already think and discarding the rest — helps explain why viewers gravitate toward partisan cable news and why we tend to see what we want in The Colbert Report. But when it comes to justifying demonstrably false beliefs, the logic stretches even thinner.

By the time the interviews were conducted, just before the 2004 election, the Bush Administration was no longer asserting a link between al-Qaeda and the Iraq war. The researchers chose the topic because, unlike other questions in politics, it had a correct answer.

Subjects were presented during one-on-one interviews with a newspaper clip of this Bush quote: “This administration never said that the 9/11 attacks were orchestrated between Saddam and al-Qaeda.”

The Sept. 11 Commission, too, found no such link, the subjects were told.

“Well, I bet they say that the commission didn’t have any proof of it,” one subject responded, “but I guess we still can have our opinions and feel that way even though they say that.”

Reasoned another: “Saddam, I can’t judge if he did what he’s being accused of, but if Bush thinks he did it, then he did it.”

Others declined to engage the information at all. Most curious to the researchers were the respondents who reasoned that Saddam must have been connected to Sept. 11, because why else would the Bush administration have gone to war in Iraq?

The desire to believe this was more powerful, according to the researchers, than any active campaign to plant the idea.

Such a campaign did exist in the run-up to the war, just as it exists today in the health care debate.

“I do think there’s something to be said about people like Sarah Palin, and even more so Chuck Grassley, supporting this idea of death panels in a national forum,” Hoffman said.

He won’t credit them alone for the phenomenon, though.

“That kind of puts the idea out there, but what people then do with the idea … ” he said. “Our argument is that people aren’t just empty vessels. You don’t just sort of open up their brains and dump false information in and they regurgitate it. They’re actually active processing cognitive agents.”

That view is more nuanced than the one held by many health care reform proponents — that citizens are only ill-informed because Rush Limbaugh makes them so. (For the record, the authors say justifying false beliefs extends equally to liberals, who they hypothesize would behave similarly given a different set of issues.)

The alternate explanation raises queasy questions for the rest of society.

“I think we’d all like to believe that when people come across disconfirming evidence, what they tend to do is to update their opinions,” said Andrew Perrin, an associate professor at UNC and another author of the study.

That some people might not do that even in the face of accurate information, the authors suggest in their article, presents “a serious challenge to democratic theory and practice.”

“The implications for how democracy works are quite profound, there’s no question in my mind about that,” Perrin said. “What it means is that we have to think about the emotional states in which citizens find themselves that then lead them to reason and deliberate in particular ways.”

Evidence suggests people are more likely to pay attention to facts within certain emotional states and social situations. Some may never change their minds. For others, policy-makers could work to create those conditions, for example by minimizing the fear that often clouds a person’s ability to assess facts and that has characterized the current health care debate.

Hoffman’s advice for crafting such an environment: “The congressional town hall meetings, that is a sort of test case in how not to do it.”

