In debating whether to release the allegedly gruesome images of Osama bin Laden’s death, one of the main arguments for release was that it might help counter the conspiracy-minded thinking that the operation was a fake, that bin Laden had been dead for years, or even that he was a CIA fabrication all along.
But President Barack Obama seems to have realized what many social scientists have known for years: that no evidence in the world would convince certain people that a U.S. Navy SEAL unit killed bin Laden at his compound in Abbottabad, Pakistan, and buried him at sea. In much the same way, a certain percentage of Americans will never believe Obama was born in the United States, no matter how many long-form birth certificates are released.
But how is it that the human mind is capable of spinning increasingly baroque tales to deny simple facts?
Much of it actually follows from the simple idea that most of us seek out and believe only information that supports what we already think, all the while disregarding information that would contradict our pre-existing beliefs. It’s a phenomenon that goes by many names: confirmation bias, selective exposure, myside bias, cognitive dissonance. But they all add up to the same thing: We like to pick one side and be right about it, and once we commit to thinking a certain way, gosh darn it, we’re going to make sure we keep thinking that way – whatever it takes.
“When you get some evidence, you only take it seriously if it agrees with you,” explains Jonathan Baron, a professor of psychology at the University of Pennsylvania. “And in a way, that’s not unreasonable. If I hear of some psychology experiment that demonstrates ESP, I’ll tend to say, ‘I don’t believe that.’ But the difficulty comes when people selectively expose themselves to evidence and then pretend that they didn’t.”
Studies in psychology have demonstrated, among other things, that people consistently avoid evidence that contradicts an initial hypothesis; that they are irrepressibly overconfident in their own judgment; that initial impressions are hard to dislodge; that people feel more confident in their decisions when they consider only one side; and that we, in fact, prefer one-sided thinking.
One of the most famous studies in this line of research recruited people with strong pro and con views on capital punishment, then exposed them to made-up studies of the death penalty’s effect as a deterrent to crime. Opponents of the death penalty described the research finding no link to deterrence as more convincing, while the opposite was true for supporters. But what was remarkable was that people came away from the experiment even more convinced of their original viewpoint.
Another famous example details what happened when a UFO cult leader’s prophecy that aliens from the planet Clarion would rescue cult members from an earth-destroying flood on Dec. 21, 1954, did not, in fact, come to fruition. Rather than losing faith, cult members instead became more resolute, believing that the prophecy was real but that their group had spread enough goodness to save the planet from destruction. After all, what was harder to believe for the group members: that they had made a mistake, or that they had singlehandedly prevented the destruction of humanity?
Such appears to be the human mind: Start with a faulty premise, add time and intention, and voilà! — without constant efforts to challenge our beliefs, the cognitive shortcuts and motivated reasoning we are all prone to can easily cement into impossible-to-dislodge conspiracy theories.
“There’s what we’d talk about as an epistemological stance that people take,” says Deanna Kuhn, a professor of psychology and education at Columbia University Teachers College. “By that, I mean what their stance is with respect to evidence, how claims are supported, what kind of evidence would you take as proof of a claim. And there are many people, too many people, who are operating in an epistemological framework that is not open to evidence.”
In the U.S., conspiracies often take on a partisan tinge, such as the birther phenomenon.
“When a Democrat is in the White House, the prevailing theories [are] fairly conservative,” says Mark Fenster, a University of Florida law school associate dean and author of Conspiracy Theories: Secrecy and Power in American Culture. “And when a Republican is in charge, the theories trend toward the left. There’s a partisan aspect, but there’s an underlying fear about concentrations of power.”
If you don’t like Obama, you’re much more likely to tune in to allegations of his foreign birth. They confirm what you always thought of him — something is not quite right about that man. And all that filtering has a way of feeding upon itself, constructing an elaborate epistemological edifice.
“If you think Obama is a terrible president, then you want to think that he’s lying about bin Laden,” Baron says. “So some of it is wishful thinking.”
Studies have shown that even something as seemingly objective as one’s own personal financial well-being is subject to one’s partisan leanings.
Fenster also notes that there are some intriguing demographic drivers: African-Americans tend to be more likely to believe in conspiracy theories (perhaps because they’ve been victimized more in American history, and belief in conspiracy theories tends to be associated with alienation from power). Men also tend to be more likely to believe in conspiracy theories than women, though nobody is sure why.
All in all, it suggests that convincing those who are fully invested in disbelieving you is almost always a fool’s errand. In fact, if you liked this article, it’s probably only because you suspected as much about human cognition in the first place.