How to Tackle Conspiracy Theories in Politics

Americans’ beliefs about politics are as inaccurate and bizarre as ever. Is there a way to fix that?

The other week, President Donald Trump was upstaged at a campaign rally in Florida, not by Representative Ron DeSantis, for whom the rally was ostensibly held, and not by some rowdy supporters, and not by the media. No, Trump was eclipsed by an alleged shadowy figure named Q—a person at the heart of the “QAnon” conspiracy theory, which holds, among other things, that special counsel Robert Mueller is not actually investigating Russian election interference, but rather top Democrats’ involvement in a variety of criminal enterprises, including child sex trafficking.

Conspiracy theories, and misconceptions about public policy more generally, are nothing new, but they seem more prominent in media and political discourse than ever before, and not just in the United States. Two-thirds of people surveyed in the Middle East and North Africa gave at least some credence to the idea that the U.S. was secretly trying to help the Islamic State, according to one recent study. In Hungary, the prime minister tells people that liberal Jewish financier George Soros is out to get him. And there are plenty of false beliefs that, while not conspiracy theories, may be much more damaging—substantial numbers of Americans believe, for example, that more than half of the federal budget goes toward paying interest on the national debt. (It does not.)

Can these people ever be convinced otherwise?

On some level there is hope: Political scientists have found that surprisingly simple interventions could be enough to get some people to reverse course, maybe even on some of their most extreme and extremely inaccurate beliefs. But what works in the short term or on a small scale—in other words, what works in an academic experiment—faces some serious obstacles out in the real world.

“I have no silver bullets,” says Brendan Nyhan, a political scientist at Dartmouth College who studies misperceptions. “It’s a complex phenomenon that defies a simple solution.”

For years, the standard view among political scientists was that it was pretty hard to change anyone’s mind on anything. In fact, some of Nyhan’s work suggested that if you tried to correct an erroneous belief—say, that vaccines cause terrible side effects and autism—it could actually backfire and strengthen those beliefs. In a 2014 study of such beliefs, Nyhan and his colleagues estimated that showing vaccine skeptics a narrative about a baby hospitalized with measles nearly doubled the share of skeptics who thought it very likely that vaccines have serious side effects, from 7.7 percent to 13.8 percent.

Yet that same study offered a glimmer of something else: When the researchers presented scientific evidence, taken from a Centers for Disease Control and Prevention website, directly refuting a link between autism and vaccines, the correction worked. The share of people who strongly agreed that “some vaccines cause autism in healthy children” declined from 9 percent to 5 percent.

Since that study, a growing body of evidence has suggested that the direct approach works for political misperceptions too. Just this year, Nyhan and Thomas Zeitzoff found that directly refuting myths about the history of the Israel-Palestine conflict reduced misperceptions. In 2017, Massachusetts Institute of Technology political scientist Adam Berinsky showed that directly refuting the rumors of death panels and euthanasia associated with the Affordable Care Act raised the share of people who dismissed the claim from 50 to 57 percent. And, in a report for the Knight Foundation published earlier this year, Syracuse University political scientist Emily Thorson found that relatively simple corrections could reduce some misperceptions by 20 percentage points or more.

Those studies signal a few potential strategies for changing someone’s mind. First, make sure the information comes from a reliable source—keeping in mind that reliable may be in the eye of the beholder. In Nyhan and Zeitzoff’s study on the Israel-Palestine conflict, those corrections were supplied by “a well-respected Israeli historian.” In Berinsky’s study, about twice as many people changed their minds about the ACA euthanasia rumor when the information came from a Republican than when it came from a Democrat.

That’s a simple calculus, Berinsky says. “People speaking against their interests [are] more credible,” he says. “What’s more credible: The surgeon general or McDonald’s saying you shouldn’t eat French fries?”

Second, it may help to place corrections wherever someone might read about a subject, an idea Thorson calls “contextual fact-checking,” which in practice amounts to a sidebar of background information attached to a news story. In the Knight Foundation study, Thorson showed 391 people a news story about the national debt; half of them also saw some additional background information. Without that information, 60 percent of those surveyed believed China held more than half the U.S. national debt; with it, only 43 percent did. (In fact, only about a third of the debt is owed to other nations, including China.)

There are some important caveats to this research. Berinsky also found that merely repeating the death panel rumor increased the share of people who believed it from 17 to 20 percent, while decreasing the share who rejected it from 50 to 45 percent. Corrections also had little sway over people who actually believed the rumor going in; most of the shifts came from people who weren’t sure about the rumor and then, after reading more information, decided to reject it.

In addition, a correction’s effects seem to fade over time. Berinsky tracked how many Americans believed President Barack Obama was born in the U.S. In early April of 2011, 55 percent did, while 15 percent thought he was born elsewhere and 30 percent were unsure. After Obama released his birth certificate later that month, the share who believed he was born here jumped. But by the next year, that figure was back down below 60 percent.

“Corrections fade quickly,” Berinsky says. “You’ve got to keep whacking the same mole.”

The situation is worse when it comes to topics like claims of child actors at Sandy Hook or conspiracies at the Mexican border, says Joanne Miller, a professor of political science at the University of Minnesota who studies conspiracy theories. Although those theories have been around for a long time—in the lead-up to the Revolutionary War, Miller says, a good many people thought the British intended to enslave Americans—they are spreading faster than ever thanks to the Internet and social media. And once they spread, those ideas are often very hard to kill.

“There’s a fundamental difference between [misperceptions] and conspiracy theories,” Miller says. “The very nature of a conspiracy theory is a belief in a far-reaching conspiracy,” which makes it more likely that a believer will take any attempt to correct their beliefs as lies promulgated by the conspirators.

The better tactic for addressing conspiracy theories, Miller says, may be not “to attack the belief itself, but rather the reasons people believe in conspiracy theories.” That could include working to improve trust in government or addressing deeper anxieties. Conspiracy theories surrounding 9/11, for example, reflect a deep desire to reduce randomness and hence anxiety, a desire that may itself reflect fears about who is in power—Democrats, for example, are more likely to believe in conspiracies when Republicans are in power, and vice versa.

In the face of that anxiety, changing a person’s beliefs is a tall order. “It’s about wanting to give them the motivation to believe the correct thing,” Miller says. “But tackling the motivation side is a heck of a problem.”

That’s a scary thought given the increasingly real-world consequences of some conspiracy theories. The theory known as Pizzagate, for example, led an armed man to travel to Comet Ping Pong, the Washington, D.C., pizzeria alleged to be the hub of a Clinton-linked sex trafficking operation, to, as he put it, “self investigate.”

Even if it never came to shots fired, as it did in that case, and even if misperceptions and conspiracy theories had no effect on electoral outcomes, beliefs would still matter, says John Bullock, a political scientist at Northwestern University. “You can think of getting the beliefs right as a necessary condition for certain kinds of things that you want voters to do. If you want voters to gauge politicians accurately, well, that entails voters having accurate beliefs, or probably does,” Bullock says. And even if voters’ beliefs aren’t as far from the truth as some suspect, that doesn’t mean they’re going to be the best, most civic-minded citizens we might hope they’ll be.

Taking all of it in, Nyhan says, there may be a deeper problem, one that is harder to solve: Not much is going to change until politicians and media outlets commit to telling the truth, over and over again. Nyhan isn’t particularly sanguine about that possibility—the political and economic incentives do not seem to favor it, and given the polarized state of American politics, demand for misleading information “isn’t going away.”

Yet there remains some hope. Fact-checking politicians might help keep them honest, Nyhan and frequent co-author Jason Reifler found in 2014—and fact-checking is something the news media is generally taking more seriously. Then there was the news on Monday that Apple, YouTube, Facebook, and Spotify all but booted one of the most popular sources of conspiracies, Alex Jones, from their platforms, suggesting that maybe something might start to change.
