Hillary Clinton's new book about the 2016 campaign provides a reminder of the power of fake news, and our inability to find a concrete corrective. It's now clear that many false or misleading stories circulated on social media in the final weeks of the campaign, some apparently created by Russian agents. It's hard not to see that as a threat to our democracy.
The ultimate weapon against such disinformation would be a less credulous public. But given our tendency to believe "facts" that confirm our biases, and the way startling but unverified assertions stick in our brains, is that really possible? Can fake news be successfully debunked?
Just-published research offers no panacea, but it does provide some concrete suggestions.
"The effect of misinformation is very strong," said co-author Dolores Albarracin, a psychologist at the University of Illinois–Urbana-Champaign. "Generally, some degree of correction is possible, but it's very difficult to completely correct."
The researchers, led by Man-pui Sally Chan, gathered data from 20 experiments described in eight research reports published between 1994 and 2015.
Participants—nearly 7,000 in total—read inaccurate reports of everything from fires and robberies to the fictional "death panels" associated with the 2010 Affordable Care Act. Researchers then used various techniques to attempt to correct the misimpressions participants had formed.
Analyzing their results, Chan and her colleagues (including Kathleen Hall Jamieson of the University of Pennsylvania's Annenberg Public Policy Center) discovered some interesting patterns. Writing in the journal Psychological Science, they offer three specific recommendations.
First, when attempting to debunk a false narrative, avoid repeating the original falsehood in detail. This sort of elaboration "reduces the acceptance of the debunking message, which makes it difficult to eliminate false beliefs," they write.
Second, "correct misinformation with new, detailed information." Evidence suggests simply labeling a belief as wrong is less effective than providing specific reasons why it is mistaken.
Finally, debunk misinformation in a way that encourages counterarguments. That sort of back-and-forth "enhances the power of corrective elements" by inducing "a state of healthy skepticism," they write.
In other words, the best results appear to come when a person can be coaxed into coming up with a compelling explanation as to why his or her initial idea was mistaken. In psychological terms, this "enables recipients to update the mental model justifying the misinformation."
So if you can get people to think seriously about an issue and question the basis for their beliefs, while presenting detailed information on why their assumptions are wrong, minds can be changed.
If that sounds like a heavy lift, well, it is. Chan and her colleagues concede that debunking is difficult, and often unsuccessful. "The ultimate persistence of the (inaccurate) information," they write, "depends on how it is initially perceived."
To update the old saying, you can lead a Facebook user to the facts, but you can't make him believe them.