Point out that special interests have a stake in manipulating your thinking.
By Tom Jacobs
In light of the proliferation of “fake news” reports, which may or may not have influenced the outcome of the presidential election, Facebook has announced it will work with fact-checking organizations to flag suspect stories.
While that’s an important advance, new research points to the value of a sharper message than “This may be factually incorrect.” It might be more effective to note: “This may be an attempt by certain special interests to manipulate how you think.”
Such a statement effectively “inoculated” people against a key piece of climate-change misinformation, writes a research team led by Cambridge University psychologist Sander van der Linden.
The researchers’ study provides encouraging evidence that “a basic explanation about the nature of disinformation campaigns” can “preemptively refute” their disingenuous arguments.
The study, published in the new journal Global Challenges, featured 2,167 American adults recruited online via Amazon’s Mechanical Turk website. All were asked a series of questions regarding climate change, including “What percentage of climate scientists have concluded that human-caused climate change is happening?”
Before answering, some participants were given the factual answer (97 percent) in an easy-to-comprehend pie chart. Others were shown a widely discredited document — a petition supposedly signed by 31,487 American scientists asserting “there is no convincing scientific evidence” that greenhouse gases are disrupting the Earth’s climate. Still others saw both.
Another group of participants also saw both messages, but they were framed with one of two counter-messages: a short one noting that “some politically-motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists,” or a longer one pointing out specific problems with the “petition project,” including the fact that many of its signatories — including Charles Darwin — are clearly fakes.
As expected, the factually incorrect petition affected people's beliefs. When it was presented alongside the accurate figure, the informational value of the accurate message "was negated completely," the researchers write.
Specifically, people exposed to both messages estimated the scientific consensus at 73 percent — virtually the same as a control group that was not given any information at all.
For those who read the warning about misleading tactics, however, that figure rose to nearly 80 percent. And for those who also read the detailed account of why the petition is widely derided, it rose to nearly 84 percent.
Given that “perceived scientific agreement is a key determinant of the public’s opinion of climate change,” this is an important finding. It suggests public attitudes on this issue — and perhaps others — “can be effectively ‘inoculated’ against influential misinformation,” the researchers conclude.
Importantly, these warning messages "proved equally effective across the political spectrum," they add. Fears that such messages would provoke a backlash "among those who are ideologically predisposed to be skeptical about climate change" proved unfounded.
The research suggests reminders of the scientific consensus on climate change should be accompanied by warnings that "politically or economically motivated actors" may seek to undermine public belief in that basic fact.
This just might be a clever way to transcend ideological bias and convey actual information. After all, while people tend to believe “facts” that confirm their political beliefs, no one likes the idea of being manipulated.