Did you hear that scary story on the news? My sister told me she heard about it from a friend, who saw it on CNN. It sounds like we’re in big trouble!
Or are we? New research shows information not only gets garbled as it is transmitted from person to person—it gets slanted in an increasingly negative direction. That original news item about a possible Ebola outbreak can gradually morph into “We’re all going to die!”
“The more people share information, the more negative it becomes, the further it gets from the facts, and the more resistant it becomes to correction,” said University of Warwick psychologist Thomas Hills, who authored the paper with Robert Jagiello. “This research explains why our world looks increasingly threatening, despite consistent reductions in real-world risks.”
The study, published in the journal Risk Analysis, featured 154 people in Great Britain and Luxembourg. They were broken up into 14 “chains” of eight participants apiece.
The first participant in each chain read four articles about either a “high-dread” topic (nuclear energy) or a “low-dread” one (food additives). These factual, balanced reports came from respected outlets including National Geographic and the BBC.
That participant wrote a summary of the information and passed it on to the second person in the chain. The second person then read the summary, paraphrased it, and passed it along to the third person—a process that continued until the eighth and final member of the chain received the information.
Coding each message for negative and positive words, the researchers found the emotional tone of the messages got darker with each new iteration. “The proportional amount of negative statements made by subjects increased as messages were transmitted from node to node,” they write.
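The dynamic the researchers describe—serial retransmission in which negative content survives paraphrasing better than positive content—can be sketched as a toy simulation. This is purely illustrative: the retention rates below are invented for the sketch and are not parameters estimated in the study.

```python
# Toy model of serial transmission with a negativity bias.
# keep_pos and keep_neg are hypothetical retention rates, NOT values
# from the Risk Analysis paper; they just encode the assumption that
# negative statements survive paraphrasing slightly better.

def transmit(pos: float, neg: float,
             keep_pos: float = 0.7, keep_neg: float = 0.9):
    """One node paraphrases the message, dropping positive statements
    somewhat more often than negative ones."""
    return pos * keep_pos, neg * keep_neg

def run_chain(pos: float, neg: float, length: int = 8):
    """Return the proportion of negative statements seen at each of
    the `length` nodes in a transmission chain."""
    proportions = []
    for _ in range(length):
        proportions.append(neg / (pos + neg))
        pos, neg = transmit(pos, neg)
    return proportions

# Start with a balanced message: equal positive and negative content.
props = run_chain(pos=10.0, neg=10.0)
```

Under these assumptions the negative proportion rises monotonically along the chain—starting at 0.5 and drifting upward with every retelling—mirroring the pattern the researchers observed in their coded messages.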
Not surprisingly, this effect was stronger for the “high-dread” topic of nuclear power. And as the information grew more negative, participants felt more threatened by it.
To see if the reintroduction of impartial, factual information into the discussion could stop this spiral, the researchers gave half the participants in the sixth position the original information, as well as the summary provided by the person just before them in the chain.
This infusion of facts was “ineffective in reducing bias,” the researchers report. Reading the original articles did decrease factual distortion, but it “did not influence the amount of negativity transmitted from one person to the next,” they write.
The researchers believe this reflects several psychological mechanisms that leave us vulnerable to misinformation. For obvious reasons, humans evolved to be alert to possible threats, leading us to focus more intently on negative news. Transmitting information appears to intensify this bias.
In addition, “Messages from peers are likely to have more weight, and hence greater influence, with regard to selective retransmission of information,” they write. That may have led participants toward the bottom of each chain “to disregard the positive side of the balanced information in favor of a confirmation of the negatively focused facts.”
In this era of social media, when people often get information from “friends” rather than credible news sources, the implications of these findings are clear and disturbing. Hills and Jagiello don’t offer any obvious solutions, but being aware of this negativity bias—and refraining from automatically forwarding unconfirmed information, even if it sets off alarm bells in your brain—would be a start.