The Unique Danger of Implied Misinformation

New research suggests it’s harder to dislodge vague implications than overtly stated errors.

Remember when, back in July, Planned Parenthood officials were caught on tape selling organs harvested from aborted fetuses? OK, the ambiguous “sting” video didn’t precisely show such a transaction, and sure, most mainstream news outlets avoided making that assertion overtly.

But the prominent play the video received certainly implied that something nefarious was transpiring. And, according to a troubling new study, that sort of journalism—in which key facts are uncertain, leaving readers or viewers to connect the dots—may inspire people to create their own inaccurate narratives, which are particularly resistant to later revision.

When it comes to news stories, “Misinformation that is ‘merely’ implied is more difficult to eradicate than misinformation that is stated directly,” conclude Kent State University psychologists Patrick Rich and Maria Zaragoza. Writing in the Journal of Experimental Psychology: Learning, Memory, and Cognition, the researchers describe a series of studies that demonstrate this phenomenon.

The first study featured 357 university undergraduates, all of whom read one of three versions of a fictional news report describing a jewelry theft at a private home while the owners were on vacation, and the subsequent police investigation. The information was presented in a series of 13 messages resembling news bulletins (or perhaps tweets), which were updated as new facts were uncovered.


One version of the story explicitly fingered the couple’s son, declaring that he “may have taken the [jewelry] box from the home.” Another left out that statement but implied the son was involved, noting that he was paying off “recent gambling debts.” The third version mentioned that the couple had a son, but did not implicate him in the robbery.

Approximately half of the participants who read each version also saw a correction added to the bottom of the story, noting that the son “had been called away on business” and could not have committed the crime.

After completing a 20-minute filler task, each participant completed a written questionnaire about the news story, answering nine factual questions (such as “What did the police notice about the bedroom window?”), and nine “inference questions” (including “How do you think the thief got into the locked drawer to steal the jewelry box?”).

The researchers were particularly interested in that second group of questions. Previous research has consistently shown that, when asked to make deductions, people often reach back into their memories and use the information they initially received about a given incident—even if that information was later discredited. Subsequent corrections reduce, but do not eliminate, this tendency.

But what happens if the misinformation was only implied? Surprisingly, the results suggest the initial, incorrect account sticks in the brain even more strongly.

As expected, the correction reduced participants’ reliance on the initial misinformation. But it had a stronger effect on those who were explicitly told the son was a suspect. Those who read the alternate account, which merely implied the son’s guilt, were more likely to use the initial, inaccurate account to draw inferences.

These results were replicated in a subsequent experiment, which found the effect persisted even when a stronger correction was provided—one that pointed to evidence implicating someone else in the crime. But why?

Rich and Zaragoza can’t say for certain, but they have a strong working hypothesis. They note that people who were not provided with explicit information “had to go beyond the evidence provided in the news story to infer, or self-generate, the likely cause of the outcome.”

This apparently resulted in a “richer, more elaborate story representation” than the one that formed in the minds of the people who read the explicit statement about the son’s possible guilt. It’s conceivable that, having constructed the narrative in their own minds, participants were more hesitant to give it up. Or perhaps it simply lingered longer in their memories.

With prudent caution (especially in the light of recent revelations of overstated conclusions in other psychology studies), the researchers note that their findings “will need to be replicated and extended” to find out if they can be generalized to all kinds of news items. Of course, other factors can and do influence our tendency to believe or disbelieve a story, including personal experience and political ideology.

Nevertheless, their findings are robust, and track nicely with Stephen Colbert’s concept of “truthiness”: the notion that, if an assertion feels right to us, we tend to accept it as valid. This study suggests the mental effort it takes to put two and two together may help create, or at least confirm, such gut-level beliefs.

Even when the answer we come up with is “five.”

Findings is a daily column by Pacific Standard staff writer Tom Jacobs, who scours the psychological-research journals to discover new insights into human behavior, ranging from the origins of our political beliefs to the cultivation of creativity.
