Chances are you heard some blatantly untrue statements during last night’s debate. It’s a cynical, manipulative strategy, but it works: Psychological studies have consistently shown that oft-repeated statements are more likely to be perceived as true, regardless of their actual veracity.
Since this “Illusory Truth Effect” was first noted in the late 1970s, it has been widely assumed that the ploy works only on people unfamiliar with the issue in question. Knowledge of the subject matter, the thinking went, would lead people to dismiss the lie and distrust the liar.
But a newly published study reports that’s not necessarily true: Even those of us with a solid grasp of the issue at hand are susceptible to this sort of misinformation.
“The results of two experiments suggest that people sometimes fail to bring their knowledge to bear” when evaluating a statement, a research team led by Vanderbilt University psychologist Lisa Fazio writes in the Journal of Experimental Psychology: General. Rather, we rely on “fluency”—the ease or difficulty of comprehending a piece of information.
Statements you’ve heard many times are easier to process, and this ease leads people “to the sometimes false conclusion that they are more truthful,” the researchers write. Their key—and disheartening—revelation is that they found examples of this unfortunate dynamic “even when participants knew better.”
The first of their experiments featured 40 Duke University undergraduates. Each was presented with a series of statements—some true, some false. Half referred to pieces of common knowledge, making falsehoods relatively easy to spot; the others featured more obscure facts (or “facts”), making the discrimination somewhat trickier.
Afterwards, participants were given a set of 176 statements—some of them repeats from that first phase—and asked to rate each on a scale of one (“definitely false”) to six (“definitely true”).
Finally, participants answered 176 multiple-choice questions that assessed their actual knowledge of the statements they had rated. For instance, they were asked “What is the largest ocean on Earth?” and given three options: Pacific (the correct answer), Atlantic (the false answer they had previously been exposed to), or “don’t know.”
The researchers found that repeated falsehoods were more likely to be accepted as accurate, “regardless of whether stored knowledge could have been used to detect a contradiction.” To put it more bluntly: “Repetition increased perceived truthfulness, even for contradictions of well-known facts.”
“Reading a statement like ‘A sari is the name of the short pleated skirt worn by Scots’ increased participants’ later belief that it was true,” Fazio and her colleagues write, “even if they could correctly answer the question ‘What is the name of the short pleated skirt worn by Scots?’”
That finding kilt me.
The second experiment was a simplified version of the first. This time, the participants (40 different Duke undergrads) judged statements as “true” or “untrue,” rather than using a six-point scale. The researchers found that the illusory-truth effect once again emerged, even for common-knowledge items, where falsehoods should have been easy to detect.
So before you declare that Obamacare is a disaster (it’s actually working well) or that vaccines cause cancer (they don’t), consider how that notion got into your head, and whether it aligns with what you actually know. If there’s any doubt, check it out with an actual expert.
Repeating a falsehood won’t make it true, but it may make you think it is true.
Findings is a daily column by Pacific Standard staff writer Tom Jacobs, who scours the psychological-research journals to discover new insights into human behavior, ranging from the origins of our political beliefs to the cultivation of creativity.