Fake News From Facebook Has Had Very Little Impact on Voters' Beliefs

New research finds that social media is not the main driver of why so many people fall for misinformation about candidates and issues.
A photo taken on May 16th, 2012, shows a computer screen displaying the logo of social networking site Facebook.

As the 2020 election approaches, we're still letting go of some erroneous early explanations of the 2016 results. The notion that Donald Trump's support was disproportionately driven by the white working class has been definitively debunked. And the belief that his victory can be attributed to fake news spread on Facebook also appears to be highly dubious.

The latest research to limit Mark Zuckerberg's liability has just been published in the online journal PLOS ONE. R. Kelly Garrett of the Ohio State University concludes that, "during the 2012 and 2016 presidential elections, social media contributed relatively little to Americans' willingness to endorse political falsehoods."

"There is simply no evidence that social media are having a powerful and consistent influence on citizens' belief accuracy," he writes.

Garrett analyzed data from panel surveys conducted during the 2012 and 2016 elections. In both years, "a large, representative, general population sample of Americans responded to the same set of survey questions at three points during the election cycle." They were specifically asked about social media use and their belief in several widely circulated untruths.

The 2012 survey focused on falsehoods about the two major-party candidates, including statements such as, "Barack Obama is a Muslim," and, "As Governor of Massachusetts, Mitt Romney signed a health-care law providing taxpayer-funded abortions." The 2016 survey focused on issues, noting what percentage of people endorsed statements such as, "Most Muslims support violence against Western countries," and, "Human activity has no influence on climate change."

The results: Heavier use of social media "was associated with a slight increase in the likelihood of endorsing falsehoods against President Obama in 2012, but it had no effect on beliefs about the Republican candidate," Garrett reports. "In 2016, social media use had no measurable aggregate influence on issue beliefs."

Beyond that null result, the study offers some good news for Facebook. Among the heaviest users of social media for political news, those who used Facebook were more likely to answer the questions accurately than those who only used other social media, such as Twitter.

"The magnitude of this effect is small," Garrett writes, "but it does call into question the presumption that Facebook had a uniquely harmful influence" in those elections, at least in terms of users accurately understanding the policy issues.

The results suggest that the influence of social media on what we choose to believe is "quite modest," Garrett concludes. But he cautions that "the fact that these effects are small does not mean that they are unimportant. When election margins are small, even small differences can be decisive. Furthermore, it is likely that there are subsets of the population for whom these effects are larger."

"For example, traffic to 'fake news' sites in 2016 was driven in large part by individuals whose overall news diets were highly conservative, making stronger effects among this group more likely," Garrett writes.

Nevertheless, these findings suggest that the rise of social media does not fully explain "how political misperceptions have come to be a hallmark of the contemporary political environment." In Garrett's view, it's far too easy to blame Facebook for such disturbing trends as "the willingness of millions of Americans to conclude that science is corrupt and economic data and financial models are untrustworthy."

The problem goes deeper than our news feeds.