Consensus Is No Match for False Balance - Pacific Standard


Quoting both sides leads people to think there's controversy, even when they're told there's consensus.
(Photo: Joanna Bourne/Flickr)


In their tireless quest for balance, journalists have a tendency to report all sides of any given issue. But balance can be a dangerous thing if it doesn't reflect the weight of evidence supporting the various parties in, say, the political debate over climate change. Concerns about false balance recently led the BBC and other news agencies to report more on the support different viewpoints actually enjoy, but new experiments show that's probably not enough: Simply presenting differing points of view convinces people there's controversy, even when there is none.

The issue is not how false balance affects readers', listeners', or viewers' beliefs about an issue, but rather how it affects their beliefs about what experts believe. "[F]alse balance can exert a distorting influence on perceptions of what the experts think ... even when precise numerical information is presented regarding the full distribution of expert opinion on the topic," University of Waterloo professor of psychology Derek Koehler writes in an email.

In other words, you can tell people that 97 percent of climate scientists agree climate change is real and that humans caused it, but if you also quote a climate change denier, readers will still think there's a controversy.


False balance isn't limited to climate change, of course. In his experiments, Koehler actually studied false balance in movie reviews and economic issues. In the latter case, he chose four questions asked of the IGM Economic Experts Panel. On two of those—about carbon taxes and surge pricing for taxis—the experts largely agreed, while on two others—concerning the effects that information technology and a federal minimum wage could have on jobs—they were split fairly evenly.

Koehler presented the exact numbers to 393 people online. About half of those people also read brief statements from economic experts on both sides of each of the four issues, which for carbon taxes and surge pricing mimicked false balance. Then, everyone took a short poll designed to see whether they thought there was an expert consensus.

For the two genuinely controversial subjects, presenting balanced viewpoints had no effect—people correctly believed there was nothing like an expert consensus on information technology and the minimum wage, whether they'd seen statistics alone or with example viewpoints.

On the uncontroversial issues, however, false balance made a substantial impact. Those who read expert opinions were about one-third less likely to believe there was consensus on the issues compared with those who saw only how many experts came down on each side, and they were slightly less likely to believe there was enough agreement to guide real-world policy decisions—even though they'd seen the numbers and should have known there actually was widespread agreement.

"[W]eight-of-evidence reporting has been suggested as a remedy for potential distorting influences of false balance," yet false balance still had an observable effect on people's beliefs, Koehler writes. Perhaps, he adds, more vivid, concrete depictions of the weight of evidence would stand a greater chance against the power of false balance.


Quick Studies is an award-winning series that sheds light on new research and discoveries that change the way we look at the world.