How Does the Media Choose Which Election Polls to Cover?

They opt for the ones that suggest big changes in the horse race, according to new research.

By Nathan Collins


(Photo: Andrew Renneisen/Getty Images)

This being election season in the United States, we are absolutely inundated with polling data, with everyone from Gallup to Rasmussen to FiveThirtyEight providing daily (or even more frequent) updates on the state of the Hillary Clinton-Donald Trump horse race. But not all polls are created equal, it seems: According to a new study, television news coverage favors the polls with the most dramatic results.

“Given all the discussion on how the media covers polls … the results from this piece suggest the media potentially wields a lot of power in that they pick and choose what polls get aired — and we show they tend to pick polls with extreme results,” Kathleen Searles, an assistant professor of political science at Louisiana State University and lead author of the new paper, writes in an email.

“Moreover, the way media cover polls that make it on air differs from the actual poll results,” she writes. “This suggests that not only are the gatekeepers selective in the polls they cover, but their coverage of their polls is distorted.”

Those results are based on polls that ABC, CBS, NBC, CNN, MSNBC, and Fox News covered between June 4, 2008—the date Clinton left the 2008 Democratic primary race—and November 8, 2008. Searles, Martha Humphries Ginn, and Jonathan Nickens compared that coverage with much more inclusive polling databases from PollingReport and Pollster, with an eye toward how polls that got coverage differ from those that didn’t.

The main thing that drives coverage, the team found, is change: Polls that showed bigger shifts in the Barack Obama-John McCain margin were more likely to receive media attention. That's a problem, the authors write, because it means the most heavily covered polls may bear little resemblance to reality. Indeed, using the RealClearPolitics poll average as a baseline, the researchers found that the story told by heavily covered polls differs sharply from the one told when all polls are included. The share of swing voters in the most frequently covered polls, for example, runs much higher than the polling average suggests, perhaps reflecting the considerable statistical noise in individual poll results.
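The logic here is a classic selection effect, and a toy simulation makes it concrete. The sketch below is purely illustrative and is not drawn from the study's data or methods: it assumes a hypothetical race with a stable true margin, generates noisy polls around it, and has a "change-chasing" gatekeeper cover only the polls that moved most since the previous one. The covered subset ends up more extreme (more spread out) than the full set of polls, even though the underlying race never changed.

```python
import random

random.seed(1)

TRUE_MARGIN = 4.0   # hypothetical true margin between the candidates, in points
NOISE_SD = 3.0      # assumed sampling error of an individual poll

# Simulate a season of polls around a perfectly stable true margin.
polls = [random.gauss(TRUE_MARGIN, NOISE_SD) for _ in range(200)]

# A "change-chasing" gatekeeper covers only the polls that moved the most
# relative to the immediately preceding poll (here, roughly the top fifth).
changes = [abs(polls[i] - polls[i - 1]) for i in range(1, len(polls))]
threshold = sorted(changes, reverse=True)[len(changes) // 5]
covered = [polls[i] for i in range(1, len(polls))
           if abs(polls[i] - polls[i - 1]) >= threshold]

def mean_and_spread(xs):
    """Return the mean and standard deviation of a list of margins."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return m, sd

avg_all, spread_all = mean_and_spread(polls)
avg_cov, spread_cov = mean_and_spread(covered)

print(f"all polls:     mean {avg_all:.1f}, spread {spread_all:.1f}")
print(f"covered polls: mean {avg_cov:.1f}, spread {spread_cov:.1f}")
```

Because a large jump from the previous poll usually means the new poll is an outlier, the covered subset is systematically noisier than the full population of polls, which is the distortion the article describes.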

Despite the results’ timeliness, Searles writes, the researchers’ original aim was not simply to explore how the media covers polls, but rather how the media might distort all sorts of coverage. “We started looking at this topic because there is a literature in political communication on news values that suggests that the gatekeepers (editors, reporters) privilege some stories over others based on timeliness, human interest, novelty, scandal,” and so on, Searles writes.

The problem is, it’s very hard to figure out what stories news outlets didn’t cover. “Polls present us a unique opportunity to observe what gets covered and what doesn’t,” Searles adds, “and perhaps even more importantly, give us a glimpse into the kind of polling stories gatekeepers privilege (and the distortion that results).”