Are Swing Voters Real?

If a new study is indeed correct, there certainly aren’t many of them—and politicians should stop spending so much to try to persuade them.

By Nathan Collins


(Photo: Alex Wong/Getty Images)

Mitt Romney got roughly a five-point bump in the polls following his first debate with Barack Obama in 2012, an increase most politicos attributed to swing voters being disappointed with Obama’s phoned-in performance. But there’s a different explanation, according to a new study: Polls shifted not because opinions changed, but because pollsters asked different people for their opinions before and after the debate.

That’s an important distinction not only for our understanding of politics, Andrew Gelman, Sharad Goel, Douglas Rivers, and David Rothschild argue in the Quarterly Journal of Political Science, but also for politicians, who spend billions of dollars trying to persuade swing voters who may, in fact, not exist.

Yet the discovery was largely an accident, Gelman writes in an email. Goel and Rothschild had been studying data from a 2012 Web-based survey of 83,283 people; on any given day, about 7,500 responded to survey questions, so they had a lot of data. Meanwhile, Gelman had been working on figuring out how to adjust polling results for differences between the general population and the people actually polled. “When we did all this, David and Sharad noticed this stunning result that, after adjustment, support for [Romney and Obama] was very stable over time,” Gelman writes.
That’s in sharp contrast to what most polls at the time showed. In September, prior to the first debate, a Pew Research poll had Obama up 51–42, with the rest undecided. By October, the race was tied at 46–46.

How could that be? Most organizations conduct their polls by calling up 1,000 people at random and asking whom they’d vote for. After a week goes by, the organizations do it again, but with a different set of 1,000 people — and that’s crucial.

Imagine the country is evenly split between Romney and Obama, and no one ever changes their mind. The chance that a given poll of 1,000 people randomly reaches exactly 500 Romney and 500 Obama supporters turns out to be quite small, only about 2.5 percent, and the chance of drawing equal numbers of Romney and Obama supporters twice in a row is smaller still. Meanwhile, the odds of reaching 510 Obama and 490 Romney supporters are a comparatively high 2.1 percent (and the same again for the reverse). In other words, polls can jump a few percentage points from one round to the next, even if no one actually changes their mind.
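The arithmetic behind these figures is just the binomial distribution. A quick sketch in Python (the sample size of 1,000 and the 50/50 split are the assumptions from the example above):

```python
from math import comb

def prob_exact(k, n=1000, p=0.5):
    """Probability that exactly k of n randomly sampled voters
    support one candidate, given a true 50/50 split."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Chance a poll reaches exactly 500 supporters of each candidate
print(round(prob_exact(500), 3))  # 0.025, i.e. about 2.5 percent

# Chance it reaches 510 Obama and 490 Romney supporters
print(round(prob_exact(510), 3))  # 0.021, about 2.1 percent
```

The point is that even the single most likely outcome, a perfect 500–500 split, occurs in only about one poll in 40; small apparent swings are simply what random sampling looks like.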

But is this all that happened after the first 2012 debate? Two additional data sets, alongside Rothschild and Goel’s analysis, suggest so. The first comes from the Pew survey, which asked whom participants had voted for in 2008. In September, 47 percent said they’d voted for Obama, while in October only 42 percent did so—a hint that Pew somehow contacted more conservatives in the October poll, thereby biasing the results.

Stronger evidence comes from the RAND Corporation’s American Life Panel, which surveys the same 3,666 people every week. Tellingly, Obama polled at 51 to 54 percent from September through the election; there was a drop after the first debate, but it was small compared with the swing in Pew’s results.