Questioning Questions in Evaluating Polls

How you ask, what you ask and when you ask can all affect what you get in conducting polls.

Recently a conservative organization's solicitation letter (aka junk mail) arrived in my mailbox pleading for funds to "clean up television." I strongly agree that there's much on TV that I would like to see changed, but my list would primarily be to eliminate stupid reality shows and idiotic cable news commentators interrupting each other in shouting matches.

This letter had something else in mind. Attached to the donation card was an "official poll" asking several questions including: "Are you in favor of television programs which major in gratuitous violence such as murder, rape, beatings, etc.?" and "Do you favor the showing of obnoxious and edited R-rated movies on network television?"

I don't think that the folks who would say "yes" to gratuitous murder (are other kinds of murder OK, then?) or favor obnoxious movies on TV are really the target group for the donation. And I also don't believe they were trying to recruit film buffs who abhor the editing of movies by anyone other than the film director. The solicitation "survey" leads recipients to check "no" for the answers, but to say "yes" to sending a check to correct these media problems.

What you see here is a technique that orients and biases questions in specific directions through the use of loaded words and leading phrases. Learning to critically evaluate survey and public opinion poll questions is an important skeptical skill to use when faced with a daily dose of possibly deceptive data, even from professional surveys trying to present honest results.

Let's look at a couple of real examples from some official surveys. The Pew Research Center for the People & the Press is a well-respected national nonpartisan public opinion research organization focused on policy issues and the media. They have done extensive work on designing surveys and illustrate the impact wording can have on responses.

SKEPTIC'S CAFÉ: Peter Nardi discusses how to use our critical skills to avoid scams, respond to rumors and debunk questionable research.

One example is from their own January 2003 survey asking respondents whether they would "favor or oppose taking military action in Iraq to end Saddam Hussein's rule." When worded that way, 68 percent said they favored military action and 25 percent said they opposed it. However, when the question was written as: "favor or oppose taking military action in Iraq to end Saddam Hussein's rule even if it meant that U.S. forces might suffer thousands of casualties," a major reversal of opinion occurred. Only 43 percent said they favored military action and 48 percent said they opposed it.

Consider also these curious findings from a February 2010 CBS News/New York Times poll. When people were asked if they "favored or opposed gay men and lesbians serving in the military," 51 percent responded "strongly favor" and 19 percent "somewhat favor," for a total of 70 percent approval. Yet, when the wording was changed to "favored or opposed homosexuals serving in the military," 34 percent answered "strongly favor" and 25 percent "somewhat favor," for a 59 percent approval rate.

Although a majority of Americans were supportive regardless of the wording, note how using "gay men and lesbians" instead of "homosexuals" created a more positive outcome. Perhaps the H-word highlights the sexual too much for many people's comfort.

Another less obvious technique for leading respondents toward intended answers makes creative use of the order of poll questions. If one were to ask residents, for example, about their opinion of the effectiveness of the local mayor after first inquiring about their views on several problems facing the city (such as budget problems, potholes in the streets, crime), a different outcome would likely occur compared to asking them how their mayor is doing right at the start of the survey.

Such an explanation is what Nate Silver believes accounts for the major discrepancies between Fox News polls and other non-Fox surveys focused on the health care reform bill.

For a reasonably worded question, "Based on what you know about the health care reform legislation being considered right now, do you favor or oppose the plan?", 53 percent of those interviewed stated they were opposed to the plan. On average, Fox's numbers across several surveys showed a 14 percentage point gap between those who favored the legislation and those who opposed it, compared with only a 2 percentage point gap in non-Fox polls. One reason may be the placement of the health care items after a set of questions that included: "Do you think President Obama apologizes too much to the rest of the world for past U.S. policies?" and "Do you think the size of the national debt is so large it is hurting the future of the country?"

This is a wonderful case of a properly worded question on the health care plan following a set of leading and loaded questions that likely created a negative context for President Obama's policies. When the findings were announced, no mention was made of these other items or of the health care item's placement after them.

So in addition to assessing the phrasing of survey items, investigate the context in which questions appear. I'm sure you agree with me on the importance of questioning questionnaire design and its impact on survey results. Select one: a) Strongly Agree or b) Somewhat Agree.