FDU Magazine Online - Summer/Fall 2007
   
Taking the Pulse of the Nation, by Rebecca Maxon
Polling numbers are really not precise conclusions, they are scientific estimates, and all scientific estimates should be compared against one another …
— Peter Woolley


The Caveats

Just how do we know which polls can be trusted?

First and foremost, a reputable pollster makes every effort to ensure that the people surveyed, known as the sample, are chosen at random. This is typically done using automated telephone dialing on a computer system.
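To make the idea concrete, here is a minimal sketch, in Python, of the random-digit dialing approach that such computer systems are generally built around; the area code and digit ranges are illustrative only, and the article does not describe any particular system.

    import random

    def random_phone_number(area_code="201"):
        """Generate one random number within an area code, so listed and
        unlisted households have an equal chance of being called."""
        exchange = random.randint(200, 999)   # skip invalid 0xx and 1xx exchanges
        line = random.randint(0, 9999)
        return f"{area_code}-{exchange:03d}-{line:04d}"

    # Draw a small random sample of numbers to dial.
    print([random_phone_number() for _ in range(5)])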

Good polling reports contain full disclosure of methodology — including the sample size, dates of interviewing and statistical margins of error.

The organization releasing the poll, as well as the source reporting it, should have a “track record” of accurate reporting. Journalists are responsible for screening out bad research based on the methodology presented to them.

A good poll is conducted by a polling organization with no vested interest in the results, such as an independent research contractor.

Even so, there is an unavoidable inaccuracy called sampling error, or the statistical margin of error. This figure, which depends on the number of people surveyed, is expressed as plus or minus “x” percent. A survey with a margin of error greater than 3–4 percent usually is not reported. Sampling error rarely creates a problem, however. In fact, the National Council on Public Polls says that polling error in U.S. presidential election polls conducted prior to Election Day has continually declined over the past 50 years.
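For readers who want the arithmetic, a common back-of-the-envelope formula for the margin of error of a simple random sample, at the 95 percent confidence level, is plus or minus 1.96 times the square root of p(1 − p)/n. The short Python sketch below, which is illustrative rather than drawn from the article, shows that a poll of 1,000 respondents carries a margin of error of roughly plus or minus 3.1 percentage points.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Approximate 95% margin of error for a simple random sample of size n.
        p = 0.5 is the worst case, so it is the figure pollsters usually report."""
        return z * math.sqrt(p * (1 - p) / n)

    # A poll of 1,000 respondents: about +/- 3.1 percentage points.
    print(round(margin_of_error(1000) * 100, 1))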

“From time to time,” says Mrozinski, “you will see polls and surveys that do not follow appropriate ‘standards’ and may tend to introduce a bias into the results in order to enforce whatever the pollsters are trying to reinforce or refute.” These surveys may manipulate sample selection, question wording or question ordering to get a desired result.

Political polling has been done for so long that the structure of the questions and the answer scales is usually based on a proven standard, and the same questions are used from poll to poll and year to year. Many surveys rotate the order of the questions, particularly the order in which they name political candidates, to lessen the effect of question order on results.
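As a simple illustration of rotation, each respondent can be started at a different point in the candidate list so that no single name always comes first. This is only a sketch with made-up names, not PublicMind’s actual procedure.

    CANDIDATES = ["Candidate A", "Candidate B", "Candidate C"]  # hypothetical names

    def candidate_order(respondent_id):
        """Rotate the starting candidate for each respondent, spreading any
        name-order effect evenly across the sample."""
        offset = respondent_id % len(CANDIDATES)
        return CANDIDATES[offset:] + CANDIDATES[:offset]

    # Respondents 0, 1 and 2 hear the names in three different rotations.
    for rid in range(3):
        print(rid, candidate_order(rid))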

On the other hand, sometimes question order is specifically designed to test a particular theory about the influence of one factor on another. For example, a PublicMind poll conducted in March showed that asking survey participants about either race or gender before asking them to choose their favorite for the Democratic nomination led to a significant decrease in support for Sen. Clinton. And the beneficiary of Clinton’s losses appeared to be former North Carolina Sen. John Edwards, whose support increased by about 7 percent when a question regarding race or gender was asked prior to asking whom respondents supported for the nomination.

“People aren’t willing to admit that they take a candidate’s race or gender into account when deciding who to vote for — but they do,” said FDU political scientist and PublicMind survey analyst Dan Cassino. “These results suggest that the introduction of race and gender into the campaign may move some voters away from Hillary to a white man they think can win: Edwards.”

When in Doubt …

Woolley recommends comparing the results of similar research conducted by differing sources or media. “The major news corporations … do have slightly different results and there are variations, of course, among The New York Times, The Washington Post, CNN and The Wall Street Journal.

“Polling numbers are really not precise conclusions,” he emphasizes, “they are scientific estimates, and all scientific estimates should be compared against one another, particularly when polls are close.”


©Copyright 2007 Fairleigh Dickinson University. All rights reserved.

For a print copy of FDU Magazine, featuring this and other stories, contact Rebecca Maxon, editor, 201-692-7024 or maxon@fdu.edu.