
Political analysts make sense of polling

CHARLESTON, W.Va. -- When it comes right down to it, running for elected office is a business decision.

It requires a huge investment of time and money, not to mention untold amounts of heartbreak and hurt feelings. Double that if it's a big race, or your opponent is particularly nasty.

Worse, all those investments could be worthless if 51 percent of people like the other candidate better than you. As in business, there are few guarantees in politics.

So politicians try to find advantages. They want to spot trends and predict what their "customers" want in order to get a leg up on the competition.

That's where the pollsters come in.

Forget about the horse race poll numbers you read about in election season, where a candidate might be ahead 10 points today and six points tomorrow. There's more to political polling than that. 

Much of the information collected in polls is never released to the public. Instead, it is used to help candidates find more information about their constituents, determine what issues they care about and figure out what it would take to earn their votes.

And although reporters like to classify pollsters by party, Mark Mellman of the national polling firm The Mellman Group says there is little difference between Democrats and Republicans when it comes to poll results.

"The difference between Democrat and Republican (pollsters) should not be in the science, but in the ways we use the information," Mellman, who typically works for Democrat candidates, said. "If everybody's doing things right, the numbers should be the same."

Mark Blankenship, who runs the Charleston-based Republican polling firm Mark Blankenship Enterprises, said he agrees.

"You don't get paid to be a Republican or a Democrat. You get paid to give good, reliable information," he said.

Preparing the poll

The accuracy of a poll is largely determined by its "sample," the people who are interviewed to collect information.

A poll's accuracy suffers if the sample contains the wrong mix of people. Size alone has little to do with it: even a poll that surveys 10,000 people could be wildly inaccurate if the wrong people are included.

For example, Blankenship said, a poll of all U.S. households will run about 20 percent more in favor of Democrats.

But if you limit that sample to registered voters, the Democrats' advantage drops by about 4 percent. If you include only likely voters, the Democratic advantage drops to 1 percent.

"At the end of the day, the sample group you have has to balance pretty closely to who you think are going to vote," said Republican political consultant Rob Cornelius.

If the polling is done correctly, Blankenship said, 600 responses should give an accurate representation of how the voting public feels.
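That 600-response figure lines up with a standard statistical result: for a simple random sample, 600 responses yield a margin of error of roughly plus or minus 4 percentage points at 95 percent confidence. A short sketch of the standard formula (not anything from Blankenship's firm; the worst-case assumption p = 0.5 is conventional):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p = 0.5 is the worst case (largest variance), so this is the
    figure pollsters usually report. z = 1.96 is the 95% z-score.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A 600-person sample: roughly a +/- 4-point margin of error.
print(round(margin_of_error(600) * 100, 1))
```

Note that shrinking the margin much further gets expensive fast: cutting it in half requires quadrupling the sample size, which is why 600 to 1,000 respondents is a common sweet spot.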

In addition to surveying the correct people, it also is important to make sure your questions are asked correctly.

Polling firms work hard to come up with questions that don't subconsciously bias poll participants.

Mellman said pollsters typically avoid agree/disagree questions. That's because, for some reason, people are statistically more likely to choose "agree" than "disagree," no matter the topic.

It's also important to watch for words like "prohibit" and "allow."

Mellman said if pollsters asked voters whether Communists should be allowed to teach in public schools, most voters would probably say "no."

But many of those same people also would say "no" if pollsters asked whether Communists should be prohibited from teaching in public schools.

"The language creates a connotation," Mellman said.

Pollsters often include several questions covering the same topic but phrased in multiple ways, to try to escape wording biases.

"You try and hit it from a number of angles, knowing any one question can be a misleading indicator," Mellman said.

It's also important to ask questions in the correct order.

Pollsters always put "sample ballot" questions -- the ones that begin with "If the election were held tomorrow ..." -- before any message-testing questions, where you ask for participants' opinions of candidates and their campaign strategies.

That's because message testing sometimes includes negative statements about candidates or their opponents, and pollsters don't want that information to pollute voters' initial reactions.

Choosing the method

There are several ways to conduct a poll, but most polling firms prefer human operators to Internet and computer-automated polling.

Blankenship said people are generally more truthful when they're talking to another person. Robo-calls also cannot be certain whether they're talking to Mr. Smith or Mrs. Smith (or the family pet, if Fido happens to know how to use the phone).

Mellman said Internet and robo-polls are insufficient for political polling because they do not produce truly random samples of the electorate.

Internet polls are limited to people with Internet access, and federal do-not-call rules prevent robo-polls from calling cellphones. By systematically excluding those people, the poll no longer is using a random sample.

"You're taking 30 to 50 percent of the electorate (out). That's a violation of the fundamental principle of randomness," Mellman said.

Blankenship said auto-dialed polls and Internet polls can be useful in some instances, like collecting voters' general thoughts about candidates or surveying customers of a particular business. But for political polling, he sticks with live operators.

Conducting the poll

Once pollsters have determined their sample, crafted their questions and picked a polling method, it's time to begin calling people.

Blankenship's firm outsources this part of the process, using a telephone bank with 250 call stations capable of conducting hundreds of interviews per night.

The surveys are conducted by real people, but the phone numbers are dialed by a computer-assisted telephone interviewing system, where a computer selects phone numbers at random from a giant database containing tens of thousands of numbers.

This is the gold standard for polling, Blankenship said, because it ensures everyone in the electorate has the same probability of being interviewed.
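The "equal probability" property Blankenship describes is what statisticians call a simple random sample: drawing without replacement from the full database so no number is favored. A minimal sketch, using a hypothetical phone-number database (the numbers and database layout here are invented for illustration):

```python
import random

# Hypothetical database of voter phone numbers (illustrative only).
database = [f"304-555-{i:04d}" for i in range(10_000)]

# random.sample draws without replacement, giving every number in
# the database the same chance of selection -- a simple random sample.
random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(database, k=600)

print(len(sample), "numbers drawn, all unique:", len(set(sample)) == 600)
```

The point of the fixed procedure is exactly what Blankenship says next: no human gets to "pick and choose" who ends up in the sample.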

"It's not something where two people in their kitchen can pick and choose and conduct a telephone interview," he said.

Polling firms buy their databases from companies that collect numbers using telephone surveys.

Those surveys also allow database companies to break numbers down into specific demographics. Want the phone numbers of potential voters age 18 to 65? They can do that. How about homeowners with dogs? Chances are, there's a database out there for that.

The increasing use of cellphones has made things a little more difficult for pollsters. Because federal do-not-call laws prevent automated calling software from dialing cellphones, the computer must randomly select a number from its database and display it on a screen, and the operator must dial it by hand.

That means cellphone surveys take longer and, as a result, cost more money.

"It's made our product more costly, but it's also made it more representative and helped reduce what was becoming a very noticeable degree of coverage error," Blankenship said.

The timing of the telephone polls is important, as well.

Pollsters don't want to conduct surveys on Wednesday night, because lots of people attend midweek church services. Saturdays and Sundays are bad, too. Pollsters also try to avoid calls during daytime hours, because many people are at work.

"All of these things can affect your reliability and accuracy of your survey," Blankenship said.

That's why most telephone polls occur during weekday evenings.

Making sense of the numbers

Call centers return poll results to polling firms in big computer files, filled with lots of unorganized numbers.

"To the naked eye, it would look like hundreds of thousands of numbers in what would look like a disorganized Excel spreadsheet," Blankenship said.

His firm runs this file through a program that separates all the questions and answers, turning the raw data into "cross tabs" outlining every finding of the survey.
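A cross tab is just a count of answers broken out by demographic slice. A toy sketch of the idea, using made-up responses and Python's standard library (Blankenship's actual software is proprietary; this only illustrates the concept):

```python
from collections import Counter

# Hypothetical raw survey rows: (age group, ballot-question answer).
responses = [
    ("18-34", "Candidate A"), ("18-34", "Candidate B"),
    ("35-64", "Candidate A"), ("35-64", "Candidate A"),
    ("65+",   "Candidate B"), ("65+",   "Candidate B"),
]

# Counting (group, answer) pairs yields one cell of the cross tab each.
crosstab = Counter(responses)
for (group, answer), count in sorted(crosstab.items()):
    print(f"{group:>5}  {answer}: {count}")
```

Multiply that by every question and every demographic cut in the survey, and it is easy to see how the printed reports run to hundreds of pages.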

These reports can run hundreds of pages, but on their own they provide very little help to a polling firm's clients. So, before presenting the findings, an analyst sits down with the giant stack of paper and reviews each line of data.

It's about finding a story in the numbers. How are voters responding to the candidate's new ads? How are women voters responding? How about older voters, who traditionally vote Democrat?

Analysis lets pollsters find out.

Once the report is finished, it's time to go back to the client.

"Sometimes they're not the most pleasant conversations. A lot of times, pollsters deliver bad news," Blankenship said. "You don't get paid to sugar coat. If it's not going the right way, it's tough medicine to swallow, but they are asking you to tell them the truth."

Blankenship said poll results are important to candidates, even if they aren't good. Bad news allows candidates to reassess their tactics and make changes where necessary.

"If it tells you something bad, you don't have to release it," Cornelius said.

The "horse race" aspect of polling isn't that important to candidates, anyway. Cornelius said it's more important to figure out what voters care about and what they don't like about your opponent.

"It might not move them to you, but it might move them to undecided. And that's a start," Cornelius said.

Contact writer Zack Harold at 304-348-7939.

