How do we assess the wide variety of poll results we’ve seen during this election? We asked our resident polling expert at ±«Óătv, Peter Butler, to give us some tips.
Dr. Butler ran the Atlantic office of Decima (now Decima Harris) under Allan Gregg for more than a dozen years. He also worked for Western Opinion Research, a research firm out of Winnipeg, and RDI, a local market research firm (now defunct). He’s been a consultant for provincial and federal governments, and has done issues research and political polls for the Conservative Party of Canada. He is still associated with The Strategic Counsel, a public issues-oriented research firm in Toronto, and is currently professor emeritus of Sociology and Public Administration.
He also wrote one of only two textbooks about public opinion polling in Canada, entitled Polling and Public Opinion: A Canadian Perspective.
1) Question the randomness and design.
The trustworthiness of a poll comes down to three things: the sample design, the quality of the measurements employed, and confidence in the data collection process.
The first thing Dr. Butler thinks about is what kind of sample was taken. The irony of this, Dr. Butler admits, is that commercial pollsters (himself included) are notoriously secretive about their methods.
“I even say to students, I’m not telling you how I do it, because that’s how I make my money! It’s very much a closely guarded secret,” he says.
That said, without knowing exactly how they do it, he trusts Nanos, Decima Harris, Angus Reid and other big-name pollsters because they can afford expensive polling equipment like predictive dialers, which generate phone numbers and manage the randomness of samples.
“A CATI (Computer Assisted Telephone Interview) system and a predictive dialer sorts through the supervision of the data collection very effectively,” he says.
2) Ask how the questions were asked.
“What you want to know is not just the opinion,” says Dr. Butler, “it’s how intensely held the opinion is.”
Find out if a poll result came from a single question or if it represents a summation of several questions. For instance, instead of asking “Who will you vote for?”, Dr. Butler normally uses five questions to determine a respondent’s vote choice.
“Asking one question is not enough for me. I group my questions. And there are a number of statistical techniques that are done so that a batch of questions is asked to reflect an issue.”
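Dr. Butler keeps his exact technique secret, so as a rough illustration only, grouping questions might look something like the sketch below: several related questions combined into one index instead of relying on a single answer. The questions, scale, and unweighted averaging are all invented for illustration; a real firm’s battery and statistical treatment would differ.

```python
# Hypothetical sketch: one respondent's answers to a battery of related
# questions, each on an invented 1-5 scale. None of these question names
# come from Dr. Butler's actual (proprietary) questionnaire.
responses = {
    "leader_approval": 4,
    "party_competence": 3,
    "likelihood_to_vote": 5,
    "voted_same_party_before": 4,
    "issue_alignment": 2,
}

# A simple unweighted mean of the battery; real pollsters apply
# closely guarded statistical techniques instead.
index = sum(responses.values()) / len(responses)
print(f"Vote-intention index: {index:.2f} out of 5")
```

The point of the sketch is only that five answers feed one reported number, which is why two polls asking “the same thing” can differ.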
He also likes to know when a question was asked. Did they ask it at the beginning or the end of the questionnaire? Placement influences responses, since respondents usually try to answer consistently over the course of the interview.
3) Check if the poll was a 'one-off' or part of a series of polls.
Polls can be conducted over a period of days or they can be a 'one-off'. A poll should explain this part of its methodology from the outset; a pollster who doesn’t reveal it is sinning by omission.
“Are they telling you the nightly results (of a good night) or do they roll up results at the end of the week?” he asks. “We just collected 500 cases last night and this is what it shows. But the night before it didn’t show up.”
Dr. Butler believes polls become more accurate over time. He likes seeing how polls track responses night after night.
“Nik Nanos says, and I think he’s right, that the repeated numbers of polls that are done give you a closer and closer perspective on the accuracy of the opinions being reported.”
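The roll-up Dr. Butler describes can be sketched as a simple moving average over nightly samples. The nightly figures below are invented, and real tracking polls also reweight each night’s sample; this only shows how rolling smooths out one good (or bad) night.

```python
# Invented nightly support numbers (%), one entry per night of calling.
nightly_support = [34.0, 38.0, 33.0, 36.0, 35.0]

def rolling_average(values, window=3):
    """Average the most recent `window` nights available at each reporting day."""
    out = []
    for i in range(len(values)):
        recent = values[max(0, i - window + 1): i + 1]
        out.append(sum(recent) / len(recent))
    return out

for night, avg in enumerate(rolling_average(nightly_support), start=1):
    print(f"Night {night}: rolled-up support {avg:.1f}%")
```

Night two’s one-night spike to 38% barely moves the rolled-up figure, which is the “closer and closer perspective” repeated polling buys.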
4) See error beyond the margin of error.
Margin of error tells you how accurate your sample is, which is mainly a function of its size, but there are many other sources of error in a poll.
“People are not asking this question enough: what is the refusal rate on your survey? You made how many calls so you could get 500 people? Oh, 3,000. Wow. Polling companies don’t want to talk about it,” says Dr. Butler.
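Both numbers Dr. Butler mentions are easy to compute, and the same margin-of-error formula also explains why small regional subsamples (point 5, below) are less precise. A quick sketch using the article’s figures of 500 completes from 3,000 calls, plus the standard 95%-confidence formula for a simple random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval, in percentage points,
    at the worst case p = 0.5 (the usual headline figure)."""
    return z * math.sqrt(p * (1 - p) / n) * 100

completes, calls = 500, 3000
print(f"Margin of error (n={completes}): +/- {margin_of_error(completes):.1f} points")
print(f"Margin of error (n=85 regional subsample): +/- {margin_of_error(85):.1f} points")
print(f"Response rate: {completes / calls:.0%}")
```

Note that the margin of error says nothing about the five in six people who never answered, which is exactly the error the formula cannot see.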
Butler says increasing numbers of people are missed in polls, and that is a problem. What if all the university professors decided to go on a ‘no call’ list or refused to participate? You would have demographic holes that require filling.
“Allan Gregg was saying in a recent Globe and Mail article that the telephone poll is going out the window if we have a ‘no call’ list. If people have their cell phones off, or just become weary of market researchers, it is going to be difficult to deliver accurate polls by telephone surveys.”
5) Pick out the bias in national polls.
National polls give a big picture, but the numbers are often misleading. National data is built up from regional data: pollsters segment Canada into polling regions, and small regions often get overlooked, which creates bias.
“The big picture is going to be stuff that emanates from the centre,” he explains. The problem is, “samples get so much smaller in regions that their numbers always have a higher margin of error.”
If the sample for a region is small, what did the pollster do? “Did you collect more people than you needed to, or did you do a mathematical calculation, called weighting the sample?”
“But as I tell students, mathematics is not opinion,” he says. Many pollsters, being expert statisticians, do the math.
“Everybody does it. How would we (pollsters) get PEI opinions if we didn’t weight samples in a national poll, since there is a low likelihood that many cases would be collected from PEI? Well, I have an issue with that! The issue is that it means that a PEI opinion [where] we only got 85 [responses] are coming in as having the opinion of 250. It’s not 250 opinions! It’s a weight.”
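The calculation Dr. Butler objects to can be sketched directly with his own numbers: 85 PEI respondents weighted up to the 250 the province’s population share calls for. The Ontario figures here are invented for contrast, and real weighting schemes adjust for age, gender, and more, not just region.

```python
# Respondents actually interviewed per region (PEI figure from the article,
# Ontario figure invented for illustration).
collected = {"PEI": 85, "Ontario": 400}
# Counts each region's population share would call for (250 from the article).
target = {"PEI": 250, "Ontario": 400}

# Each region's weight: how many "opinions" one respondent counts as.
weights = {region: target[region] / collected[region] for region in collected}

for region, w in weights.items():
    print(f"{region}: each respondent counts as {w:.2f} opinions")
```

Each PEI respondent ends up counting nearly three times, which is precisely his complaint: the reported 250 PEI opinions are really 85 opinions and a multiplier.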
6) Understand the undecided vote.
In the waning part of the election campaign, pollsters try to figure out which camp to put the last percentage of uncommitted voters into. As more people commit, polls get more accurate, but until then, pollsters come up with a battery of questions to better read an undecided electorate.
“Is it really undecided or is it they won’t tell you, and if they were going to vote tomorrow, how did they vote in the past?” he says. “Everybody in the business has to deal with the notion of how do we get at the undecided vote.”
“I’d hate to tell you the tricks we have used in the past,” says Dr. Butler, smiling.
He really would, because that would reveal another mystery of his profession.
How pollsters decode polls
While we're being deluged with polls now, the only 'poll' that matters is the one on May 2.
Andy Murdoch - April 20, 2011