The fortnightly blanket media coverage of the latest political opinion poll does more than reflect public opinion — polls can help shape it. But how do these polls work and are they accurate? Last week, one poll had the two major parties deadlocked on the two-party-preferred count, while another had a 10-point gap. So who do we trust?
How are polls conducted?
Political polling is done over the phone through market research companies such as Roy Morgan Research, Nielsen, Newspoll, Essential Research and Galaxy. Galaxy contacts landline and mobile numbers, while Newspoll is restricted to landlines. Online polls also exist (Essential Research) but phone polling is still preferred. Galaxy managing director David Briggs says whereas the phone book gives you access to about 85% of the Australian population, online databases are generally capped at about 1 million Australians.
However, political commentator Charles Richardson says the number of people you have access to is a red herring: “You only need a couple of thousand, provided they’re representative. Both phone polling and internet polling have problems getting a representative sample; phone tends to exclude the young, internet tends to exclude the old. I think phone polling is pretty clearly still superior to internet polling, but the gap is closing.”
A typical poll on voting intention requires about 1000 responses to be accurate. That could mean burning through 12,000 phone numbers if there’s a low response rate, says Briggs. Newspoll’s CEO Martin O’Shannessy says it has about a one in six strike rate. When someone does pick up, they’ll find the questionnaire is short: about 8-15 questions, which wouldn’t take longer than two minutes.
So how does polling work?
Sampling relies on the mathematical laws of probability. If it’s a truly representative sample, 1000 responses is enough to represent the whole population to within about three percentage points. That plus or minus 3% is the “margin of error”, and it varies with the sample size: a larger sample gives a smaller margin of error and a more accurate result. However, for pollsters to halve that variation from 3% to 1.5%, they’d have to quadruple the sample size from 1000 to 4000, and this is where you hit the law of diminishing returns. Polling is costly, and pollsters say that level of accuracy really isn’t necessary when a sample of 1000 does the job.
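As a rough illustration (assuming a simple random sample and the standard 95% confidence level, neither of which the pollsters quoted here spell out), the margin of error shrinks with the square root of the sample size, which is why halving it means quadrupling the sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n,
    assuming the worst case p = 0.5. Illustrative only; real polls
    also account for weighting and design effects."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(1000) * 100, 1))  # ~3.1 percentage points
print(round(margin_of_error(4000) * 100, 1))  # ~1.5 -- quadrupling n halves the error
```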
How do they make the sample as accurate as possible?
Pollsters control the sample through several techniques. O’Shannessy says they try to make the polling as much a scientific process as they can, with a goal of giving everybody a chance of selection. Besides making sure they’ve got the right numbers in terms of age, gender and area, the sample is also as random as possible. “We might say something like ‘I’d like to talk to the person with the most recent birthday’ hence filling the statistical need of giving everybody a chance of being selected,” he said.
Even then, the sample is rarely a perfect match. Pollsters minimise the remaining variation by “weighting” the data against ABS population estimates, running the results through statistical software so that they reflect the population at large.
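Weighting itself is simple in principle: each respondent is scaled up or down so the weighted sample matches known population shares. A minimal sketch, using invented age-group figures rather than actual ABS estimates:

```python
# Hypothetical post-stratification weights: population share divided by sample share
# for each demographic cell (real polls use ABS estimates and many more cells).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.20, "35-54": 0.35, "55+": 0.45}  # phone samples tend to skew older

weights = {group: population_share[group] / sample_share[group]
           for group in population_share}
print(weights)  # under-represented groups get weights above 1, over-represented groups below 1
```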
When gathering data, pollsters also minimise the scope for systemic variation. For example, there’s a strict order in how questionnaires are put together — voting intention is always asked first so answers are not “contaminated by any other issues lying around”, says Briggs. Consistency of wording is also important. O’Shannessy says Newspoll has been asking its questions in the same way for more than 20 years. Party names may change from time to time but the question always ends with “which party do you prefer?”
Which polls can be trusted? How often are they wrong?
Rarely. Ian McAllister, professor of political science at the Australian National University, says that for the most part, pollsters can be relied upon. From time to time there are rogue polls and outliers, when the sample hasn’t been correctly selected. But a look at the polling record bears out their reliability. O’Shannessy said: “Newspoll has done 52 polls immediately before a state or federal election and we got it right every time. In fact we often detect late swings at the last moment. For example, in 2010, everyone was calling it a Labor parliament, but we actually called it a hung parliament based on our poll.”
Richardson says Roy Morgan, Nielsen and Newspoll have a long history and have generally been accurate: “Newspoll generally puts its neck on the line a bit more because it comes out more frequently, and so you get slightly more variations. Nielsen is probably better at giving you a longer-term trend, but because it has those long intervals, you might miss out on some of the movement in between. Galaxy has a less-established record but in the time it’s been operating, it has been very accurate.”