One of the questions that regularly pops up when it comes to Australian political polling is whether the pollsters have a bias toward one party or another in their respective vote estimates. Even though “bias” is a technically correct way to describe what we’re looking at here, it unfortunately carries connotations in normal, non-statistical usage that aren’t particularly helpful – so instead of speaking in terms of bias, it’s probably more insightful to talk in terms of pollsters having a “relative lean” toward one party or another.

Since we don’t actually know the true state of public opinion when it comes to political support levels at any given time – we only get to find that out at elections – we can’t say whether any given pollster produces results that are pro-Labor or pro-Coalition. Without knowing that true state, we can’t measure how far away from it any given poll sits.

What we do know, however, is that there are five pollsters we regularly follow here (six if we include the irregular Galaxy – and we’re treating Morgan Phone polls as separate from Morgan Face to Face polls) that all provide estimates of that true value of public opinion using some pretty rigorous statistical backroom work. From these estimates, we can see how each pollster “leans” relative to the other pollsters. Not necessarily relative to the true value, mind you, just relative to each other.

The problem that turns up when trying to do this type of thing is that the number of polls produced by each pollster differs markedly. Essential polls weekly and Newspoll fortnightly, while Morgan Face to Face is generally published once a fortnight, but not always. Morgan Phone poll, on the other hand, has published only 14 polls since January 2008, while Nielsen has ripped out only 12. Galaxy is our last big public pollster and comes in at the rear with half a dozen – too few to use, which is why we’re forced to ignore Galaxy here – though its results aren’t particularly different from those of the other phone pollsters.

The problem with this hugely different publishing schedule is that it can distort simple comparisons like averages. For instance, the average ALP primary vote in Morgan Phone polls for the Rudd government is 48.75, while the Nielsen average of the same is 44.75. That’s a large 4-point difference, but one that is entirely explainable on the basis of when each pollster was actually in the field (as we shall see).
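To see why field dates alone can manufacture that sort of gap, here’s a toy example with entirely made-up numbers (not the actual poll data): two pollsters with zero lean sample the same declining ALP primary vote, but at different points in the term, so their simple averages diverge anyway.

```python
# Hypothetical true ALP primary vote by month of the term (illustrative only)
true_support = {1: 50, 2: 49, 3: 48, 4: 47, 5: 46, 6: 45}

pollster_a_months = [1, 2, 3]  # polled mostly early, when support was high
pollster_b_months = [4, 5, 6]  # polled mostly late, when support had slipped

avg_a = sum(true_support[m] for m in pollster_a_months) / len(pollster_a_months)
avg_b = sum(true_support[m] for m in pollster_b_months) / len(pollster_b_months)

print(avg_a)  # 49.0
print(avg_b)  # 46.0
```

Neither pollster leans anywhere – both read the true value perfectly – yet the raw averages sit 3 points apart purely because of when they polled.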

What we need to do in order to minimise this particular distortion is to generate long term trends for each pollster, and then compare that trend activity. Here I’ve used a rather passive locally weighted polynomial regression to generate trends for the five pollsters on both the primary votes and the two party preferred. The Morgan Phone poll trend is a little less smooth than the others because their polling has come in clusters – a bunch of polls early in 2008, then pretty sparse coverage for 12 months or so, before another small burst of polling recently – so just keep that in mind when looking at the Morgan Phone poll trend lines.
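For readers curious what that smoothing actually does, here’s a minimal pure-Python sketch of a LOESS-style smoother – tricube-weighted local linear fits over the nearest fraction of points. The span and kernel here are illustrative assumptions; the actual smoother behind the charts may be tuned differently.

```python
def loess(xs, ys, span=0.5):
    """Minimal LOESS-style smoother (illustrative, not the exact method
    used for the charts): for each x, fit a weighted least-squares line
    through the nearest `span` fraction of the data, using tricube
    weights, and evaluate that line at x."""
    n = len(xs)
    k = max(2, int(span * n))  # number of neighbours used in each local fit
    fitted = []
    for x0 in xs:
        # keep the k observations nearest to x0
        nearest = sorted((abs(x - x0), x, y) for x, y in zip(xs, ys))[:k]
        dmax = nearest[-1][0] or 1.0
        # tricube weight (1 - (d/dmax)^3)^3 -- distant points count for little
        pts = [((1 - (d / dmax) ** 3) ** 3, x, y) for d, x, y in nearest]
        sw = sum(w for w, _, _ in pts)
        mx = sum(w * x for w, x, _ in pts) / sw
        my = sum(w * y for w, _, y in pts) / sw
        sxx = sum(w * (x - mx) ** 2 for w, x, _ in pts)
        sxy = sum(w * (x - mx) * (y - my) for w, x, y in pts)
        slope = sxy / sxx if sxx else 0.0
        fitted.append(my + slope * (x0 - mx))
    return fitted
```

In practice something like statsmodels’ `lowess` does this job; the point is simply that each trend value is a weighted local fit, which is why sparse clusters of polls – like Morgan Phone’s – produce lumpier trend lines.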

First up, on the ALP primary vote – this is what the trends look like charted.

[Chart: ALP primary vote trends, all pollsters]

As we can see, all pollsters generally move together over any arbitrary period of time on the ALP primary vote. Also obvious is how the three phone pollsters (Newspoll, Nielsen and Morgan Phone poll) are usually bunched together, while Morgan Face to Face polling and Essential Report track each other, and are a few points higher for Labor. The phone pollsters have a relative lean to the Coalition, while the non-phone pollsters have a relative lean to Labor.

The size of the difference between the pollsters also varies over time – the early days of the Rudd government produced a larger gap between the highest and lowest trend results for the Labor primary vote than we see currently.

Over the first 2 weeks of August, all the pollsters were in the field (the most recent period of time where that has happened) – if we compare the trend results for that period (not the actual polling results, but the trends generated from the raw polling results), we get:

[Chart: ALP primary vote trend values by pollster, early August]

There’s a 4.2 point difference between the highest and lowest ALP primary vote trend, while the average trend result comes in at 46.4.
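The spread and average are just the range and mean of the five trend values for that window. The figures below are placeholder values only (the real ones are in the chart above), chosen so the summary statistics line up with the reported 4.2-point spread and 46.4 average.

```python
# Hypothetical early-August ALP primary trend values -- illustrative
# placeholders, NOT the actual trend figures from the chart.
trend = {
    "Newspoll": 44.8,
    "Nielsen": 44.9,
    "Morgan Phone": 45.3,
    "Essential": 48.0,
    "Morgan F2F": 49.0,
}

spread = max(trend.values()) - min(trend.values())   # highest minus lowest
average = sum(trend.values()) / len(trend)           # mean of the five trends

print(round(spread, 1))   # 4.2
print(round(average, 1))  # 46.4
```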

We can do the same thing for both the Liberal Party primary vote and the ALP two party preferred.

[Chart: Coalition primary vote trends, all pollsters]

[Chart: Coalition primary vote trend values by pollster, early August]

The spread for the Coalition in August was 2.8 points, a much smaller trend spread than the ALP.

On the two party preferred we have:

[Chart: ALP two party preferred trends, all pollsters]

[Chart: ALP two party preferred trend values by pollster, early August]

What’s interesting about the two party preferred results is that the spread in August was only 2.7 points between the highest and lowest trend – significantly smaller than the ALP primary vote spread at the time. This reflects the way the Greens primary vote estimates affect the ALP primary (the two often being interchangeable in net terms between pollsters), but get minimised once preferences are distributed.
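To make that washing-out concrete, here’s a toy two party preferred calculation with made-up primaries and an assumed 80% Greens-to-Labor preference flow (illustrative rates, not any pollster’s actual allocation): shifting 3 points between the ALP and Greens primaries barely moves the 2PP.

```python
def alp_tpp(alp, grn, other, grn_to_alp=0.8, oth_to_alp=0.5):
    """ALP two party preferred from primary votes, using assumed
    preference flow rates (hypothetical, for illustration only)."""
    return alp + grn * grn_to_alp + other * oth_to_alp

# Two hypothetical pollsters with the same Coalition primary, but
# 3 points of primary vote swapped between the ALP and the Greens.
a = alp_tpp(alp=46, grn=8, other=6)
b = alp_tpp(alp=43, grn=11, other=6)

print(round(a, 1))      # 55.4
print(round(b, 1))      # 54.8
print(round(a - b, 1))  # 0.6
```

A 3-point gap on the ALP primary shrinks to 0.6 of a point on the two party preferred, because most of the swapped Greens vote flows straight back to Labor as preferences.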

In fact, most of the difference in the spread on the two party preferred is explained by differences in the spread of the Coalition primary vote.

All up, Essential and Morgan Face to Face have a relative lean to Labor of a couple of points on the two party preferred – and nearly twice that on the primary vote – compared to the phone pollsters, while the phone pollsters give the Coalition primary vote estimates a couple of points higher than the non-phone pollsters do.