Following a series of dramatic polls in the final weeks of election season, Professor Thomas Lumley from the University of Auckland's Department of Statistics tackles some key questions.

We've seen some dramatic poll swings this election. Last night's Newshub/Reid Research Poll put National at 47.3 per cent and Labour at 37.8 per cent. That compared with the earlier poll by the same agency on September 3, which put National at 43.3 per cent and Labour at 39.4 per cent. The two polls were conducted over different periods; each had a 3.1 per cent margin of error, based on 1000 people surveyed, 750 by telephone and 250 by panel. Can we attribute the change entirely to a shift in voter sentiment, or are there other factors we need to be aware of?

A single poll is not enough to be at all confident that we're seeing a change in voter sentiment.

The theoretical margin of error for a difference between two polls of that size is nearly 4.5 per cent, and the theoretical margin-of-error calculations understate the variation in real polls.
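The quoted figures can be checked with a short calculation. A minimal sketch (using the standard worst-case formula at a 95 per cent confidence level, which is the usual basis for the "maximum margin of error" pollsters report):

```python
import math

n = 1000  # sample size of each poll
z = 1.96  # multiplier for a 95% confidence interval

# Worst-case margin of error for a single poll (support near 50%),
# i.e. z * sqrt(p * (1 - p) / n) with p = 0.5.
moe_single = z * math.sqrt(0.25 / n)

# For the difference between two independent polls the variances add,
# so the margin of error grows by a factor of sqrt(2).
moe_diff = math.sqrt(2) * moe_single

print(f"single poll: {100 * moe_single:.1f}%")  # the quoted 3.1 per cent
print(f"difference:  {100 * moe_diff:.1f}%")    # "nearly 4.5 per cent"
```

This reproduces both numbers in the article: about 3.1 per cent for one poll and about 4.4 per cent for the difference between two. And, as noted above, these theoretical figures understate the variation seen in real polls.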

If a second poll gets very similar results we can be more confident that there are real changes in underlying sentiment, but even then it would be uncertain how big they are.

The 1 News Colmar Brunton poll in early September notably put Labour ahead at 43 per cent, compared with National at 39 per cent. This was a random phone poll of 1000 people, of whom 800 had a party preference. Many people have compared this result with last night's. Given the difference in survey samples, is this an apples-versus-oranges situation?

It's a meaningful comparison: you would expect differences between polls from different polling firms, and we see differences in past data, but not differences anywhere near that large.

Either sampling error or true changes in sentiment, or both, must have been involved.

In the wake of Brexit, the US presidential election and this year's UK general election, many commentators argued that poll results had been at odds with the actual outcomes. Predictably, since last night, we've seen people again question whether we can still rely on polls to give us an accurate reflection of the electorate. Do you think these complaints are justified?

I think we've gone from too much belief in polls to too little after the elections.

Some pollsters correctly predicted both the UK and US results, in the sense that the result was well within their predicted margin of error.

YouGov in the UK did very well, and Nate Silver in the US correctly stated that the election was close.

Some poll analysts in the US clearly did underestimate the uncertainty in the election, though.

Polling data is still more helpful than non-quantitative opinion or just asking your friends, but it does have uncertainties and it's important not to understate these.

Are there any other big misconceptions that the public has around polls?

A lot of people think polls dramatically under-represent people like them - "no one like me is ever sampled".

The polls actually do a reasonable job of sampling across a range of New Zealanders.

The other general misunderstanding about polls is how much of the change from poll to poll is due to sampling variation: nearly all of it is.

Ultimately, if polls show a race this narrow and volatile, should we even care what they tell us anyway?

We should care that we know the race is narrow and volatile.

A few months ago, no one would have expected a close election, and we know it is close only because of the polls.

Also, some of what the polls say is very consistent, for example, that NZ First is very likely to be involved in any government.

More generally, a big part of the value of polls is to make it clear that the electorate as a whole is different from you and your friends.

It's easy to believe the people you talk to every day are representative of the country, but they probably aren't.