Just before the election, senior executives at one of New Zealand's major polling firms began to suspect there was a spy in their call-centre. The results were shifting so dramatically towards one of the major parties that there seemed no logical explanation - except that one of the telephonists was a party stooge, fiddling the results of interviews.

"I began to feel a slight twinge of nervousness that one of our staff might have a bias," an executive said. "But we did an audit of the calling patterns and we couldn't find any consistent result for one person. It just seemed that on those particular nights, there was a huge swing, which later evened out again."

Polling is a stressful, difficult and often nasty game. It is the collision of mathematics and human foible, where people with calculators must try to work out the exact statistical implications of Auntie Ngaire's ditherations and Grandpa's loathing of greenies.

The polls are conducted by big, rich companies which must protect their reputations to survive, and that means praying like hell that the eventual election result somehow resembles their estimates.

That's not easy in an indecisive, diverse and sometimes reticent nation, where up to 70 per cent of people refuse to take part in polls and at least 20 per cent change their minds from one week to the next.

No wonder the pollsters who spoke to the Weekend Herald this week all sounded a little weary.

The final votes aren't even counted yet, but already one firm, TNS Research, which does the polls for TV3, has been declared the victor for getting the figures as close to right as possible.

TV One's pollster Colmar Brunton, which indicated a likely National victory in its final poll, is reviewing its procedures. Roy Morgan Research, whose final poll indicated a small Labour victory, is rubbishing most of its rivals as amateurs.

Fairfax pollster AC Nielsen, which also picked a National victory, is reminding consumers that a poll is never anything more than a snapshot in time.

The Herald's pollster, Gabriel Dekel of Digipoll, who got the result very close in his second-last poll but overestimated Labour's support in the final week, is suggesting that last-minute polls should be done away with because the electorate is simply so "volatile" (read indecisive) in the final feverish week of a campaign.

"For the media, it was great fun but for the pollsters it was like being in hell. I was imagining myself sitting in a rocking boat, trying to take a photo of a speedboat rushing past me, with water splashing everywhere," Dekel says.

Marketing expert Professor Janet Hoek of Massey University said many media reports made embarrassing mistakes about the statistics, or failed to give voters enough information to assess the reliability of each poll.

"Instead of asking whether the polls were wrong, we should ask whether the media are competent to assess this question," Hoek says.

Each pollster gives a figure - usually about 3.1 per cent - as the maximum margin of error for the biggest party in each poll.

But it makes Hoek's blood boil to hear reporters describe a minor party, which might be polling at 2 per cent, as "below the margin of error".

Anyone with a basic understanding of statistics knows that as a party's support figure shrinks, so does the margin of error - for a party polling at 2 per cent in a typical 1000-voter sample, the real margin is closer to 0.87 per cent.
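Hoek's point can be checked against the standard formula for a 95 per cent margin of error on a polled share, z x sqrt(p(1-p)/n). A minimal sketch, assuming the usual 1000-voter sample the article describes:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95 per cent margin of error for an estimated share p from a sample of n voters."""
    return z * math.sqrt(p * (1 - p) / n)

# A major party polling at 50 per cent in a 1000-voter sample:
print(round(margin_of_error(0.50, 1000) * 100, 2))  # -> 3.1
# A minor party polling at 2 per cent in the same sample:
print(round(margin_of_error(0.02, 1000) * 100, 2))  # -> 0.87
```

This reproduces both figures in the article: the "usually about 3.1 per cent" maximum quoted for the biggest party, and the 0.87 per cent Hoek cites for a party on 2 per cent.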

"That sort of mistake trivialises minor parties," Hoek says. "It is little wonder that the public have become cynical about poll results and more reluctant to participate in surveys."

The polls might have varied widely but if you looked at them as an average they gave a perfect indication that the result was too close to call, says Auckland University's head of politics, Professor Jack Vowles.

The tendency of New Zealanders to switch allegiance is growing every year, says Vowles, whose research on the 2002 election showed 48 per cent voted differently from 1999, and a huge 61 per cent made up their mind during the campaign.

"Volatility doesn't matter much if everyone is going in different directions, and they cancel each other out. But that doesn't explain why polls taken on the same day by different companies should show such different results," Vowles says.

He believes the real explanation is increasing reluctance to take part - meaning the participants are self-selecting. "People are just more annoyed with strangers calling them up on the phone and asking them to do things," Vowles says.

"There must be reasons people don't participate - and if those reasons correlate with support for one party or another, we've got a problem."

Steve Kirk of TNS says he gets a 30 per cent response rate - to get 1000 voters, TNS telephonists must ring 3000 numbers and cope with 2000 no-answers, no-thank-yous and bugger-offs.

Colmar Brunton estimates a 35 per cent response rate, while Gabriel Dekel says Digipoll's response rate is over 65 per cent.

"We have excellent staff, very well-trained, educated people. We have a few reliable methods for persuading people to participate, but I can't give anything away," he says.

But Gary Morgan, of Roy Morgan Research, says it is "rubbish" for any pollster to claim a response rate higher than about 16-25 per cent.

So who are the people who say yes to pollsters - just loudmouths and those who are too polite to say no?

"It's people who have the time when you call, who are not doing something else and who feel their opinion is worth something," says Steve Kirk of TNS.

"People often say that in phone polling you don't get people who support right-wing parties, such as executives, because they're too busy to do surveys, but I think we get a broad spectrum of New Zealanders."

Traditional left voters might be just as difficult to reach, points out Janet Hoek of Massey, especially if they are blue-collar shift workers or young people out partying.

Many young Kiwis are not bothering to connect a landline, relying solely on mobile phones, which makes them harder to reach.

At present, it is estimated only 2 per cent of New Zealand households are without a landline, says Jeremy Todd of Colmar Brunton.

"There is no database of mobile phone numbers available to us yet, but we believe that will be resolved as the technology takes hold," Todd says.

Do polls affect the result? After the 1999 election, a Parliamentary Committee said there was "no evidence" that voters were significantly affected by survey results.

The pollsters and pundits believe MMP-savvy voters now use polls as a tool in tactical voting.

Gary Morgan and other commentators have attributed Helen Clark's victory to the two final-week polls which suggested National had a chance of winning - thus spooking the wavering Labour voters back to the fold.

Act party leader Rodney Hide won his seat, Epsom, despite a Colmar Brunton poll showing he had no chance.

It later emerged Colmar Brunton had asked voters which party they would vote for in the electorate, rather than listing each individual candidate. When other pollsters including Roy Morgan listed candidate names, Hide emerged the clear leader.

But Hide believes the Colmar Brunton poll harmed Act's national party vote because the public believed he had no chance of winning the seat and did not want to waste their votes on Act.

Jeremy Todd of Colmar Brunton says the firm is "reviewing our procedures on that".

Another complication is "snap polls", which survey about 500 voters on one night rather than the usual tally of 1000 voters over several days.

In the campaign's penultimate week, Sunday Star-Times editor Cate Brett became worried her regular poll was "obsolete" because it had been taken before revelations that the National Party had met Exclusive Brethren religious activists.

Brett commissioned AC Nielsen to survey 540 voters on the Friday night. That showed National ahead by seven points rather than the two-point lead indicated by the original poll, so Brett splashed with the new data, prompting accusations of sensationalism.
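Sampling error alone can move a one-night result by several points. A quick simulation, assuming a hypothetical true National support of 45 per cent (an illustrative figure, not one from the article), shows how much 540-voter snap polls scatter even when nothing has changed in the electorate:

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def snap_poll(true_share, n):
    """Simulate one snap poll: the share of n randomly sampled voters who back the party."""
    return sum(random.random() < true_share for _ in range(n)) / n

# Ten one-night polls of 540 voters, all drawn from the same 45 per cent electorate:
results = [round(snap_poll(0.45, 540) * 100, 1) for _ in range(10)]
print(min(results), max(results))
```

The spread between the lowest and highest of those simulated polls is routinely three or four points, which is worth bearing in mind before splashing a seven-point lead from a single Friday night's calling.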

Snap polls are inherently risky because they can only get a limited range of voters, says Steve Kirk of TNS.

"I don't think people were made aware that some of those snap polls in the last week were done over a very short timeframe. You had to read the news reports very carefully," he says.

AC Nielsen's Gary Martin says snap polls must be taken for what they are: quick and sharp.

In the campaign's final fortnight, Digipoll's Dekel directed his three most-experienced interviewers to do a quality-control test, calling back the voters they had surveyed the week before to ask their views again.

He was "astonished" to find 19.4 per cent had changed their minds.

The most indecisive groups were women (19.8 per cent), those aged 18-24 (26.3 per cent) and the over-65s (21.3 per cent), while more than 50 per cent of those who originally chose Greens, United Future or New Zealand First had changed their minds.

But Gary Morgan says pollsters are in the probability game, and some of the wild swings in New Zealand suggest "volatility" might be a bad pollster's excuse for inaccuracy.

"People do change their minds, but not as much as some of the other polls indicated. Probably their questions are wrong, and we know that the wording and order of questions can change your result," Morgan says.

"If the mug journalists don't understand whose poll is right and whose poll is wrong, it's the mug journalists' fault."