Public Poll Average Calculations

This is a permanent page, to which is attached the Excel spreadsheet containing all the calculations used for the public poll average.

CuriaBlog Poll Average

The average is weighted by both sample size and poll age. The calculations and assumptions are:

  1. Weighting is proportional to sample size. The sample size excludes undecided voters when this is known, as poll results are expressed as a percentage of decided voters.
  2. The older a poll is, the less weighting it has. The age of the poll is calculated as the difference between today’s date and the mid-point of the dates the poll was conducted over. If this gap is one week or less, then the weighting is 100%. Once a poll is more than 38 days old the weighting is 0%, and between 8 and 38 days inclusive it declines proportionally every day, so a 20 day old poll is weighted 61% and a 24 day old poll is weighted 48% (the sketch after this list illustrates the weighting). This means that all polls less than a week old have equal weighting and any poll more than a month and a week old falls off completely (as it should have been replaced by then if regular).
  3. As polls tend to be released weekly rather than monthly during the campaign period, only the latest poll from each pollster is included.
  4. The weighted average percentages are shown in the graphic on the main sidebar.
  5. The number of electorate seats a party will win is assumed to be the status quo, except when a public poll has clearly shown otherwise. This is a very conservative estimate, and no projections are made on the basis of any standard swing in the overall national vote. So it is an assumption, not a prediction.
  6. The Sainte-Laguë formula is used to calculate the total number of seats each party is entitled to.
  7. A number of possible Government combinations are shown. They are automatically bolded if they form a majority able to govern.
  8. Due to the weighting by age of poll, the weighted average changes slightly every day, even if there is no new poll.
  9. One page shows the likely MPs in Parliament. It lists the Electorate MPs (based on the assumption above) and the List MPs for each party. Where a party has yet to file a party list, its current Caucus rankings are used.
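
For anyone who wants to check the arithmetic outside the spreadsheet, here is a minimal Python sketch of the two mechanical steps above: the size-and-time-weighted average and the Sainte-Laguë seat allocation. The data structure, field names, and the figures in the usage example are illustrative assumptions, and the exact slope of the age weighting is inferred from the worked examples in point 2 (it reproduces the 61% and 48% figures); the attached spreadsheet remains the authoritative calculation.

```python
from datetime import date

def age_weight(poll_mid_date, today):
    """Time weighting: 100% for polls a week old or less, 0% once more than
    38 days old, declining proportionally in between. The (39 - age) / 31
    slope is an assumption that reproduces the 61% (20 days) and 48%
    (24 days) examples above."""
    age = (today - poll_mid_date).days
    if age <= 7:
        return 1.0
    if age > 38:
        return 0.0
    return (39 - age) / 31

def weighted_average(polls, today):
    """Size-and-time-weighted average of party support, where each poll's
    results are already expressed as percentages of decided voters."""
    totals, weight_sum = {}, 0.0
    for poll in polls:
        w = poll["decided_sample"] * age_weight(poll["mid_date"], today)
        weight_sum += w
        for party, pct in poll["results"].items():
            totals[party] = totals.get(party, 0.0) + w * pct
    return {party: total / weight_sum for party, total in totals.items()}

def sainte_lague(party_votes, seats=120):
    """Allocate seats with the Sainte-Laguë formula: each successive seat goes
    to the party with the highest quotient votes / (2 * seats_already_won + 1)."""
    allocation = {party: 0 for party in party_votes}
    for _ in range(seats):
        winner = max(party_votes, key=lambda p: party_votes[p] / (2 * allocation[p] + 1))
        allocation[winner] += 1
    return allocation

# Illustrative usage with made-up polls (not real data):
polls = [
    {"pollster": "Pollster A", "mid_date": date(2008, 10, 28), "decided_sample": 900,
     "results": {"National": 47.0, "Labour": 36.0, "Green": 7.0, "NZ First": 3.5}},
    {"pollster": "Pollster B", "mid_date": date(2008, 10, 20), "decided_sample": 750,
     "results": {"National": 45.0, "Labour": 38.0, "Green": 8.0, "NZ First": 4.0}},
]
average = weighted_average(polls, today=date(2008, 11, 3))
print(average)
print(sainte_lague({p: v for p, v in average.items() if v >= 5.0}))
```

Note that the sketch only applies the 5% threshold when deciding which parties share in the allocation; in the actual calculation a party that wins an electorate seat also qualifies for list seats even if it is below 5%, and any overhang would need separate handling.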

Feedback is welcome on the methodology and assumptions, as well as suggestions for new features or improvements. Are there other graphs or tables people would like to be able to incorporate in their own blogs?

15 thoughts on “Public Poll Average Calculations”

  1. One issue is that there is no space for parties not currently in parliament. This means that if a party currently outside parliament were to look like they might gain a seat (which is a possibility in the case of The Family Party, and could be the case with others) or get over the 5% threshold the effect of this on the election result cannot be calculated (apart from including the one electorate candidate).

    To calculate this you would, however, need info on how non-parliamentary parties are polling, and this appears to be going unreported at the moment, with it all being lumped into an “other” category. This category is low in the Roy Morgan poll at present but higher in other polls. You should be able to obtain this information from the polling agencies; I see no reason for them to withhold it if asked.

  2. As far as I can tell nothing is up to date, on either the little table up there ^^^ or in the Excel spreadsheet. The spreadsheet shows today’s date (using =TODAY() ) but doesn’t actually seem to have been updated for the weekend polls.

    This is kinda confusing!! Any chance of the spreadsheet showing the actual date of the data? And being updated, obviously 🙂

  3. Pingback: National’s 2008 Party List | Kiwiblog

  4. Pingback: The 2008 Labour Party List | Kiwiblog

  5. Pingback: United Future not interested in Sth Island « Homepaddock

  6. Pingback: Caching and Wordpress | Kiwiblog

  7. In the NZ Herald on Wednesday, Brian Rudman highlighted how the Colmar Brunton poll and the Fairfax polls both tend to over-rate National’s poll support by 6% to 8%, and I’d say sometimes even more. The TNS polls and the Morgan polls have consistently been the most accurate. The Herald Digi-poll has recently joined the ranks of the over-raters since that newspaper fell in love with the National Party editorially over the past 3 years.

    Having these three major polls all echoing the same recorded bias in favour of one party surely must corrupt any attempt to create a meaningful average of polls.

    The Herald’s own “informal” poll, reported by Simon Collins 3 weeks ago, was the first of the three recent polls to show the race to be much closer (National on 43.5%) than the three polls we know to favour National suggest.

    Polls everywhere are showing evidence of corruption, as what they indicate is often far from the actual result at the polling booths. The pollsters put it down to an “overnight shift”, whereas I’d put it down to months of trying to make their political desires into reality by reporting shonky polling results.

  8. I would tend to disagree about the issue of biases. I’ve been looking at this myself over the last few weeks – it started with the production of the graphs on this Wikipedia page:

    http://en.wikipedia.org/wiki/Opinion_polling_for_the_New_Zealand_general_election,_2008

    I’ve used that set of data to analyse the deviations from the “mean” line by polling agency. The results are here:

    Note these are “relative biases” – to get the absolute bias, multiply the value by the current estimate of party support.

    I’ve analysed these biases statistically. The basic conclusion is that none of the biases are statistically significant, either at the 95% or even the 90% level. I was surprised by this initially, but when you look at the whiskers in the box-and-whisker plot, you can see that all of them cross the zero-bias line.

    I would therefore conclude that the variation within the surveys is generally larger than any systematic biases.

  9. After some thought and closer analysis, I am able to confirm that what truthseekerNZ is saying actually has some statistical merit. The test I discussed in the previous post still holds – no polls show statistically significant deviations from zero bias. However, what truthseekerNZ is discussing is the relative values for Labour and National (i.e. that a particular poll shows a significantly greater (or lesser) bias towards one party than the other). I tested this statistically – the results are shown on the updated version of this figure:

    The analysis shows that Colmar Brunton and Nielsen have a statistically significant bias towards the National Party, and TNS has a significant bias towards the Labour Party. There is no evidence, at the 95% level, of a bias in the Digipoll, Roy Morgan, or UMR poll results towards either of the major parties. (A rough sketch of this kind of bias test appears after the comment thread.)

  10. Comment for Curia: I have noticed an error in the version of the spreadsheet live on 6 Nov. The calculations show 35 electorate seats won by National, but the estimate of which National candidates will get in on the list uses 36. Hence you are showing 24 List MPs, not 25.

  11. Pingback: Electorate seats « Kiwi Poll Guy

  12. Pingback: More rambling on polls & bias | Distracted Scientist

  13. Pingback: Poll meta-analyses in NZ | Stats Chat
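
For readers curious how the significance testing described in comments 8 and 9 might be set up, here is a rough Python sketch. The residual values and the choice of a simple one-sample t-test are illustrative assumptions, not the commenter's actual method.

```python
from math import sqrt
from statistics import mean, stdev

def bias_t_statistic(residuals):
    """t statistic for the hypothesis that a pollster's mean residual
    (poll result minus the all-poll trend estimate for the same date) is zero."""
    n = len(residuals)
    return mean(residuals) / (stdev(residuals) / sqrt(n))

# Hypothetical residuals for one pollster, in percentage points:
example_residuals = [1.8, -0.4, 2.1, 0.9, -1.2, 1.5]
t = bias_t_statistic(example_residuals)
# Compare |t| with the two-sided 95% critical value for n - 1 degrees of
# freedom (about 2.57 for 5 degrees of freedom) to judge significance.
print(f"t = {t:.2f} on {len(example_residuals) - 1} degrees of freedom")
```

In practice you would pool residuals per pollster across many polls and, as comment 9 does, also compare the Labour and National biases against each other rather than testing each against zero in isolation.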
