Pollsters really don’t like predicting things… so well done John Rentoul!
The Independent on Sunday’s John Rentoul tried to do something creative today – far from easy – asking us political pollsters to use our data, knowledge and past experience to actually make a prediction for the May 2015 General Election!
Pollsters shy away from making predictions – if you don’t make predictions, you can never be wrong, and no-one likes being wrong – especially people who have more information to hand than others, like the heads of polling companies, for example.
To highlight the inherent risk of making predictions, the IOS pointed out at the end of the piece that all 8 pollsters last year (before Survation joined the BPC) forecast a Conservative overall majority.
Now of course, pollsters love to say that their polls are “snapshots” of public opinion – we don’t possess crystal balls, even a week is a long time in politics and so on – and to a great extent that is true and often very relevant. However, much of polling company methodology is not simply a current “snapshot”.
Adjustments and assumptions based on past behaviour in previous elections do feature heavily in polling company methodology. These aspects of methodology are not snapshots at all; they are decisions taken by the pollster that voters’ past behaviour will, to differing degrees, shape their future behaviour. There are also judgements that the pollster’s sample contains too few or too many of a certain type of historical voter – for example the controversial “party ID” adjustment employed by some – again, not a snapshot; perhaps more “hark back” than snapshot.
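Purely as an illustration of what such an adjustment can look like – this is not any particular company’s published method, and every figure below is invented or approximate – here is a minimal sketch of weighting a sample back to a past election result:

```python
# Hypothetical sketch of past-vote weighting: the sample is re-weighted so that
# respondents' recalled 2010 vote matches the 2010 result. The numbers and the
# method itself are illustrative, not any pollster's published scheme.

# Approximate 2010 GB vote shares (rounded, for illustration only).
target_2010_share = {"CON": 0.37, "LAB": 0.30, "LD": 0.24, "OTH": 0.09}

# Invented sample: how many respondents recall voting each way in 2010.
recalled_2010 = {"CON": 410, "LAB": 270, "LD": 210, "OTH": 110}

sample_size = sum(recalled_2010.values())
weights = {
    party: target_2010_share[party] / (count / sample_size)
    for party, count in recalled_2010.items()
}

# Each respondent then counts for the weight attached to their recalled vote,
# so a decision about past behaviour directly shapes the headline figures.
print({party: round(w, 2) for party, w in weights.items()})
```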
Survation’s constituency polling (though not yet our national polling) long ago dropped past vote (2010) weighting and “spiral of silence” weighting – reallocating people who won’t actually tell us how they will vote partly in proportion to how they voted last time.
This can work well in a two-party system, but it is particularly unhelpful when trying to calculate support for a surging Green, SNP or UK Independence Party – and companies such as ICM Research, who have been consistently reliable in UK General Election polling, will surely have this consideration in mind if they are to retain their place on the accuracy podium. Finally, we see very little value in “party ID weighting” – but that’s for another time.
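To make the point concrete, here is a rough, hypothetical sketch of the kind of reallocation described above and why it tends to penalise surging parties; the reallocation fraction and all counts are invented for illustration rather than taken from any company’s actual weights:

```python
# Hypothetical "spiral of silence" style adjustment: respondents who won't say
# how they will vote are partly reallocated to the party they recall voting
# for last time. The fraction and all counts are invented for illustration.
REALLOCATION_FRACTION = 0.5  # assumed value, not any pollster's actual figure

stated_vote = {"CON": 300, "LAB": 310, "LD": 80, "UKIP": 90, "GRN": 40}

# Recalled past vote of respondents who refused or said "don't know" this time.
undecided_past_vote = {"CON": 60, "LAB": 50, "LD": 70, "UKIP": 5, "GRN": 5}

adjusted = dict(stated_vote)
for party, count in undecided_past_vote.items():
    adjusted[party] += REALLOCATION_FRACTION * count

total = sum(adjusted.values())
shares = {party: round(100 * n / total, 1) for party, n in adjusted.items()}

# Parties with little past support (here UKIP and the Greens) gain almost
# nothing from the reallocation, which is why the adjustment can understate
# a surge that has no 2010 base to "hark back" to.
print(shares)
```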
So what did John Rentoul manage to get us all to say? John’s online article is here: