FEATURE
25 May 2016

The 2015 General Election polls – what went wrong?


At last night’s IJMR lecture, Patrick Sturgis and John Curtice presented and discussed the work done by the British Polling Council (BPC) to determine where the pollsters fell short last year. Jane Bainbridge reports

Since that fateful 10pm exit poll on 7th May 2015, there has been much hand-wringing, soul-searching, self-analysis and, most importantly, data crunching to determine how and why the pollsters were so far from the mark at the last general election.

And with less than a month to go before the EU referendum, the veracity and validity of political polling remain much debated, both within and beyond the market research community.

As John Curtice, president of the BPC, senior research fellow at NatCen Social Research and professor of politics at the University of Strathclyde, said of the unfolding drama of his exit poll findings through the early hours of 8th May: “We would be heroes or zeros, and at about 1.30am we were closer to heroes than zeros.” By 6am, however, he said the realisation had fully dawned that “the pollsters were in a bit of a jam”.

The response to that ‘jam’ was to set up the BPC inquiry. Patrick Sturgis, professor of research methodology at the University of Southampton, was asked to chair it. “I said yes very quickly and then had a long time to reflect on the wisdom of that.”

And so began the data crunching to work out where the inaccuracies occurred, taking three polls from each of the nine BPC pollsters. Sturgis said the ‘miss’ in 2015 was “pretty bad”, but that the evidence pointed to a “downward trend of underestimating the Conservative share over time, which suggested something systematic was going wrong here”.

Looking at 2015 compared with 1997 and 2001, he said that “in terms of statistical measure of error, there wasn’t a big difference”.

Indeed, all the weightings the pollsters applied to their data to try to improve accuracy didn’t actually make much difference.
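To see why, it helps to look at what weighting can and cannot do. Below is a minimal sketch of demographic weighting in Python; the age bands, targets and vote shares are invented purely for illustration, not taken from any poll.

    # Minimal sketch of the demographic weighting pollsters apply.
    # All numbers are invented for illustration.
    sample_share = {"18-34": 0.18, "35-54": 0.40, "55+": 0.42}      # who responded
    population_share = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}  # census targets

    # Each respondent's weight = population share / sample share for their group,
    # so under-represented groups count for more.
    weights = {g: population_share[g] / sample_share[g] for g in sample_share}

    # Illustrative Conservative vote intention within each age group
    con_share = {"18-34": 0.25, "35-54": 0.35, "55+": 0.45}

    unweighted = sum(sample_share[g] * con_share[g] for g in con_share)
    weighted = (sum(weights[g] * sample_share[g] * con_share[g] for g in con_share)
                / sum(weights[g] * sample_share[g] for g in con_share))

    print(f"unweighted estimate: {unweighted:.1%}")  # 37.4%
    print(f"weighted estimate:   {weighted:.1%}")    # 36.0%

Weighting can only shift an estimate through the variables adjusted for; if the young people a poll does reach vote differently from those it misses, no amount of reweighting on age can correct for that.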

Sturgis explained that, after eliminating postal voting, voter registration, overseas voters, question wording/framing, turnout weighting, mode of interview, late swing and deliberate misreporting as causes of the inaccuracies, the inevitable finding was that the problem was unrepresentative samples.

Ultimately the problem is that the methodology is fragile. “The pollsters make a lot of strong assumptions hence our amazement that they get as close as they do as often as they do,” he said.

So the recommendations made were:

  • Explicitly state weighting variables, including population targets and source
  • State whether changes have been made to adjustment procedures since the previous poll, and outline what they are
  • Require release of individual-level data to any future inquiry
  • Extend the period for which tables are displayed from one year to five years
  • Provide confidence intervals for each party’s estimated vote share (a minimal sketch follows this list)
  • Provide significance tests for changes in party vote share
  • BPC to develop an agreed methodology for calculating confidence intervals and undertaking significance tests, with a view to introducing this next year
  • Pre-register with the BPC that the company is undertaking a poll
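By way of illustration, here is a minimal sketch of the kind of confidence interval the recommendations envisage, assuming a simple random sample and the usual normal approximation; the function name and poll figures are hypothetical, and in practice weighting shrinks the effective sample size and widens the interval.

    import math

    def vote_share_ci(share, n, z=1.96):
        # Approximate 95% confidence interval for an estimated vote share,
        # assuming a simple random sample of effective size n.
        se = math.sqrt(share * (1 - share) / n)
        return share - z * se, share + z * se

    # Hypothetical poll: a 34% Conservative share from 1,000 respondents
    low, high = vote_share_ci(0.34, 1000)
    print(f"95% CI: {low:.1%} to {high:.1%}")  # roughly 31.1% to 36.9%

The familiar ‘plus or minus three points’ rule of thumb is this same calculation for a share near 50% and a sample of about 1,000.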

But throughout the questions that followed the lecture, the discussion kept returning to fundamental flaws and limitations: polls consistently contain too few young people and too many ABs, producing an inevitable availability bias. And this is something that’s not easily resolved.

While some people have framed the problem as one of political engagement, the speakers pointed out that it simply isn’t that straightforward.

So where does it leave us, as one questioner put it, with the perfect storm of the EU referendum?

Because the referendum vote will not split along party lines, Curtice said, “you could get the Tory and Labour vote absolutely right and still get the wrong outcome.”

In the EU referendum polling so far there is no consensus among the pollsters, unlike at the 2015 general election. There is a marked difference between telephone and internet polls, with telephone polls showing Remain ahead and internet polls showing the race closer. The two have, however, gradually converged as telephone support for Remain has declined.

The 24th June is going to be an important day for many reasons.

2 Comments

8 years ago

So nothing is really being changed then, other than the recommendations, which seem to me to be to ‘cover your arse’ a bit better.


8 years ago

The sampling going awry is going to be a long-term problem and a big concern for researchers, as everything in quant is based on having a rep sample. Weighting can adjust the data, but only if you know what universe you’re weighting to and have the right variables to weight by.
