OPINION | 11 May 2015

Why the UK election polls were probably correct


Election night. The one time every five or so years that your friends and family begin to have some inkling of what it is you do for a living.


With almost every poll aligning on the evening before, the UK expected a dead heat in the vote share, with around 274 seats for the Conservatives versus 271 for Labour. But 24 hours later, an exit poll and the results themselves showed that the Conservatives had returned 331 seats to Labour’s 232. An astonishing departure from the previous data. Or was it?

The main thing to say is that the polls were almost certainly not wrong. So let’s focus on whether they asked the right question. Polls asked how people would vote, while the surprise final exit poll and the vote itself recorded how people actually voted. Perhaps people weren’t lying; perhaps they just couldn’t accurately predict their own behaviour. For me, the parallels with consumer research are clear.

We don’t always know why we do what we do, or what we’re going to do in future. Yet much research insists on asking people to explain exactly that, when consumers may not fully understand the key moments along the way themselves. Marketers and researchers create concepts based on a world where consumers are seen as rational beings with perfect self-awareness:

“How likely are you to buy product X after seeing this advert?”

I don’t know about you, but I currently don’t know how likely I am to buy a bike part I’ve been thinking about for four months, let alone something random whose advert or new pack design I’ve only just glimpsed for the first time. If we know this about our own self-awareness, why would we expect that mythical other, ‘the consumer’, to know any better?

The shock difference between the election polls and actual voting behaviour is not evidence that people lied. It’s evidence that when we design surveys, we should think about how we would answer them, and indeed whether we could.

The job of insight professionals is to propose ways we can implicitly predict behaviour: to understand consumers deeply enough that we can identify likely outcomes because we know them so well, in the same way that lifelong friends can usually predict each other’s behaviour with a high success rate.

If there’s one thing that research can learn from politics, it’s that ambition can get your voice heard. The difference for research will be that when we do say something, it might actually be based on some evidence…

Charlie Richards is an account director at Tonic Insight


5 Comments

9 years ago

I think it is fallacious to say the polls were probably correct. That is irrelevant: they were interpreted by the world at large as being predictive, and as such were PERCEIVED to be wrong. Unless we manage the expectations of commentators, consumers and politicians, polls are always going to be used in this way. There is a simple fundamental point, namely that we ask about voting intention, and that is a different question from how you voted. Especially in the toxic environment of this last election, where every choice in England & Wales came with a pile of nasty baggage: UKIP were the BNP in disguise, the Tories were posh bankers, Labour would break everything again, the Greens were ridiculous and the Liberals were unthinkable. Who would volunteer any of those options, and why would that necessarily reflect behaviour? So the choices were passive and rarely based on conviction. The only party that represented an active opt-in was the SNP, and funnily enough the Scottish polls were spot on. There is a lesson: if you ask people about likely behaviour when they don’t actively care for or engage with the brands in question, the chances of their choice reflecting behaviour recede.


9 years ago

Whether the pre-polling-day polls were right or wrong, this, to me, is a warning: although they are constantly said to be a "snapshot" of current voting intention, the media and the general public take them as predictions. Thus it behoves us to stress, on every suitable occasion, that they are just answers to the questions asked, and not a snapshot of anything but that.


9 years ago

I have to completely disagree with this post. The polls predicted the number of seats and got it completely wrong, even allowing for their relatively wide margins of error. Just because people told the truth (which I agree is probably the case) doesn't mean that the polls got it right. Perhaps a better title might have been "Just because the polls got it wrong doesn't mean the respondents lied." Sadly, as this is the one piece of research that virtually everyone in the country is exposed to, it reflects on the market research industry in general. I do agree with the last point: as researchers we need to work out how to predict voting behaviour more accurately. Either that, or just stop making the claims we make when doing political polling!


9 years ago

I'm on record as proposing that election "polls" be (re)termed "pre-polls" or "current intention measures" (or similar/better), as error arises from all of the design/method(s), the sampling, AND the time between the survey (poll) and the event it is seeking to inform, however short. And yes, the media should NOT state a modelled outcome to be accurate at seat level (Sky) if the margin of error is +/-20 seats (source: LSE, "Exit Polling Explained"). IN SUMMARY: the exit poll was accurate at "outcome level", and the vote-share measures from the pre-polls were accurate to +/-4%. Communication of results needs improving (including differentiating vote share v seat share), and maybe we should have a more unified industry voice (control) on this? Maybe the BPC should run all election research centrally, under their auspices and brands, rather than individual survey firms?


9 years ago

Come on; I get that the polling industry has taken a lot of heat and is trying to find ways out of it. But let's simplify the equation. The polls were not correct, so they were... what, exactly?
