OPINION | 24 November 2014

The spiral of silence in polling


The spiral of silence, or, to use its more technical term, social desirability bias, is the unspoken curse of public opinion polling. It conceals a host of challenges, chief among them whether some of the data we produce really does reveal the ‘true’ state of public opinion on a particular subject.

The Scottish Independence referendum is the latest case in point. The polls are thought to have just about ‘dodged a bullet’, but the simple truth is that bias was systematic: all of the polls understated the No vote.

Some of us pollsters acknowledged in advance that this could well be the case: in the absence of known prior behaviour, disentangling the Scottish patriot loath to be associated with a No vote from the genuine ‘Don’t Know’ proved a well-nigh impossible task. In my view, this phenomenon was worth the couple of points that would have made the difference between mediocrity and glory for the pollsters.
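A back-of-the-envelope sketch shows how little it takes. The numbers below are assumptions chosen for illustration, not real polling data: if even a modest fraction of No voters hide in the ‘Don’t Know’ column, the headline Yes share among those stating a preference drifts upward by roughly the couple of points at stake.

```python
# Illustrative sketch (all figures are assumptions, not real polling data):
# how 'shy' No voters answering 'Don't Know' inflate the headline Yes share.

def headline_shares(yes, no, shy_no_rate):
    """Return (yes_pct, no_pct) among respondents stating a preference,
    after a fraction of No voters declines to answer."""
    stated_no = no * (1 - shy_no_rate)          # No voters willing to say so
    total_stated = yes + stated_no              # base excludes 'Don't Knows'
    return (100 * yes / total_stated, 100 * stated_no / total_stated)

# True split assumed to match the actual result: 45 Yes / 55 No.
honest = headline_shares(45, 55, 0.0)
shy = headline_shares(45, 55, 0.05)  # assume 5% of No voters go silent
print(f"No shy voters:   Yes {honest[0]:.1f}%, No {honest[1]:.1f}%")
print(f"5% shy No voters: Yes {shy[0]:.1f}%, No {shy[1]:.1f}%")
```

Under these assumed numbers the reported Yes share rises from 45.0% to about 46.3% even though no underlying opinion has changed; a slightly larger shy fraction covers the full gap between the final polls and the result.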

This is a well-worn path of course. Recent history offers plenty of examples from the world of political polling: few voters admitted to supporting the Conservative Party during their so-called ‘nasty party’ era, yet the Tories usually ended up scoring a few points more in actual elections than most of the polls suggested. And for the same reason that the Tories did better than the polls suggested 20 years ago, we expect the Liberal Democrats to do slightly better in 2015, at least compared with the routine 7-8% attributed to them now.

Avoiding embarrassment

Giving the ‘right’ answer, or at least the one that might be less embarrassing for a respondent to reveal, infects other areas too. Take self-reported racial prejudice. The British Social Attitudes Survey asks respondents whether they would describe themselves as racially prejudiced. Not surprisingly, throughout its 30-year history only a tiny minority (2% in 2012) have described themselves as such. Yet statistics from the Institute of Race Relations suggest that prejudice is alive and well, with 42,000 incidents of race hate crime reported in 2011/2012.

So-called ‘nimbyism’ is another area where results and reality diverge. The government has recently proclaimed, on the basis of plunging nimbyism scores in the British Social Attitudes Survey, that the public is now much more receptive to residential, retail and other forms of development in their local neighbourhood.

Call me cynical, but I wonder how that squares with a world in which it is impossible to conduct research into large-scale development projects without campaigners routinely accusing researchers of bias whenever results contradict their perspective.

The apologists might castigate me for blaming respondents, on whose goodwill we all, of course, depend. But nothing could be further from the truth.

Taking action

Social desirability bias can often lounge around in the back of the psyche largely undetected – we are pretty awful observers of our own behaviour, and we often polish our views with positive retrospection. There is a difference between believing we are about to embark on a course of action and actually doing so, as this perennial dieter will confirm. So don’t be surprised if, come May 2015, we find that some UKIP supporters revert to the Conservative fold once they remember the ‘wasted vote’ argument.

As pollsters we must recognise that we cannot contend with every facet of human behaviour; all we can do is try our best to recognise the issues that might impinge on the integrity of our data, and seek to mitigate them where we can.

In the spirit of transparency and honesty, we need to accept that we are never going to get everything right every time. In fact, as we super-speed into the digital world and dump the intellectual rigour that has underpinned our industry for so long, contending with issues like this may well prove harder than ever. As IndyRef again showed, none of us has all the answers.

Martin Boon is director of ICM Research


10 years ago

Yet another reason why I love big data. Past behaviour, not opinions, is the best predictor of future behaviour. Of course I'm going to buy those local organic vegetables and support our farmers! (but my cart is full of cheap imports)


10 years ago

Asking better questions is another way around this. Not always possible, I admit – e.g. ask a question that reveals racist or stereotyped beliefs, rather than asking "are you racist?". Interesting article. I've always felt we need to be very cautious about assuming a 1:1 relationship between answers to survey questions and "reality", whatever that might be.


10 years ago

"So don’t be surprised if we find out come May 2015 some UKIP supporters revert back to the Conservative fold because they remember the ‘wasted vote’." Sounds like typical ICM between-elections comment on UKIP. How is it that ICM consistently have UKIP well below the other pollsters, and then, just before an electoral event, the UKIP figures mysteriously begin a rise into line with the rest of the pollsters, before plummeting back down after the election? I agree that bias is systematic in all polls, but ICM take it to another level, as we will no doubt find out as the GE approaches, when ICM begins its standard upward adjustment for UKIP to avoid looking completely incompetent.


10 years ago

Great article, and fantastic to get such an honest view from a pollster. The referendum polls were something of a nadir for opinion polling, with the mainstream press suggesting skewed polling cost the UK a lot in last-minute promises of devolution to the Scots to tempt them to vote No. 'Big data' was of no use either, with no previous comparable events to draw on. And social analytics provided even worse data than the polls, suggesting a very strong Yes result. Sadly, the media miss these nuances, and need saving from their instinct to jump instantly for headlines without any context or background to polls and data.
