
Opinion | 17 December 2014

Online surveys need handling with care


Respondents to online surveys are often used as a barometer of broader public opinion. Steve Abbott looks at when this works and when it can lead to problems.

There is a powerful group of people who influence governments, companies and public opinion. They have always had influence, but recently the make-up of this group has changed. We know who they are, but we don’t know whether their views and attitudes reflect those of the population as a whole.

Technology has enabled this group to be canvassed for their views far more quickly and economically than ever before, so their influence is growing.

We are talking about the subset of the population who respond to online surveys.

Governments and companies have always relied on surveys to gauge people’s views, using the results to shape policy, product development, marketing campaigns and so on. Results are weighted to ensure that the sample is representative of the population as a whole in terms of age, gender and social grade.
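To make that weighting step concrete, here is a minimal sketch of cell weighting (post-stratification) in Python. The age bands, target shares and sample counts are invented for illustration; they are not BPS figures.

```python
# Minimal sketch of cell weighting (post-stratification).
# The age bands, target shares and sample counts are illustrative only.

# Known population shares for one weighting variable (e.g. age band)
population_targets = {"15-34": 0.32, "35-54": 0.34, "55+": 0.34}

# How the achieved sample actually split across those bands
sample_counts = {"15-34": 180, "35-54": 420, "55+": 400}
n = sum(sample_counts.values())

# Each respondent in a band gets weight = target share / achieved share,
# so over-represented bands are scaled down and under-represented ones up.
weights = {
    band: population_targets[band] / (sample_counts[band] / n)
    for band in sample_counts
}

for band, w in weights.items():
    print(f"{band}: weight {w:.2f}")
```

In practice, weighting on several variables at once (age, gender and social grade together) is usually done by raking (iterative proportional fitting) rather than by a single cross-tabulation like this.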

The problem is that there is no means of knowing whether the views and attitudes of the people who respond to online surveys are the same as those of the people who don’t. Some organisations have run concurrent surveys online and offline to overcome this problem, but that is both expensive and time consuming, so it is rarely done. In addition, the results tend not to be published, as they contain valuable proprietary information.

Understanding the New Influencers

The difference between online survey responders and the population at large was brought into focus during the run-up to the Scottish Referendum.

A much-publicised online poll, released just before referendum day, showed the ‘Yes’ vote in the lead. Much credence was given to this poll because, historically, online polls had been very successful at predicting election results.

The referendum result, as we now know, was that the ‘No’ vote triumphed. So what went wrong?

One of the things The British Population Survey has been doing over the past three years is monitoring whether people respond to online surveys and, if so, what kind. The results can be analysed not only by comprehensive demographic variables but also by many attitudinal ones.

What this provides is a quantifiable link between three populations: those who respond to online surveys, those who have internet access but do not respond to online surveys, and those who do not have internet access.

More ‘Active’

One of the key findings is that people who respond to online surveys tend to be more ‘active’, in the broadest sense, than those who don’t.

This single insight helps to explain the failure of the online survey to predict the result of the Scottish Referendum. Basically, online survey responders are more likely to actually turn out to vote. This has been an important factor in elections where turnout is low (UK general election turnout: 59% in 2001, 61% in 2005, 65% in 2010).

The problem in the Scottish Referendum was that turnout was 84.5%. With such a high turnout, the tendency for online survey responders to actually vote mattered far less. In other words, the electorate itself was more representative of the population.
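A toy calculation shows why. Suppose, purely hypothetically, that survey responders lean ‘Yes’ more than everyone else and are also more likely to vote; every number below is invented for illustration, not drawn from the referendum data.

```python
# Toy illustration (invented numbers): why a responder skew matters less
# when turnout is high. Assume online survey responders are 10% of the
# population and lean 'Yes' more than everyone else does.
responder_share = 0.10
yes_among_responders = 0.55
yes_among_others = 0.45

def actual_yes_share(turnout_responders: float, turnout_others: float) -> float:
    """'Yes' share among those who actually vote, given each group's turnout."""
    yes_votes = (responder_share * turnout_responders * yes_among_responders
                 + (1 - responder_share) * turnout_others * yes_among_others)
    all_votes = (responder_share * turnout_responders
                 + (1 - responder_share) * turnout_others)
    return yes_votes / all_votes

# Low overall turnout: responders punch above their weight in the result.
print(f"low turnout:  {actual_yes_share(0.90, 0.55):.3f}")
# Referendum-style turnout: almost everyone votes, so the electorate
# looks like the population and the responder skew shrinks.
print(f"high turnout: {actual_yes_share(0.95, 0.84):.3f}")
```

A poll of responders alone would report 55% ‘Yes’ in both scenarios; the actual result sits closer to the population figure the higher the turnout climbs.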

Stronger Opinions

Online survey responders tend to have stronger opinions than the population in general. Taking financial optimism as an example, online survey responders are 1.65 times more likely than the population to answer ‘much better’ on the five-point scale (much better, better, same, worse, much worse), and less likely to say ‘the same’.
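The comparison being made is a simple ratio of shares. In the sketch below, the two response distributions are invented so that the ‘much better’ ratio matches the 1.65 figure quoted above; they are not BPS data.

```python
# Hypothetical response shares on the five-point financial-optimism scale.
# The distributions are invented so that 'much better' gives a 1.65 ratio,
# matching the figure quoted in the article; they are not BPS data.
responders = {"much better": 0.33, "better": 0.25, "same": 0.22,
              "worse": 0.13, "much worse": 0.07}
population = {"much better": 0.20, "better": 0.24, "same": 0.32,
              "worse": 0.16, "much worse": 0.08}

for answer in responders:
    ratio = responders[answer] / population[answer]
    print(f"{answer}: responders are {ratio:.2f}x as likely")
```

Note how the same data show both effects at once: the ratio is above 1 at the extremes (‘much better’) and below 1 in the middle (‘same’), the signature of a group with stronger opinions.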

This could simply indicate that they are more optimistic, which is interesting in itself. But they are also more volatile in their views, which complicates matters: a single ‘snapshot’ of the online survey population can be misleading.

Conclusion

Online surveys are relied upon by all kinds of organisations to guide decision making across a wide range of subjects, but the science is relatively immature. These figures illustrate some of the differences which need to be taken into account when interpreting findings, particularly when trying to extrapolate to the online population as a whole. Extrapolating to the ‘offline’ population is another matter entirely. Online survey responders currently represent approximately 10% of the population, and the offline population a further 15%, leaving roughly 75% who are online but do not respond to surveys. Perhaps the responders are the most influential minority in Britain today.

Steve Abbott is director of The British Population Survey

(Note: figures are collected by The British Population Survey through face-to-face, in-home interviews with a population-representative sample of 6,000 to 8,000 adults aged 15+ per month. Data collection commenced January 2008; total sample size = 550,044.)

4 Comments

9 years ago

I share your concerns, and IJMR, of which I'm Editor, is holding a debate at the MRS conference in March on online samples, with Reg Baker, Doug Rivers and Corrine Moy as the speakers.


9 years ago

A mix of methodologies is a key way to ensure representation. It is more complex but it always gives a richer picture.


9 years ago

We need to also remember that innumerable sampling AND non-sampling errors are a standard part of the research process, whether you're conducting surveys, focus groups, behavioural studies, or any other type of research. Research results must be used to guide, inform, and motivate rather than to carve a result into stone.


9 years ago

I am not so sure about the premise that respondents to political questions in online panels differ that much from the rest of the population. And the reason is that political questions are only a minor part of what online panels do. I used to be a YouGov member, and virtually all the questions I ever got asked were about branding. I still know quite a few people who are on the YouGov panel, not because they have strong views about any political issues, but because they like answering the branding questions (and get some prezzies for doing so). I have no idea whether this is representative, but given that there are enough studies showing that following politics is a minority sport (compared to, say, shopping, or following sport), I would be surprised if the majority of panel members are not on a panel for other, non-political, reasons.
