OPINION
11 August 2010

Online surveys don't have to turn respondents off

In the July issue of Research, after spending a month as a fake respondent, Robert Bain wrote that the market research industry still has work to do to make online surveys appeal to respondents. That’s not news to Deborah Sleep of Engage Research, which has conducted its own study into how to make online surveys more appealing and more effective.

How many times a day, on average, do you receive an email inviting you to participate in an online survey? And of those you actually go online to fill in, how many do you stick with through to the end, and how many do you bail out of halfway through because the survey is so dull you realise you’ll simply never get those fifteen minutes back?

Me too.

Therein lies a major problem for marketers looking to gather opinion using online surveys. The internet is a double-edged sword. Online surveys remain an important, versatile and affordable way to find out what people think, but if brands and agencies are serious about engaging consumers in this way, their approach to the process needs to be significantly more creative.

Research conducted by Engage in collaboration with GMI showed that bored respondents are rushing through online questionnaires, answering in patterns just to get through the survey quickly and ultimately providing far less usable data.

We believe – and the research supports this – that response rates to online surveys can be significantly improved by changes to the structure of questionnaires, through innovative question mechanics, the use of visuals and animation, and the deployment of imaginative techniques such as role play. Our findings indicate that, with these changes, completion rates can be improved by more than 20%, alongside a significant improvement in the volume and quality of data. Something as fundamental as grabbing a respondent’s attention at the start of the survey with an animated introduction, for instance, can lead consumers to spend up to 80% more time answering follow-up questions.

Employing various role-play techniques to motivate respondents was found to increase evaluation time by 70%, double the number of words used in responses and significantly increase the number and quality of ideas and suggestions. One example that worked extremely well was asking consumers to evaluate the impact of advertising from the perspective of someone working for an advertising agency. Another was asking them to take on the role of the boss of a company launching a new product. Both gave respondents a more interesting context in which to answer questions, and the results improved accordingly.

For questions that ask consumers to compose a written answer rather than tick boxes in a grid, we found that showing examples of how previous respondents had answered could double the volume of responses, with average word counts jumping from 25 to 50. That can have a major impact on what the survey reveals.

If there are a significant number of questions to answer, it also helps to include a brief, light-hearted interlude partway through the survey, which provides relief and prevents respondents from becoming bored.

Grid questions are a simple way to gather opinion. The problem is that respondents find them utterly boring: in our research they caused 80% more drop-out than other question types and were rated a lowly 3/10 for respondent enjoyment. Respondents try to speed through grid questions, and the effect is that their answers flatline.

Creating more interesting designs and layouts makes the process far more engaging, radically improving interest and enjoyment (from 3/10 to 8/10) and generating significantly more granular data.

Question wording also requires careful consideration. Many online surveys stick to the conventions of face-to-face interviewing, where respondents cannot see a visual scale and every option must be spelled out, making the whole process of reading instructions cumbersome. Long sets of instructions have no measurable influence on the balance of answers to a question but, more importantly, a raft of unnecessary instructions acts like the boy who cried wolf: reading one unnecessary instruction discourages respondents from bothering with later ones, which could be more important. How significant is this? By the end of a typical survey, up to 50% of respondents are no longer reading instructions properly.

Social influence techniques can also be used to great effect in setting an expectation. Used carefully, an instruction to “spend no more than …” can multiply the time a respondent spends answering a question by a factor of five. In some instances, it may be worth making a question optional. Prefacing a question with the phrase “Would you mind” was found to be a successful means of improving the volume and quality of data and, in our experiments, fewer than 10% of respondents elected to skip the question.

Our approach to online surveying needs to change. It is still too heavily influenced by the conventions of face-to-face interviewing, where there are huge set-up and organisational constraints. Many of these rules can, and need to, be rethought for online surveys if we are to avoid an ongoing decline in response rates and data quality.

Online consumer research can reach far greater numbers of people, far more quickly and at lower cost, and can provide the intelligence that will influence everything from product design to pricing, from channels to market through to marketing messages. But extracting the most valuable information requires more thought and engagement than ever before.

The lesson marketers must heed is that their money could yield a far better return if they overhaul their approach to engaging the interest of consumers.