NEWS
20 September 2010

Keep online survey questions short and simple, study advises

Europe Features Technology

AUSTRIA — Rushing through long or difficult questions in online surveys harms data quality more than any other type of respondent behaviour, a new study suggests.

Psychologists from the University of Vienna and the University of Deusto in Bilbao, Spain, tested a JavaScript tool that tracked respondents’ activity while they completed a 23-question online survey about their use of instant messaging (IM).

The researchers assessed the quality of the data collected by checking results for consistency, and by comparing demographic information from respondents who shared their IM user names with information found in online address books.

As well as looking at the plausibility of answers and non-response, the JavaScript tool allowed the researchers to study behaviour such as excessive clicking or mouse movements, time spent considering questions, and whether respondents changed their minds before settling on an answer.
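The article does not describe the tool's implementation, but client-side tracking of this kind can be sketched in a few lines of JavaScript. The class and method names below are illustrative assumptions, not the authors' actual UserActionTracer API:

```javascript
// Minimal sketch of client-side respondent tracking, loosely inspired by
// the UserActionTracer described above. All names here are hypothetical.
class ActionTracker {
  constructor() {
    this.events = []; // each entry: { type, question, value, t }
  }

  // In a browser this would be called from event listeners, e.g.:
  //   input.addEventListener('change',
  //     e => tracker.record('answer', 'q1', e.target.value));
  // The timestamp parameter is overridable to make the logic testable.
  record(type, question, value, t = Date.now()) {
    this.events.push({ type, question, value, t });
  }

  // How often the respondent changed an answer before settling on one:
  // any answer event beyond the first for a given question.
  answerChanges(question) {
    const answers = this.events.filter(
      e => e.type === 'answer' && e.question === question);
    return Math.max(0, answers.length - 1);
  }

  // Milliseconds between the first and last recorded event for a question,
  // a rough proxy for time spent considering it.
  timeOnQuestion(question) {
    const ts = this.events
      .filter(e => e.question === question)
      .map(e => e.t);
    return ts.length ? Math.max(...ts) - Math.min(...ts) : 0;
  }
}
```

Derived metrics like these could then be submitted alongside the answers to flag, for instance, questions that respondents clicked through implausibly fast.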

The biggest problem, they found, was respondents answering without reading questions properly. “Introduction texts were rarely read thoroughly,” they said, “and semantic differentials [matrix questions] showed higher levels of clicking through than other questions.”

On the basis of their findings, the authors recommend that questionnaire designers:

  • keep introductory texts as short as possible
  • only use matrix questions if absolutely necessary
  • avoid questions that require calculations or other difficult tasks
  • avoid putting more than one question on each page

The researchers said the project had shown that their tool, the UserActionTracer, can be implemented easily to detect problematic items in surveys. Further research, they said, might include looking at the positioning of radio buttons, and analysing the speed and duration of mouse movements.

The study was conducted by Stefan Stieger of the University of Vienna and Ulf-Dietrich Reips of the University of Deusto in Bilbao, Spain and is published in the journal Computers in Human Behavior.

@RESEARCH LIVE

6 Comments

10 years ago

Good to see another study in this area, but I am surprised to see it treated as news. Basing findings on one study can be risky: for example, how dependent are the findings on the culture and characteristics of the respondents, and on the topic? I would suggest that anybody working in this area read Mick Couper's book 'Designing Effective Web Surveys' for a more complete treatment. Of the four recommendations, the first three would be familiar to any reader of works in the area. On the fourth, about asking a single question per page, previous research appears to be less clear, and the addition of one more finding does not make it any clearer. My take (based on reading research and conducting research) is that asking one question per page is generally safer and easier, but it can reduce engagement, and if the second question is a corollary of the first it can make the task harder.


10 years ago

The finding about asking one question per page contradicts what I hear from respondents, at least those who take the trouble to contact me. (I put my contact details on my surveys.) It appears that having a single question per page just creates a feeling of endlessness (another one, another one), and every click of the NEXT PAGE button is another opportunity to bail out completely. If one question leads naturally on to the next, why not put them together while the respondent is in the appropriate cognitive zone? Blanket rules ("keep intro texts as short as possible") are potentially too dogmatic. There are moments, as in a face-to-face interview, where one may need to stop, change gear, re-set the mood, or shift into a different area of questions, and for these some savvy "chat" can do the job nicely. For sure, it is important for questionnaire writers to start viewing the whole process as a "respondent experience" and to show some empathy for the poor sod who, hitherto, has faced one too many matrices from hell.


10 years ago

Fair points about the single-question-per-page issue, but the rest is just common sense, isn't it? As an industry we really should be well past these basic rules by now and into the realms of how to design surveys that are as engaging, effective, intuitive and interesting as possible. Unfortunately, there seems to be a lot of online research that simply ignores (or does not care about) common sense and the respondent experience. How many studies like this do we need before we finally "get it"?


10 years ago

Good piece of confirmation research, but I have to agree with Ray in being surprised this is regarded as 'news'. I'm actually a lot more interested in their UserActionTracer technology, as this sounds like a new approach to survey evaluation (at least I've not heard of it before).


10 years ago

I was (pleasantly) surprised to find this in "news"; perhaps it's more "current affairs". To Dan's point, I don't really know the answer, but have heard that you need to be told something nine times to "get it". As someone who just reviewed a brand image questionnaire running to 77 numbered questions over 50 pages of a Word document, I'm surprised the researchers found what they found in a "mere" 23 questions...


10 years ago

See this as well: http://blog.vovici.com/blog/bid/22341/Use-Multiple-Questions-per-Page-of-a-Web-Survey I think there should be a balance between the number of pages and the number of questions per page. Any extreme approach will most probably fail.
