NEWS | 20 September 2010
AUSTRIA — Rushing through long or difficult questions in online surveys harms data quality more than any other type of respondent behaviour, a new study suggests.
Psychologists from the University of Vienna and the University of Deusto in Bilbao, Spain, tested a JavaScript tool which tracked respondents’ activity while they completed a 23-question online survey about their use of instant messaging (IM).
The researchers assessed the quality of the data collected by checking results for consistency, and by comparing demographic information from respondents who shared their IM user names with information found in online address books.
As well as looking at the plausibility of answers and non-response, the JavaScript tool allowed the researchers to study behaviour such as excessive clicking or mouse movements, time spent considering questions and whether respondents changed their minds before settling on an answer.
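The behaviours listed above can be captured with a small amount of client-side code. Below is a minimal sketch of such paradata recording; the class name, event types and wiring are illustrative assumptions, not the authors' actual UserActionTracer implementation.

```javascript
// Minimal sketch of client-side paradata capture: log timestamped events
// per question, then derive time-on-question, answer changes and click
// volume. All names here are illustrative, not from the study's tool.
class ParadataRecorder {
  constructor(now = () => Date.now()) {
    this.now = now;              // injectable clock, handy for testing
    this.events = [];            // { type, question, time } entries
  }
  record(type, question) {
    this.events.push({ type, question, time: this.now() });
  }
  // Milliseconds between the first and last event logged for a question.
  timeOnQuestion(question) {
    const ts = this.events
      .filter(e => e.question === question)
      .map(e => e.time);
    return ts.length ? Math.max(...ts) - Math.min(...ts) : 0;
  }
  // 'answer' events beyond the first suggest the respondent changed their mind.
  answerChanges(question) {
    const n = this.events
      .filter(e => e.question === question && e.type === 'answer').length;
    return Math.max(0, n - 1);
  }
  // Raw click volume, to help spot excessive clicking.
  clickCount(question) {
    return this.events
      .filter(e => e.question === question && e.type === 'click').length;
  }
}

// In a browser the recorder would be wired to form controls, e.g.:
// document.querySelectorAll('input').forEach(el =>
//   el.addEventListener('change', () => recorder.record('answer', el.name)));
```

The injectable clock keeps the recording logic separate from the browser environment, so the derived measures can be checked without a DOM.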
The biggest problem, they found, was respondents answering without reading questions properly. “Introduction texts were rarely read thoroughly,” they said, “and semantic differentials [matrix questions] showed higher levels of clicking through than other questions.”
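The pattern reported above suggests a simple screening heuristic: flag a matrix question as likely "clicked through" when a respondent's median per-item response time is implausibly short. The sketch below is one possible implementation; the 1500 ms cut-off is an assumed illustration, not a figure from the study.

```javascript
// Hedged sketch: flag likely click-through on a matrix (semantic
// differential) question from recorded per-item response times.
// The default threshold is an illustrative assumption.
function flagClickThrough(itemTimesMs, thresholdMs = 1500) {
  const sorted = [...itemTimesMs].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  const medianMs = sorted.length % 2 === 1
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
  return { medianMs, suspect: medianMs < thresholdMs };
}
```

Using the median rather than the mean keeps one slow, thoughtful item from masking a run of near-instant clicks.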
On the basis of their findings the authors made four recommendations for questionnaire designers, among them keeping introductory texts as short as possible and asking a single question per page.
The researchers said the project had shown that their tool, the UserActionTracer, can be implemented easily to detect problematic items in surveys. Further research, they said, might include looking at the positioning of radio buttons, and analysing the speed and duration of mouse movements.
The study was conducted by Stefan Stieger of the University of Vienna and Ulf-Dietrich Reips of the University of Deusto, and is published in the journal Computers in Human Behavior.
6 Comments
Ray Poynter
12 years ago
Good to see another study in this area, but I am surprised to see it treated as news. Basing findings on one study can be risky, for example how dependent are the findings on the culture and characteristics of the respondents and the topic. I would suggest that anybody working in this area should read Mick Couper's book 'Designing Effective Web Surveys' for a more complete treatment of the area. Of the four recommendations, the first three would be familiar to any reader of works in the area. The fourth one, about asking a single question on the page, is one where previous research appears to be less clear, and the addition of one more finding does not make it less clear. My take (based on reading research and conducting research) is that generally asking one question per page is safer and easier, but it can reduce engagement and, if the second question is a corollary of the first, can make the task harder.
Duncan Stuart
12 years ago
The finding about asking one question per page refutes what I hear from respondents, at least those who take the trouble to contact me. (I put my contact details on my surveys.) It appears that having a single question per page just creates the feeling of endlessness - another one, another one - and every click of the NEXT PAGE button is another opportunity to bail out completely. If one question leads naturally on to the next - then why not put them together while the respondent is in the appropriate cognitive zone? Blanket rules (keep intro texts as short as possible) are potentially too dogmatic. There are moments, as in a face-to-face interview, where one may need to stop, change gear, re-set the mood, shift into a different area of questions: and for these reasons some savvy "chat" can do the job nicely. For sure, it is important for questionnaire writers to start viewing the whole process as a "respondent experience" and to start showing some empathy for the poor sod who, hitherto, has faced one too many matrices from hell.
Dan Culshaw
12 years ago
Fair points made about the single question per page issue, but the rest is just common sense, isn't it? I feel that we really should be well past these basic rules by now as an industry and into the realms of how to design surveys that are as engaging/effective/intuitive/interesting as possible. Unfortunately, there seems to be a lot of online research that just ignores (or does not care about) common sense and the respondent experience. How many studies like this do we need to finally "get it"?
Jason More
12 years ago
Good piece of confirmation research, but I have to agree with Ray in being surprised this is regarded as 'news'. I'm actually a lot more interested in their UserActionTracer technology, as this sounds like a new approach to survey evaluation (at least I've not heard of it before).
Pete Cape
12 years ago
I was (pleasantly) surprised to find this in "news"; perhaps it's more "current affairs". To Dan's point, I don't really know the answer but have heard that you need to be told something 9 times to "get it". As someone who just reviewed a brand image questionnaire running to 77 numbered questions over 50 pages of a Word document, I'm surprised the researchers found what they found in a "mere" 23 questions...
mihai
12 years ago
see this as well: http://blog.vovici.com/blog/bid/22341/Use-Multiple-Questions-per-Page-of-a-Web-Survey I think there should be a balance between number of pages vs. number of questions per page. Any extreme approach will most probably fail.