Respondent engagement and survey length: the long and the short of it
US— Long online surveys lead to less engaged survey takers, right? That’s the conclusion of a new white paper from Survey Sampling International (SSI). But MarketTools has also published its own analysis of the factors affecting respondent engagement, in which it argues that the reality is less clear cut.
SSI’s work is a warts-and-all repeat of a study it carried out in 2004, comparing a survey of more than 20 minutes in length with a shortened version to test for fatigue effects and their impact on response quality.
Five years on, SSI said, little has changed. On both occasions, respondents were presented during the course of the survey with a sliding-scale question whose slider was preset at the mid-point, meaning a respondent could hit the ‘next question’ button without moving the slider and still leave some data behind.
SSI experimented with placing this question at various positions in the survey and found that, generally speaking, in both 2004 and 2009, “non-use of the slider increases as the [long] survey progresses”. For the short survey, it said, “the effect is less dramatic and less clear”.
It found a similar pattern when comparing the length of answers to an open-ended question. Again, the question was positioned at various points throughout the survey, and again SSI noted that “the number of characters typed decreases as the open question moves back in the [long] survey”. Comparing 2009’s results with 2004’s, it found the number of characters typed per open question had fallen in the intervening years. With the short survey, the pattern was again less clear.
SSI’s global knowledge director and white paper author Pete Cape wrote: “In both 2004 and 2009, the long survey proved itself too long. It fatigued the respondent and led to satisficing behaviour.” Reiterating 2004’s findings, he said: “If researchers work to keep surveys shorter, it will not only help ensure response quality, but it will also make for more motivated and responsive respondents.”
Up to a point, says MarketTools. In its white paper, the firm looked at the impact of various design variables on a number of engagement metrics, such as survey rating (as scored by the respondent), the rate of abandonment, the first incidence of speeding and the percentage of pages sped through.
According to MarketTools: “While survey length proved to be generally predictive of most respondent engagement measures, there was wide variation in the design variables that were most influential in driving various measures of engagement.”
Taking the survey rating measure as an example, the company found ‘elapsed time per page’ to be the most predictive variable for that measure, yet one of the least important when it came to predicting when, and to what extent, respondents would start speeding through the survey.
“The findings reveal that engagement is driven by a complex interaction among design variables,” says MarketTools. “There is no axiom that applies in all cases, such as ‘surveys that require more than 20 minutes result in poor respondent engagement’.”