NEWS | 31 May 2023

Poor survey design will drive respondents away, ASC hears


Poor survey design is damaging the future of the market research industry, with respondents increasingly averse to badly made and overlong surveys, last week’s Association for Survey Computing (ASC) conference on survey fraud, data quality and best practice heard.


Speaking at the conference on 25th May, Arno Hummerston, founder at Amplify MR, said that fraud and respondent experience were both pressing issues, but felt that dealing with the former had overshadowed the latter. “When we talk about fraud, it feels like a pretty low bar that we expect people to not be fraudulent, and therefore that equates to quality,” he said.

“I don’t think we should necessarily be saying our panels are good quality if they’re just not fraudulent. Just because you are a real person doesn’t necessarily mean you are giving high quality data.”

Hummerston ran through a number of examples of bad survey design, and warned that poor experiences were detrimental for the industry. “Respondents, the people who do the work for us to pay the bills, they can’t be impressed with this,” he said. “Why would you stay on a panel if this is going to happen?”

He said industry standards were required, and recommended adopting profile questions as standard, and removing screening questions and repeated questions. “It can’t all be about efficiency, as that’s a very short-sighted perspective,” Hummerston explained, adding that companies that addressed the participant experience would still be standing in 10 years, while those that did not tackle the problem would be out of business.

A better experience
In a panel session, Debrah Harding, managing director at the Market Research Society, said it was paramount to improve respondent experience. “Most members of the public can live long and happy lives never taking part in market research,” she said. “Getting people to want to take part in research is a challenge. If we don’t give them a good experience and if they don’t feel positive after they have been through a survey, they are less likely to do it in the future.”

On the other side is increasing fraud, Harding added, and the need to ensure researchers are speaking to genuine people. “By trying to confirm the identity and confirm the people you are speaking to are genuine, you are more likely to create an experience that pushes people away,” she explained. “It is how to reconcile those push and pull factors that keeps me awake at night.”

The issue was survey design. “Surveys are too long, they are too boring and they are asking questions that don’t necessarily need to be asked, with data that is not necessarily being used.” Bad survey design is still creeping through, Harding stated, and it needs to be addressed.

“When you think about a participant, think about your mum,” she concluded. “How would you like your mum to be treated if it was them answering this question? If you treat them how you’d want your mum to be treated, then we might start to have better conversations with participants.”

AJ Johnson, board member at the ASC, told the panel that researchers had to make improvements to ensure consumers would be willing to hand over personal details. “Consumers have rightfully understood the value their data should have and therefore seek a fair exchange for the participation they give,” he said. “It is not a done deal they will give us this data that helps us to identify them.

“Survey design is not necessarily getting much better, and we are asking people to do things and not always giving good rewards for doing them. What is in it for the participant to give us that level of data?”

Solutions needed
Jon Puleston, vice-president, innovation at Kantar Profiles, said in a separate session that survey design was not eliciting good responses from respondents. “The way that people answer survey questions and the way we bias questions we ask in surveys is a big problem,” he explained.

“People tell white lies, they are embarrassed to tell you the truth in the survey; there are degrees of social compliance and self-delusion. As researchers, we ask leading questions, difficult questions and unrealistic questions. That melee of problems we are confronted with is quite significant, if you unpick a typical survey. So many of the everyday questions we ask are biased in some way.”

Puleston said humour can increase honesty and encourage people to be themselves. “The number one cause of bad data is bored and frustrated respondents,” he added. “Survey length is clearly an issue. The longer the survey, the more you frustrate respondents and the more people will drop out.”

He added that the tone and spirit of a survey can prompt dropouts, as can repetition. This was making some respondents less engaged with surveys, and Puleston said that in some cases there was up to 40% familiarity with fake brands among survey participants, and 20% on average claim to be familiar with any advertisement they are shown.

“So much of this problem is in our hands as market researchers to fix by making surveys less boring,” Puleston added. “There is a quiet rebellion going on among panellists. Respondents are becoming resistant to the dry platitudes of research we ask them. A lot of problems could be fixed with better operational procedures.”

He concluded that satisfaction surveys about the survey experience could help improve standards, alongside general good-practice measures on trap questions and a rethink of incentives for good responses.

“Our industry is operating on a 66% failure rate – that’s shocking,” Puleston said. “What industry survives with those types of failure rates? We need to do something about that. Someone who drops out of a survey is 50% less likely to ever do another survey.”