What’s the background to Savanta’s panel app, YourVue?
Essentially, we started it to provide better-quality insights for our clients through better-quality data. Firstly, we wanted better control over the way we access research participants, largely because in the last three to five years we’ve seen an increase in fraudulent panellists, bots and AI in research.
For us, one of the best ways to address that was to have our own panel and control the process ourselves – who joins, and how we make sure they are who they say they are – and to stop AI bots. In the last two years we’ve seen an increase in what we believe are [respondents using] large language models, ChatGPT etc taking part in market research. You can always see when it’s a language model because the answers are too perfect – the grammar and spelling are perfect.
The second point, linked to data quality, is the ability to target and recruit people on to the panel that reflect the real-world population. Over the past 10 years, I’ve seen a race to the bottom in terms of the incentives offered and the types of people you get on to panels because the experience isn’t great. A large part of what we’re trying to do is to offer a better research panellist experience and to hopefully get people that reflect the real-world population doing surveys rather than people who want to earn some cash and are ‘professional respondents’.
It was the brainchild of a few of us in the business – Roger Perowne, David Turner and myself. It launched in beta in October last year, and since then we’ve been primarily focused on improving the app in areas including stability, quality-assurance tools and user experience. We’re not a product development company in the traditional sense, so there’s been a lot of work to upskill ourselves, get the tech team involved and think about this in a consumer-product way.
The last few months have been about the app, but we’re now looking to recruit more members on to the panel – we have around 15,000 at the moment. About one in eight surveys at Savanta is completed by YourVue respondents, and by the end of the year we want around 40-50% to be completed by YourVue. Part of that will depend on getting more people on to the panel – we have ambitions to grow to around 50,000 by the end of the year. We’re seeing average response rates of about 30-40% on the panel.
What do you think is driving that higher response rate?
I think it’s partly because it’s an app. Panels are traditionally email-based. I used to run a youth panel at YouthSight and we found that email was becoming a less efficient way to reach people and apps are becoming more ubiquitous in terms of the way in which people engage with the digital world today.
Secondly, we’re very conscious that the research we do through the app should give people a good experience. What I mean by that is we have some hard rules around what type of research can be done through the app – a survey can’t be more than 25 minutes long, and we don’t allow a survey to have a screen-out rate of more than 80%, which means the incidence rate must be 20% or above.
What’s been the main challenge with giving people a good experience when they’re using the app?
There’s an extrinsic value and an intrinsic value to the experience. From an extrinsic perspective, it’s how much you are getting back from doing the survey in hard, financial terms – in market research, incentives have been getting smaller and smaller. Set that against the fact that in the last five to 10 years the amount of things you can do on your phone has increased – TikTok, Instagram and doom-scrolling, for example – and I think that’s had a huge impact on engagement levels within panels, which is something we’re trying to address. Linked to that is the ability to actually earn the reward.
From an intrinsic standpoint, it’s about feeling that what you are doing is valuable in terms of the insights you are giving. We try to make sure that the context of the research relates to the person doing the survey. When I was at YouthSight, a lot of the research we did with the youth panel was related to education, for example – they had a stake in the outcomes. A large part of that comes down to making sure surveys target the right people. If you can target car buyers interested in Volvos and the survey is about Volvo buyers, the participants are going to be more engaged.
What do you see as the more pressing challenge for the industry right now – data quality or participant experience?
It’s all linked. In the last year, the supply side of the industry has been pushed hard on cost. A lot of clients have pushed back on budgets, and as a result something gets squeezed. Invariably, what gets squeezed is the last link in the chain – the respondents, the people taking part in the surveys. Because the industry is offering lower incentives, the people you find on panels either don’t reflect the real-world population, because the incentives and experience are so poor, or are using bots and AI tools to do the survey because it saves them the time of doing it themselves.
Cost has had an impact on quality, which has had an impact on respondent experience. Technology is becoming more ubiquitous; more and more people are able to use technology to complete research surveys without actually taking the survey themselves, and that invariably affects the quality we see. Since Covid, the industry has been attacked by people who see this as an easy way to make money without putting in much effort – using AI and LLMs to run, say, 20 different profiles on a panel and answer questions across all of them.
How has the feedback been from clients about this new approach and your aims to source more of your audiences from the app?
Because the numbers have been so small while we’ve been in the beta phase, we’ve been combining our panel with the external sources we use in the market, so clients haven’t really been able to distinguish between the two. However, what we have been hearing is that a lot of clients want a supplier that doesn’t offer the same thing they’re getting from everyone else. A lot of suppliers in the market use marketplaces and the same sample sources, so if you’re a client or a research agency getting most of your sample from the same place across all the suppliers you work with, you get a bit worried that you’re duplicating data.
We don’t recruit through affiliates or partners; we have recruited organically through word of mouth with the app, as well as social media advertising campaigns, which is working quite well for us.
What steps are you taking to achieve your ambitions for growth?
Money does help to get the numbers! So, that’s number one. Number two, a large part of what we’re going to do is use a peer-to-peer recruitment method – leveraging the panellists we have to get people to join on our behalf. We hope we can reach more people from the real-world population who might not typically take part in research, and who might be more engaged, or find it more credible, when the invitation comes from someone they know rather than from a random company they’ve never heard of. With a recommendation from a friend, partner or family member, you might be more likely to take part than through an ad you’ve seen online.
Part of the reason for pushing the app is the ability to access people who wouldn’t normally take part in research. I think that’s a big challenge for the industry. We don’t expect to answer all of these challenges through the app, but if we can eat into some of the problems, that’s better. We just want to take a proactive approach.
Will research agencies have to decide between real or synthetic data, or can they effectively blend the two?
When online research first launched and there was a move away from telephone and face-to-face surveys, a lot of people were apprehensive: there were concerns over data quality and fears it would have a negative impact on the industry. I feel we’re at the same point with synthetic data. After a while, I do think it will become more accepted. However, the key to it all is that you still need insights from real people. Synthetic data essentially models what people would normally say, and to build those models you still need answers – some sort of data set from a real respondent who has taken part in a survey. So I think what you’ll end up seeing is a combination of both.