FEATURE | 29 July 2015
As client-side marketing and insight teams push insight agencies to provide smarter solutions in less time and for less investment, one area ripe for innovation is the way respondents are acquired.
Is there a faster or cheaper way to get a good quality representative sample? With so many potential respondents already engaged on social platforms, from social media to gaming sites, would it be viable for market researchers to take samples from these sources and these sources alone, instead of using online panels?
At Lab42 we wanted to find out whether sourcing sample from these alternative channels is a smart choice for online research. We wanted to see not only whether social media sources were faster but, most importantly, whether the data collected via these sources matches traditional panels in behavioural and attitudinal representativeness.
For the purpose of the study, we generated samples via our proprietary social media methodology (where we obtain respondents for online surveys via social networks, games and applications) and from a traditional panel provider, both representative of the US Census by gender, age, region, household income and ethnicity. We also set up quotas for each category to ensure the two populations were demographically identical, and the quota groups were held constant for both sample sources.
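For readers who think in code, here is a minimal sketch of how census-based quota cells of this kind could be defined and enforced so that both sources fill identical quota groups. It is illustrative only: the category names, targets and accept logic are assumptions, not Lab42's actual system.

```python
# A minimal sketch (not Lab42's actual system) of census-based quota cells
# applied identically to two sample sources. All targets are illustrative.

from collections import Counter

# Hypothetical quota targets, as respondent counts out of a 500-person sample,
# loosely mirroring US Census proportions for a few of the categories.
QUOTAS = {
    "gender": {"male": 245, "female": 255},
    "age": {"18-34": 150, "35-54": 175, "55+": 175},
    "region": {"northeast": 90, "midwest": 105, "south": 190, "west": 115},
}

def accept(respondent, counts):
    """Accept a respondent only if every quota cell they fall into still has
    room under its target; otherwise screen them out."""
    for category, value in respondent.items():
        target = QUOTAS.get(category, {}).get(value)
        if target is None or counts[(category, value)] >= target:
            return False
    for category, value in respondent.items():
        counts[(category, value)] += 1
    return True

# The same QUOTAS dict is applied to both the social media source and the
# traditional panel, keeping the quota groups constant across sources.
counts_social, counts_panel = Counter(), Counter()
accept({"gender": "female", "age": "18-34", "region": "west"}, counts_social)
accept({"gender": "female", "age": "18-34", "region": "west"}, counts_panel)
```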
To compare the samples, the questionnaire focused on determining the attitudes and behaviours of respondents with specific questions related to media consumption, motivations behind completing online surveys, purchasing and shopping habits, and personal beliefs.
Right from the start, we saw that social media sampling brings a benefit in the speed of filling quotas for certain demographics. By using social media, e-commerce and gaming sites to fill the sample, we were able to reach younger (under-35) respondents more quickly. The rate of filling the 35–55 age groups was similar with both methodologies. In addition, data integrity measures were stronger among respondents acquired through social media.
In the majority of cases the answers from the two samples were virtually identical, and in almost all cases the order rankings were identical, indicating that the two samples are attitudinally and behaviourally comparable. Both groups held similar opinions about survey taking, were equally motivated to take part and preferred the same types of incentives.
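As a rough illustration of how such a comparison can be checked, the sketch below runs a chi-square test on one question's answer distribution and a rank correlation on the option orderings. The response counts are invented for the example; they are not the study's data.

```python
# A minimal sketch, under assumed data, of comparing two samples' answer
# distributions and option rankings for a single survey question.

from scipy.stats import chi2_contingency, spearmanr

# Hypothetical counts of respondents choosing each of four answer options,
# for the social media sample and the traditional panel sample.
social = [120, 95, 60, 25]
panel = [115, 100, 58, 27]

# Chi-square test: a high p-value suggests no detectable difference between
# the two samples' answer distributions on this question.
chi2, p_value, dof, expected = chi2_contingency([social, panel])

# Spearman correlation of the option counts: 1.0 means the two samples rank
# the answer options in exactly the same order.
rho, _ = spearmanr(social, panel)

print(f"chi-square p = {p_value:.3f}, rank correlation = {rho:.2f}")
```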
Interestingly, we also found that social media and technology use by these two samples was identical. About half of the respondents from both sample sources considered themselves to be early adopters of technology. Purchasing patterns of electronics and groceries as well as media consumption were also similar. With electronics, this includes where items were purchased as well as which items were purchased. With grocery shopping, the top five staple items bought by the two groups were the same.
The survey also found that the differences in attitudes across a wide variety of topics ranging from volunteering, donations, and organic food consumption to travel and political affiliation were negligible.
As expected, there were some differences in the outputs of the two samples. Social media-sourced respondents are less frequent survey takers: more than half of the traditional panel respondents reported being on panels, versus 26% of the social media respondents. Traditional panel respondents are also more likely to be on multiple panels and to have taken multiple surveys in the past month. Social media sample has better reach among younger generations (millennials, generation Z) and older respondents (55+), and can fill surveys faster than traditional sample providers.
These differences can provide significant value depending on the subject of the survey. For instance, we found that this type of sample sourcing provides significant additional value for brands in consumer electronics, apparel and gaming. Over half of the social media sample shopped for electronics and 72% for clothing/apparel during the past month, compared with 39% of the traditional panellists shopping for electronics and 60% shopping for clothing/apparel. Looking at grocery shopping, the traditional panellists shop more from grocery chains than from mass merchandisers (43% and 29% respectively), while the social media sample shops from these channels more evenly.
We could conclude that this new sampling approach would support key business decisions just as solidly as other types of online research. Despite the differences in respondent recruitment, a social media sample can be used in place of more traditional online-based panels without sacrificing research accuracy or validity.
Jonathan Pirc is co-founder and VP of Product at Lab42
5 Comments
Annie Pettit
9 years ago
It's not really a new recruitment or sampling approach but the theory holds. No matter what method is used for sampling, data quality needs to be monitored at every stage of the process. Every company has unique data quality and incentive practices that come into play at the recruitment stage, the profiling stage, the survey taking stage, and all stages in between. All of these processes put together determine whether the data differs from other sources or is even valid.
michalis michael
9 years ago
Over 80% of the UK population has access to the internet, i.e. can be a member of an access panel, but only 50% have a social media account. Having said that, based on experience I would say that combining the two sources of respondents currently provides a better sample than either one on its own.
Jonathan Pirc
9 years ago
Great comment, Annie. The team at Lab42 takes data quality very seriously. In addition to automated, mechanical data quality checks, our Research Team actually reviews every respondent's answers to ensure that they are of sufficient quality. If something about the answers looks off (especially with open-ended answers), we remove the respondent and replace them in the final sample. We feel that this approach helps to alleviate some of the concerns that come about when we talk about social media sample.
chris robinson
9 years ago
Of course missing here is the more fundamental question of how different panelists of any kind are from the general public. The answer to that is pretty clear: fundamentally different, giving much higher scores in areas of recall and quality of recall. The question then becomes: is cheap and fast any basis for declaring suitability? Of course that fast/low-cost model is a train that one simply has to get on or get out of the way of, but let's not ignore those old milestones of validity and reliability.
Kevin Gray
9 years ago
It's easily forgotten that, historically, a substantial amount of MR has been done using an intercept methodology, mall intercepts in the US, for instance. More than one client has been led to believe that simply weighting by age and sex made these samples "representative."