OPINION | 6 January 2012

The end of online panels: Redux


Are online panels finished? That was the subject of a head-to-head debate we published in November. Last month, Dutch research association MOA addressed the same question. MOA’s Lex Olivier reports.

Rounding out his argument on the future of online sample in Research’s November debate, Pulse Group’s Bob Chua said that the days are numbered for online panels now that researchers are starting to experiment with recruiting respondents via social media. His adversary in the debate, uSamp’s Gregg Lavin, disputed this, instead predicting a future that takes in the best of both worlds: river sampling (that is, recruiting respondents through websites) and panel sampling.

After its own debate last month, the Dutch market research association MOA concluded that the days of traditional online panels are indeed numbered – and, we would add, so are those of straightforward banner recruitment. Panels have plenty of problems to contend with, including an increase in non-response rates, respondent frustration with long and boring questionnaires and the often impolite way willing survey-takers are rejected for failing to meet recruitment criteria. But river sampling in the form of pure banner recruitment will not be the natural substitute.

So we have no choice other than to look for alternative sources of recruitment – and to learn from the mistakes of the past. During the MOA discussion, a panel of experts came to the conclusion that two new approaches must be incorporated into existing best practice.

The first is to adopt the Wisdom of Crowds principle. Author James Surowiecki writes: “If you get a large enough group of people, who are chosen randomly enough, are independent of each other and not subject to pressure to conform – then you will get a better answer than asking any expert his opinion.”
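That intuition is easy to demonstrate. Here is a minimal simulation sketch (mine, not from the MOA discussion; the true value and noise level are arbitrary) in which the average of many independent, noisy estimates lands closer to the truth than the typical individual does:

    import random

    random.seed(42)
    TRUE_VALUE = 100.0   # the quantity the crowd is asked to estimate
    N_PEOPLE = 1000      # size of the "crowd"

    # Each person gives an independent, unbiased but noisy estimate.
    estimates = [random.gauss(TRUE_VALUE, 20.0) for _ in range(N_PEOPLE)]

    crowd_answer = sum(estimates) / len(estimates)
    typical_error = sum(abs(e - TRUE_VALUE) for e in estimates) / len(estimates)

    print(f"crowd average: {crowd_answer:.2f} "
          f"(error {abs(crowd_answer - TRUE_VALUE):.2f})")
    print(f"average individual error: {typical_error:.2f}")

The crowd average typically misses the true value by well under one unit, while the average individual is off by around 16; the independence of the estimates is what makes the errors cancel.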

How would this work in research? If river recruitment draws from a wide selection of websites, and if this is done in combination with online panels and adequate reweighting procedures, we should be able to turn this into a validated research approach.
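As a sketch of what such reweighting might look like in practice, the snippet below applies simple post-stratification to a blended river-and-panel sample so that it matches known population proportions. The respondent records, age groups and target shares are all hypothetical:

    # Hypothetical blended sample: each respondent carries a
    # recruitment source and an age group.
    sample = [
        {"source": "river", "age_group": "18-34"},
        {"source": "river", "age_group": "18-34"},
        {"source": "panel", "age_group": "35-54"},
        {"source": "panel", "age_group": "55+"},
    ]

    # Known population shares per age group (e.g. from census figures).
    population_shares = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

    # Count respondents per age group in the blended sample.
    counts = {}
    for r in sample:
        counts[r["age_group"]] = counts.get(r["age_group"], 0) + 1

    # Post-stratification weight: population share divided by sample share.
    n = len(sample)
    for r in sample:
        r["weight"] = population_shares[r["age_group"]] / (counts[r["age_group"]] / n)

    for r in sample:
        print(r["source"], r["age_group"], round(r["weight"], 2))

Over-represented groups (here the 18-34s recruited via the river) receive weights below one, under-represented groups weights above one, and the weights sum to the sample size.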

A second interesting approach is the introduction of respondent-driven sampling, or random snowball sampling (RSS). RSS produces samples that are independent of the initial subjects from which sampling begins. As a result, it does not matter whether the initial sample is drawn randomly. How this works is best explained by a case study.

Recently the Social and Cultural Planning Office (SCP) of the Netherlands conducted a survey among transgender people. The incidence rate of this group is extremely low, and answers were expected to be severely biased if traditional means of sampling and recruitment were used.

So, using the RSS approach, researchers set out to find “starter seeds”: peer-group leaders who can be expected to have contact with many other people within the group. A random sample is then drawn from this list of starter seeds, and the seeds in turn invite respondents to participate in a temporary market research online community (MROC). This sequence is repeated until the group of participants is big enough to start the survey.
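A rough sketch of that recruitment loop follows; the contact lists and target size are hypothetical, and a real study would add de-duplication, consent handling and formal stopping rules:

    import random

    random.seed(1)

    # Hypothetical peer networks: each person maps to the peers
    # they can invite.
    contacts = {
        "seed_a": ["p1", "p2", "p3"],
        "seed_b": ["p3", "p4"],
        "p1": ["p5", "p6"],
        "p2": ["p7"],
        "p3": ["p8", "p9"],
        "p4": ["p10"],
    }

    TARGET_SIZE = 8  # stop recruiting once the community is big enough

    # Draw a random sample from the list of starter seeds.
    seeds = random.sample(["seed_a", "seed_b"], k=2)
    community = set(seeds)
    current_wave = list(seeds)

    # Repeat the invite sequence, wave by wave, until the group
    # of participants is big enough to start the survey.
    while len(community) < TARGET_SIZE and current_wave:
        next_wave = []
        for person in current_wave:
            for invitee in contacts.get(person, []):
                if invitee not in community:
                    community.add(invitee)
                    next_wave.append(invitee)
        current_wave = next_wave

    print(f"recruited {len(community)} participants: {sorted(community)}")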

The principles of RSS apply to complex social studies as well as to more practical commercial studies, but it goes without saying that the suitability of either a Wisdom of Crowds approach or RSS depends on the goals of the study. If census quality is needed then neither is suitable, but for monitoring projects, concept testing and product optimisation, the principles behind both approaches can be used to get relevant research outcomes.

Lex Olivier is the MOA ombudsman