
OPINION | 6 January 2012

The end of online panels: Redux


Are online panels finished? That was the subject of a head-to-head debate we published in November. Last month, Dutch research association MOA addressed the same question. MOA’s Lex Olivier reports.

Rounding out his argument on the future of online sample in Research’s November debate, Pulse Group’s Bob Chua said that the days are numbered for online panels now that researchers are starting to experiment with recruiting respondents via social media. His adversary in the debate, uSamp’s Gregg Lavin, disputed this, predicting instead a future that takes in the best of both worlds: river sampling (that is, recruiting respondents through websites) and panel sampling.

After its own debate last month, the Dutch market research association MOA concluded that the days of traditional online panels are indeed numbered – and, we would add, so are those of straightforward banner recruitment. Panels have plenty of problems to contend with, including an increase in non-response rates, respondent frustration with long and boring questionnaires and the often impolite way willing survey-takers are rejected for failing to meet recruitment criteria. But river sampling in the form of pure banner recruitment will not be the natural substitute.

So we have no choice other than to look for alternative sources of recruitment – and to learn from the mistakes of the past. During the MOA discussion, a panel of experts came to the conclusion that two new approaches must be incorporated into existing best practice.

The first is to adopt the Wisdom of Crowds principle. Author James Surowiecki writes: “If you get a large enough group of people, who are chosen randomly enough, are independent of each other and not subject to pressure to conform – then you will get a better answer than asking any expert his opinion.”

How would this work in research? If river recruitment comes from a wide selection of websites, and if this is done in combination with an online panel and adequate reweighting procedures, we should be able to turn this into a validated research approach.
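To make the reweighting step concrete, here is a minimal sketch of post-stratification weighting for a blended river-plus-panel sample. The respondents, strata and population targets below are hypothetical illustrations, not data from the article; real practice would use many more strata and a calibration method such as raking.

```python
from collections import Counter

# Hypothetical blended sample: each respondent is tagged with a
# recruitment source and an age stratum.
respondents = [
    {"source": "river", "age": "18-34"},
    {"source": "river", "age": "18-34"},
    {"source": "panel", "age": "35-54"},
    {"source": "panel", "age": "55+"},
]

# Hypothetical population targets (proportions per stratum).
targets = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

def poststratify(sample, targets, key="age"):
    """Weight each respondent so that the weighted stratum shares
    match the population targets (weight = target / observed share)."""
    counts = Counter(r[key] for r in sample)
    n = len(sample)
    return [targets[r[key]] / (counts[r[key]] / n) for r in sample]

weights = poststratify(respondents, targets)
```

By construction the weights sum to the sample size, and over-represented strata (the river-heavy 18-34 group here) are weighted down while under-represented ones are weighted up.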

A second interesting approach is the introduction of respondent-driven sampling or random snowball sampling (RSS). RSS produces samples that are independent of the initial subjects from which sampling begins. As a result it does not matter whether the initial sample is drawn randomly. How this works is best explained by a case study.

Recently the Social and Cultural Planning Office (SCP) of the Netherlands conducted a survey among transgender people. The incidence rate of this group is extremely low, and answers were expected to be severely biased under traditional means of sampling and recruitment.

So using the RSS approach, researchers set out to find “starter seeds”. These seeds are peer group leaders who can be expected to have contact with many other people within a group. A random sample is then drawn from this list of starter seeds and the seeds in turn invite respondents to participate in a temporary market research online community (MROC). The sequence has to be repeated until the group of participants is big enough to start the survey.
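The wave-based recruitment loop described above can be sketched as follows. The contact network, seed names and target community size are hypothetical, invented purely for illustration; in a real study the "contacts" are whoever each participant actually invites.

```python
import random

# Hypothetical contact network: who each person can invite.
contacts = {
    "seed_a": ["p1", "p2"],
    "seed_b": ["p3"],
    "p1": ["p4", "p5"],
    "p2": [],
    "p3": ["p6"],
}

def rss_recruit(seed_list, n_seeds, target_size, rng=random.Random(0)):
    """Draw a random sample of starter seeds, then recruit in waves
    (each wave invites its contacts) until the temporary community
    is large enough to start the survey."""
    wave = rng.sample(seed_list, n_seeds)   # random draw from the seed list
    community = set(wave)
    while len(community) < target_size and wave:
        # Members of the current wave invite contacts not yet recruited.
        wave = [c for p in wave for c in contacts.get(p, [])
                if c not in community]
        community.update(wave)
    return community

group = rss_recruit(["seed_a", "seed_b"], 2, 6)
```

The loop stops either when the target size is reached or when a wave produces no new invitees, mirroring the "repeat until the group is big enough" step in the text.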

The principles of RSS apply both to complex social studies and to more practical commercial studies, but it goes without saying that the suitability of either a Wisdom of Crowds approach or RSS depends on the goals of the study. If census quality is needed then neither is suitable, but for monitoring projects, concept testing and product optimisation, the principles behind both approaches can be used to get relevant research outcomes.

Lex Olivier is MOA ombudsman

6 Comments

7 years ago

Lex has some interesting ideas; I think the crowdsourcing reference is particularly appropriate. Panels by nature have difficulty representing all segments of the population and are prone to overuse, while rivers/social media often provide funny or inconsistent data, perhaps due to lack of commitment to the survey process or inexperience at taking surveys. The intelligent blending of these sources will ultimately give us the required stability and representativity. The industry is working hard to improve sample-source blending practices.


7 years ago

As long as we have people ready to answer surveys, online panels will be key! The questions now: how do we recruit them? How do we improve their experience (survey topics, length, look and feel)? How do we incentivise them?


7 years ago

Interesting topics: river sampling, random snowball sampling and Wisdom of Crowds, and we should indeed remain very critical of online panels. But that does not mean Wisdom of Crowds, random snowball sampling and so on are suddenly the solution, or that the better panels should suddenly be banished. Regarding Wisdom of Crowds, many forget this part of Surowiecki's text: "who are chosen randomly enough, are independent of each other and not subject to pressure to conform". Too often I hear and read the idea that a large number of people is in itself a guarantee of good spread. That only holds once the fraction of the universe approaches 100%: the closer to 100%, the smaller, of course, the theoretical chance of a deviation from the universe. Moreover, the criterion "random enough" is of course very vague and soft. With random snowball sampling it seems to me of great importance (analogous to Surowiecki's "not subject to pressure to conform") to what extent the respondents invited by the "starter seeds" are selected sufficiently at random. Even if the seeds have passed on all their contacts, much can be questioned about the potential spread in the characteristics of those contacts. To what extent are they, in (too) many respects, mainly clones of the "starter seeds"? And how do we verify that? Food for further thought.


7 years ago

Frank Kelly Thanks for your support. Only a minority of people leaving a panel do so because they feel overused. On the contrary, they leave panels because they did not qualify for surveys or because they did not get as many surveys as they expected. Nevertheless, I agree with you that blending is one of the attractive ways forward.


7 years ago

Naim Mejaat At this moment we estimate that 6% of the population is a member of one (or more) online panels. Most of the remaining 94% say they were never approached or are fully unaware of panels. This seems to support your argument. However, if we have not reached this group by now with traditional recruitment methods, future endeavours to reach it at reasonable cost will fail, and we will have to look for creative alternatives. Moreover, a large portion of this unreached group is only prepared to participate in surveys, not in panels.


7 years ago

I feel that the concept of Access Panels as we know it today is gradually disappearing. Just a bunch of companies are out there offering a concept that is becoming more and more expensive to maintain and provides tighter returns. In addition, Access Panel companies offer, on average, an awful experience to their respondents by presenting them with long and boring questionnaires, and they know it. As a result, these elements boost attrition rates and attract fraudsters looking to grab easy rewards. However, revenue is revenue. Despite their preaching about quality panellists, quality processes and quality data, my experience indicates it is just cheap propaganda to justify their positions in a market that has evolved a lot faster than they have. Consumers are now empowered, young and not so young, with plenty of spirit, while Access Panels walk alongside with the help of zimmer frames.
