NEWS | 11 September 2009

Online ‘no less accurate’ than phone, says ARF

Features | North America

US— According to new data from the Advertising Research Foundation, there is “no clear pattern” of telephone research based on random samples providing more accurate results than surveys conducted using online panels.

The ARF’s chief research officer Joel Rubinson revealed the data from the organisation’s Foundations of Quality study in his blog.

Comparing the representation of various demographic groups in US Census data with samples for a mail survey of 1,500 people, a random-digit dial telephone survey of 1,000 people, and 100,000 interviews conducted online using 17 different panel providers, Rubinson said: “There is no clear pattern of RDD providing the more accurate answer vs the average result from internet panel research on a series of benchmarking questions and demographics.”

The biggest failures of the telephone sample were in the age of respondents and the usage of cellphones – a problem he put down to the increasing number of “cell-onlys”.

“Even in the world of political polling,” Rubinson said, “RDD interviewing methods can produce very different results and might not be as accurate as well-designed online research”.

If proper practices are implemented, he claimed, there is no reason online research cannot produce “comparable, consistent and accurate” data.

Rubinson said the ARF will be unveiling a recommended quality assurance process for buyers and sellers of research at an open meeting later this month.


2 Comments

15 years ago

Very interesting, yet totally contradictory to some recent findings funded by more academic research, as well as some of the findings of some of the big-wig end-user clients. Also curious as to why the sample size was so large for online and so small for telephone and mail – clearly the differences in sample sizes would skew any real comparison of the data, even more so than a targetable opt-in panel would normally skew any data set. Read the following for some recent insight into the validity and legitimacy of online panels: http://tinyurl.com/mkmnhj Certainly offers some food for thought – especially since, as I am typing this, the only banner ads flashing on the research-live website are for an online panel company!


15 years ago

Thank you Anonymous for the link to that ABC-Polling article in your comment above! That article is a must-read for anyone dealing with online market research.

The issues I am having with the original post come from statements like: “... While a research supplier can weight data ... the demographic imbalances from RDD are problematic. The more skewed the sample is, the more extreme the weights....”

Talking about skewed samples: we're discussing opt-in online panels. People sign themselves up to click through internet-based questionnaires in exchange for cash or gifts – you might have seen the pop-ups. Online research results are not based on a probability sample of respondents, but on what’s called a convenience sample. They have no calculable error margin, because they live outside the realm of inferential statistics on which scientific surveys are based.

Beyond sampling, there are other potential problems. The majority of online opt-in panel surveys are actually being filled in by a small number of panelists – professionals, if you will. Since participants join for the compensation, one might wonder whether they pay real attention to the questionnaires or just click through them. (“The more surveys you participate in, the more you can earn,” as the panels themselves advertise.)

One potential way for the MR industry to move away from “convenience” sampling is to start building smaller, non-opted-in panels by actively inviting potential panel members in an offline mode and requesting them to sign up: by invitation only, or at least not allowing everyone to join – only one out of five registrations, or one out of two... Anyone have more thoughts?

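The weighting problem raised in the comments above can be made concrete with a small sketch. All demographic figures below are invented for illustration, and the function names are mine, not from the ARF study: the point is only that a post-stratification weight is the population share divided by the sample share, so the further a sample drifts from the population (e.g. an RDD sample missing cell-only young respondents), the more extreme the weights become – and that a margin of error is calculable only for a probability sample, not an opt-in convenience panel.

```python
import math

def post_stratification_weights(sample_shares, population_shares):
    """Weight for each stratum = population share / sample share.
    (Standard post-stratification; strata here are age groups.)"""
    return {g: population_shares[g] / sample_shares[g] for g in sample_shares}

# Hypothetical population: 30% aged 18-34, 40% aged 35-54, 30% aged 55+.
population = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

# A mildly skewed sample needs only mild weights...
mild = post_stratification_weights(
    {"18-34": 0.25, "35-54": 0.40, "55+": 0.35}, population)

# ...while a badly skewed one (few young respondents) needs extreme weights:
# each 18-34 respondent ends up counting six times.
skewed = post_stratification_weights(
    {"18-34": 0.05, "35-54": 0.40, "55+": 0.55}, population)

print(mild)    # weights stay close to 1.0
print(skewed)  # the 18-34 weight is 6.0

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.
    Only meaningful for probability samples; a convenience sample
    has no such calculable figure."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(1000), 3))  # a 1,000-person RDD sample: about +/-3.1 points
```

Note that weighting trades bias for variance: those heavily up-weighted respondents dominate the estimate, which is why extreme weights are treated as a warning sign rather than a cure for a skewed sample.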