
FEATURE
1 October 2009

Is ARF roadmap to online quality the true path?


There’s a familiar ring to the Advertising Research Foundation’s Quality Enhancement Process for online research. But with big names like Coke and Unilever backing the scheme, it might just be the breakthrough the industry needs.

When ARF president Bob Barocci talked of taming the “wild west” that was the online research industry back in 2006, one anticipated that its efforts would come (like all good Westerns) to a violent, noisy, bloody conclusion. But this sheriff’s no gunslinger, more a peacemaker. And he arrives unarmed, save for a few spreadsheets.

In the two years since the ARF set out on this path, a host of companies have brought to market technological solutions to the panel quality problem, chiefly digital fingerprinting. Compared to these offerings, QEP is resolutely low-tech. To quote the ARF, it is “a process with templates, definitions, metrics and declarations” that is intended “to bring structure to the conversations that buyers and sellers are having or need to have about data quality”.

If that sounds familiar, you’re probably thinking of Esomar’s ‘25 Questions to ask of a panel provider’ – first published in 2006 and since upgraded to 26 questions. Or perhaps you recall the Interactive Marketing Research Organisation’s own set of questions. More recent than both those examples was the publication of the Platforms for Data Quality Progress, which contained a set of transparency guidelines requiring the disclosure of information about panel structure and administration policies, and information related to panel composition and sourcing, sampling procedures and respondent recruitment methods.

The above are mentioned not to denigrate the ARF’s efforts, but to demonstrate how – on a number of occasions – basic transparency, conversations between buyer and supplier and the application of ‘old fashioned’ research rigour have been shown to be key to raising quality standards in online methods.

ARF chief research officer Joel Rubinson acknowledges what’s gone before – pointing out that some, if not all, of the above-mentioned documentation formed part of the evidence base for its $1m research-on-research programme, Foundations of Quality.

But what ARF brings to the party, he says, is more structure to the process of information-sharing between buyer and supplier; a structure that takes into account the nature of organisational dynamics so quality issues can be managed in a way that doesn’t detract from the real purpose of research: unearthing insight and consumer understanding.

“No one wants to be spending a huge amount of time on this stuff,” says Rubinson. “Companies are not doing research to scrutinise research; the research is done because there is a marketing need.”

What QEP provides is a set of templates and spreadsheets in which panel companies will be asked to log important information about their panels: where panellists are recruited from, what incentives are used, how panellists are profiled and how they are managed.

The next step in the QEP calls for the development of sampling plans for different types of research programmes, and for each study a report is to be produced setting out the key sample characteristics known to affect consistency and response.

Finally we arrive at the study consistency and engagement report, which aims to capture “key design elements on a per-study basis that drive response quality”, explains the ARF. “Metrics are used to monitor adherence, record exceptions and create a repository of aggregated knowledge to improve future quality.”

Providing a means for panel firms and their clients to ensure sample consistency, and hence consistency in research results, is central to the purpose of QEP. Steve Gittelman, president of research firm Mktg, has spent the past year and a half on his own study of online panel consistency. He says: “I applaud the ARF for its efforts and focus. Our data agrees with theirs and we are excited by the validation they have provided.

“There are myriad drivers of online inconsistency such as mergers, changes in panel tenure and sourcing shifts just to name a few. The ARF templates will provide critical indicators that will alert informed researchers when inconsistency may be an issue.”

It’s too early to say ‘mission complete’ or ‘problem solved’, however. This week’s launch of QEP was really just the start of the process. ARF has secured year-long commitments from such household names as Coca-Cola, Unilever, Microsoft, General Mills, Kraft Foods, Bayer, GM and Capital One, each of whom will be piloting and helping refine what’s on the table.

Securing such support was undoubtedly the ARF’s biggest coup – and Rubinson agrees. “We realised that if we did not have a number of these advertisers ready to put their resources and their people behind this, then [QEP] would just be another point-of-view.”
