FEATURE | 1 July 2009

The IT crowd gathers at the Casro tech conference

Technology

Tim Macer files his report from this year’s optimistic Casro technology conference in New York.

The mood was refreshingly, perhaps surprisingly, confident at Casro’s Annual Technology Conference, held in New York in May. While the downturn and its negative effects on business for research organisations were widely referred to, minds were concentrated on the opportunities a recession presents – particularly to put technology to work a bit smarter. It was also the second year of the conference’s new format, with a competitive call for papers rather than a roster of invited speakers, and the quality of the programme seemed greatly improved as a result.

Christian Super (Ipsos), in his introduction as co-chair of the conference, observed that the 2001 downturn, though centred on the dot-com bust, actually set online research on its present growth trajectory. Hugh Davis (Greenfield Online) then broadened the argument to show how real innovation in the past has often had its roots in times like these. When cash is tight and people are focused on eliminating waste and streamlining their businesses, it pays to review your technology strategy.

Davis provided examples from Greenfield Online of recent process remodelling exercises that had enabled the firm to simplify its workflow and cut costs substantially. “Process mapping will quickly demonstrate the pain points and show where there is no value being added, from the client perspective,” he said, advocating that research firms develop a technology strategy to help them recover from the effects of the recession. He set out seven strategic principles, which included rigorous automation, using vendors to drive down costs and concentrating on fewer technologies.

Cathy Allin (Decision Insight) presented shopper marketing, and shopper marketing research, as “an explosive area of opportunity for retailers and for market researchers” and, as advertising fragments and wanes in importance, “a way for the [research] industry to be relevant to retailers and to consumer packaged goods manufacturers”. Differentiating it from conventional consumer research, Allin explained that there are important differences between consumers and shoppers, and that most purchasing decisions are made not by consumers but by shoppers. Her firm has developed a web-based virtual shopping technology platform which simulates the entire shopping experience, from car park to checkout.

Virtually speaking
Allin said that her clients were finding the simulations to be “highly predictive and highly correlated with actual sales figures” when making changes both to packaging and to in-store positioning and presentation. She referred to a recent project in which a classic consumer survey had indicated a strong preference among participants for a new package design. Yet simulating the package in the actual retail environment revealed that the new design would make no difference to sales. Instead, the virtual shop simulation was able to identify other strategies for boosting sales, while avoiding needless expenditure on a new identity.

The method has also proved popular with respondents. With participants recruited from consumer access panels, the virtual shopper surveys were routinely achieving a 96% completion rate.

A conference highlight was the hilarious but highly informative presentation by Moskowitz Jacobs’ VP, Alex Gofman, on future Webs beyond Web 2.0. His research had actually uncovered future Web versions mapped out as far as Web 42.0, though there is real consensus only over the next two.

Web 3.0 is talked of as the ‘semantic web’, where text, image, audio and video start to converge, revealing the meaning and significance of their content. This makes it easier to discover and use the whole range of data on the internet, overcoming the current limitations of search engines and of incomplete, arbitrarily applied taxonomies. Web 4.0, by contrast, will be the ‘connected web’, where everything from our laptop to our car, our fridge and our domestic central heating system joins the same global network. All of these products could present new research and data acquisition opportunities.
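To make the ‘semantic web’ idea concrete, here is a minimal sketch in Python, assuming the rdflib library; the URIs, property names and the video resource are invented for illustration and are not drawn from the conference.

# Illustrative sketch only: the URIs and properties are invented; rdflib is
# assumed as the toolkit for working with RDF triples.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/terms/")
g = Graph()

video = URIRef("http://example.org/media/focus-group-0423")
g.add((video, RDF.type, EX.VideoRecording))             # what kind of thing it is
g.add((video, EX.topic, Literal("shopper marketing")))  # what it is about
g.add((video, EX.language, Literal("en")))

# Because meaning is attached as data, content can be found by what it is
# about, not just by matching keywords in a file name or page text.
results = g.query("""
    PREFIX ex: <http://example.org/terms/>
    SELECT ?item WHERE { ?item ex:topic "shopper marketing" . }
""")
for row in results:
    print(row.item)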

An important point Gofman made was that each of these waves of the internet does not replace the previous wave in the way people often imagine. He even traced several recognisably Web 2.0 services which, rather inconveniently, did not emerge in the last two or three years but have been around for over ten. By the same token, neither Web 2.0 nor its successors will in any way replace Web 1.0 phenomena.

Web 2.0 and social networks
Focusing on the role of Web 2.0 within research, in the context of the five stages of the social web identified by Forrester, Gofman argued there was still much more mileage to be had from Web 2.0 in providing a shared social experience for consumers. This included predictions that consumers will become active in using the web to combine their purchasing power and even to develop new products.

The potential of social networks for research was also a topic for Simon Chadwick (Peanut Labs), whose company has pioneered using such sites as a sample source. Chadwick aimed to overturn a number of prejudices held against this approach by challenging some of the assumptions made about existing online research – particularly the use of email, which is declining, as a means of inviting respondents to participate in surveys. “Every six months, the average age of the Facebook user increases by one year,” Chadwick explained. “Peanut Labs started using social network sites to reach Generation Y; now we are finding it is a good way to access all generations… Email no longer works as a method of invitation to research with some groups, such as Generation Y, ethnic minorities and young mothers.” He also saw research as a useful source of revenue for social network providers who have so far failed to support their activities with actual revenues.

The technology built into social network sites to allow interoperability of server-side applications makes it relatively easy to sample across a broad range of social networks. Peanut Labs, for example, had recently redesigned its sampling tool so it could work with 250 different online communities. Incentives are often easily handled through micropayments in virtual currency, a feature of most communities. A sketch of how such cross-community sampling might work follows.
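The sketch below, in Python, is purely hypothetical: the adapter interface, community names, quota fields and credit amounts are invented for illustration and do not describe Peanut Labs’ actual tool.

# Hypothetical sketch: one sampling tool fanning the same survey invitation
# out across several communities, each wrapped behind a common adapter.
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Invitation:
    survey_url: str
    credits: int  # incentive paid in the community's own virtual currency


class CommunityAdapter:
    """Wraps one community's server-side API behind a common interface."""

    def __init__(self, name: str):
        self.name = name

    def eligible_members(self, quota: dict) -> Iterable[str]:
        # In practice this would call the community's application API,
        # filtered by the quota cells (age, country and so on).
        return []

    def invite(self, member_id: str, invitation: Invitation) -> None:
        # Deliver the invitation inside the community itself (not by email),
        # crediting the member's virtual-currency balance on completion.
        pass


def distribute(adapters: List[CommunityAdapter], quota: dict,
               invitation: Invitation) -> None:
    """Send the same invitation to eligible members of every community."""
    for adapter in adapters:
        for member in adapter.eligible_members(quota):
            adapter.invite(member, invitation)


distribute(
    adapters=[CommunityAdapter("community-a"), CommunityAdapter("community-b")],
    quota={"age": "18-24", "country": "US"},
    invitation=Invitation(survey_url="https://example.org/s/123", credits=50),
)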

Data quality
Yet he warned that researchers need to reform some of their practices to work within these communities – taking special care to ensure respondents do not consider the approaches to be spam, and not to abuse the relationship, as the wrath of the dissatisfied will be broadcast widely. He had observed that the quality of data tends to diminish rapidly after 17 minutes in this medium, though the use of Flash and other sophisticated graphics can extend engagement by, in effect, reducing the perceived length of the interview.

Qual’s tipping point
Social networks and social media were also providing another unprecedented opportunity in research, according to Jim Longo (Itracks), who said: “We have now reached the tipping point for qualitative research online: we have seen exponential growth in the last six months.” And this was despite the capability having been around since the mid-1990s. He explained: “In the early days, you tended to get short answers. We did not get the emotion and candour which we do today.”

Longo stressed that online qual should not be seen as a replacement for face-to-face, but that it offers two additional research methods: real-time focus groups and asynchronous bulletin-board discussions. He noted that some providers are attempting to use video to allow group members to see faces on screen, which he considered a retrograde development. Text-based online focus groups tended to produce transcripts around a third longer than those from the equivalent face-to-face group, where participants must speak in turn, and it was rare to have a group in which one or two people dominated. He argued that the methods were much more respondent-centric, and participants often continued to contribute even after the group had officially closed.

“The end result that you get with an online focus group is the same, as many studies now show: it’s just that the way it is articulated is different.” The advantages he stressed were lower cost and better coverage, through being able to include participants beyond the reach of a central group facility.

It was a conference of both pragmatism and optimism, in which virtually every speaker not only stressed that research must adapt to changing circumstances, but offered practical advice on how to do it – and, importantly, showed an understanding that this must not come at the expense of the participant. Those close to the action always tend to know who the real boss is.
