FEATURE | December 2008

Technology review of the year


Tim Macer remembers the products and trends that changed the way we worked last year

The rise of SaaS

2008 seems to be the year when ‘software as a service’ or SaaS passed the tipping point. The media has picked up on moves being made by Microsoft and Google into so-called ‘cloud computing’ – software that exists in an apparently weightless state out there on somebody else’s server – so that you no longer need to concern yourself with finding a server to run the application.

Almost all the major MR providers have offered solutions hosted on their servers for years. Confirmit, now well into its eleventh year, has never been other than a cloud solution, and there are many others. It has always made sense for web-based survey software, but it now extends to every other aspect of the MR process too. MR software suppliers are enthusiastic SaaS proponents. Of the last twelve software products we reviewed in Interface, nine were offered in SaaS form, including two in a choice of SaaS or desktop, and just three in desktop-only formats.

The technology has now largely caught up with expectations. True drag-and-drop interfaces, and modern, fast internet connections now mean that using the best hosted software is no longer the faltering, treacle-wading experience it used to be.

Converso, Confirmit, GMI, Nebu, Voxco and many others today offer SaaS multi-modal interviewing for web, CATI, CAPI and in some cases even paper. There are several very complete analysis tools available online today, such as Askia Vista, Marketsight, MTab and Pulsar Web. Even specialist data collection methods are covered, such as The 3rd Degree’s hosted SMS survey platform and a couple of interactive voice response tools. There are online qual tools (Vision Critical, nQual) and online verbatim coding from Ascribe. Dexterity even offers online research project management, and there is something for those still drowning in paper – Data Liberation’s SaaS archiving service lets you scan and deposit images into their datacentre, so what once occupied the basement is now consigned to the cloud.

CAPI has now fully morphed into ‘traditional’ CAPI on laptops and ‘mobile’ CAPI, though perhaps a better distinction would be ‘stand-up’ and ‘sit-down’ CAPI. For a while the stand-up strand was hailed as ‘handheld’ CAPI, with the emphasis on size and portability. Now the emphasis is on communications and mobile telephony. It’s just as well really, as the personal digital assistant, or PDA, on which the old handheld CAPI depended has waxed and waned in its fortunes and will soon be a footnote in the history of personal computing, alongside floppy disc drives and dot-matrix printers.

It was the availability of relevant consumer products that made CAPI just about affordable in fieldwork. However, the action in the consumer market has now shifted from PDAs to the web-enabled feature phones from Nokia, Samsung and Sony Ericsson, or mobile devices like the iPhone and the BlackBerry. Fieldwork forces investing in handheld interviewing solutions need to ensure they are future-proofed with technology that is relatively device-independent, as today’s devices will be obsolete within a few years, and spare parts such as chargers – or replacement devices for those lost through attrition – get harder to find by the month. Web and Java-based platforms may prove more resilient in the long run by being more open and less proprietary.

Just as the web allows servers and databases to be location-independent, it also facilitates location-independent work and collaboration. Now, software providers are adding functionality that specifically caters for collaboration not just between colleagues in the same company, but across the chain of clients and suppliers that exists around market research activities. Tools across the board often come with elaborate permission structures that let you grant or restrict access to different areas of functionality or data. They can be seen in everything from survey platforms to panel management tools, in online analysis packages and report publication portals, so clients or suppliers can interact with your systems safely and in a way that respects respondent and commercial confidentiality.
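None of the vendors mentioned publish their permission models, but the general pattern – a role mapped to a set of allowed actions, checked before anything is shown – can be sketched in a few lines. The role names and actions below are invented purely for illustration:

```python
# Hypothetical role-based permission sketch; the roles and actions are
# invented for illustration and do not mirror any vendor's actual model.
PERMISSIONS = {
    "client":     {"view_reports"},                  # the end client
    "supplier":   {"upload_sample"},                 # a field/sample partner
    "researcher": {"view_reports", "edit_survey", "view_raw_data"},
}

def can(role: str, action: str) -> bool:
    """Return True only if this role has been granted this action."""
    return action in PERMISSIONS.get(role, set())

# A client can see aggregate reports but never respondent-level data --
# which is how respondent confidentiality is preserved.
assert can("client", "view_reports")
assert not can("client", "view_raw_data")
```

The point of centralising the check in one function is that granting or revoking access becomes a data change rather than a code change, which is what makes these permission structures practical to expose to outside clients and suppliers.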

It’s an area that is about to become much more dynamic, mirroring the kind of ‘data mash-ups’ and contributed content that typify Web 2.0. Several MR software companies are now exploring ways to allow suppliers to promote services or contribute content, to create an effective marketplace within their SaaS platforms. For some, this means adding an area to their portal where customers can invite bids and go on to strike the deal, alongside a news feed about requests and offers. Others are building in content management systems so that firms or agencies with a research brief might find 80% of their questions already answered in recent surveys, with the data ready to download – leaving just a slimmed-down list of questions to go into a new survey.

Quality control
At last, panel providers and their technology suppliers have borrowed an idea from the insurance industry and are starting to use IT to trap survey fraudsters by their tell-tale behaviour and then to share that knowledge across panels. In the USA, Peanut Labs launched Optimus, MarketTools brought out TrueSample, and Western Wats now has Bloodhound. In Europe, there is MoWeb and there are others too. All use digital fingerprinting methods to identify the PCs participating in their surveys. These take the information web browsers necessarily provide to any web server about the PC and its setup. Couple this with the location-specific parts of the IP address and you have a very reliable way to give any PC in the world a unique identifier.
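The vendors’ exact fingerprinting methods are patented and unpublished, but the principle described above – hash the attributes every browser volunteers to the server together with the network part of the IP address – can be sketched roughly as follows. The choice of headers and the three-octet network prefix are illustrative assumptions, not any vendor’s recipe:

```python
import hashlib

def device_fingerprint(headers: dict, ip_address: str) -> str:
    """Combine browser-reported attributes with the network part of an
    IPv4 address into one stable identifier (illustrative sketch only)."""
    # Attributes a browser routinely sends to any web server it visits.
    attributes = [
        headers.get("User-Agent", ""),       # browser and OS version
        headers.get("Accept-Language", ""),  # language preferences
        headers.get("Accept-Encoding", ""),  # supported compression
    ]
    # Keep only the location-specific network prefix, not the full IP,
    # so the fingerprint survives minor address churn at one provider.
    network_prefix = ".".join(ip_address.split(".")[:3])
    attributes.append(network_prefix)
    raw = "|".join(attributes)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

fp = device_fingerprint(
    {"User-Agent": "Mozilla/4.0 (Windows NT 5.1)",
     "Accept-Language": "en-GB",
     "Accept-Encoding": "gzip"},
    "192.0.2.44",
)
```

The same PC presenting the same setup from the same network yields the same hash, so a panel company can blacklist the fingerprint rather than the easily-changed panellist identity.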

The flaw with the current approach is that the technology providers are still competing for the best patented fingerprinting method. They will share blacklisting data between panel companies signed up to their service, but there is no sharing of knowledge across services yet. This will only come if there is only one survivor, or if the providers agree a common standard for fingerprinting, so that the knowledge can be usefully pooled.

Death of a survey
While the industry continues to fret that a combination of reluctant respondents, over-demanding clients, unscrupulous direct marketers and bad samples is conspiring to kill off the survey, innovative providers like DatStat are finding ways to liberate research from the penury of having to crank the handle ever faster. DatStat’s Illume end-to-end survey product goes further than most systems in transforming the survey process into a series of independent, re-usable components. Each question is treated as a separate object – it can be viewed in the context of a survey, or not, as you wish.

Componentisation and the use of data objects is a widely accepted design principle in large IT systems today, one that often goes hand in hand with the imaginative use of modern database architecture. All too often, the way databases are deployed in market research software packages is depressingly unimaginative, trapping data from each question within the rigid confines of a single survey.

The DatStat approach lets you uncouple questions and then reassemble, combine or re-use them, and create hybrid reports that bring together data from existing surveys, new surveys and external data sources. Just as the marketplace approach can reduce the burden of pushing out a new survey each time, so too can this further reduce the length of the eventual survey that needs to be fielded – and there is no surer way to get a good response than to make sure your survey is short. Couple this approach with a trading exchange for existing data, and market research could truly enter the information era.
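DatStat’s internal design is not public, but the question-as-object idea is easy to illustrate generically: define a question once, reuse the same object in several surveys, and pool responses by question rather than by survey. Everything named below is invented for the sketch:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Question:
    # A question is an independent, reusable object,
    # not a row locked inside one survey's data file.
    qid: str
    text: str
    options: tuple

@dataclass
class Survey:
    title: str
    questions: list = field(default_factory=list)

# Define questions once...
age = Question("Q_AGE", "What is your age band?", ("18-34", "35-54", "55+"))
brand = Question("Q_BRAND", "Which brand did you buy last?", ("A", "B", "C"))

# ...then reuse the same objects in more than one survey.
wave1 = Survey("Tracker wave 1", [age, brand])
adhoc = Survey("Ad-hoc brand study", [brand])  # no need to re-author Q_BRAND

# Because responses are keyed on the question, not just the survey,
# data from both studies can be pooled into one hybrid report.
responses = {
    ("Tracker wave 1", "Q_BRAND"): ["A", "B", "A"],
    ("Ad-hoc brand study", "Q_BRAND"): ["C", "A"],
}
pooled = [r for (survey, qid), rs in responses.items()
          if qid == "Q_BRAND" for r in rs]
# pooled now holds every Q_BRAND answer across both surveys.
```

The design choice that matters is the stable question identifier: once `Q_BRAND` means the same thing everywhere, a new study only needs to field the questions that have never been asked before.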
