
FEATURE | 20 November 2012

What does Google’s election success mean for polling?


Google was the second most accurate pre-election pollster, according to Nate Silver’s analysis of presidential polls. Experts offer their thoughts on what this means for future election surveys.

Tom Mludzinski, deputy head of politics, Ipsos Mori

The US election really ushered in the era of big data. Pollsters collected more data, and published it more frequently, than ever before. More than 50 pollsters published results during the election and there were six daily tracking polls (Ipsos included).

Modellers and aggregators such as Nate Silver and Mark Blumenthal thrived on the volume of data available, while interested spectators could look at averages across a large number of polls to see a clearer picture, rather than relying on one or two polls that might tell a different story.
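To make the aggregation idea concrete, here is a rough sketch (with invented numbers, not real 2012 polls) of the simplest form of a “poll of polls”: a sample-size-weighted average of each poll’s reported shares. Real models such as Silver’s also adjust for house effects, recency and likely-voter screens, so this is only an illustration of the basic arithmetic.

```python
def poll_average(polls):
    """Return sample-size-weighted average shares for each candidate.

    Each poll is a dict with a sample size 'n' and a 'shares' mapping
    of candidate name to reported percentage.
    """
    total_n = sum(p["n"] for p in polls)
    candidates = polls[0]["shares"].keys()
    return {
        c: sum(p["shares"][c] * p["n"] for p in polls) / total_n
        for c in candidates
    }

# Illustrative, invented figures -- not actual 2012 polling data:
polls = [
    {"n": 1000, "shares": {"Obama": 49.0, "Romney": 48.0}},
    {"n": 500,  "shares": {"Obama": 51.0, "Romney": 46.0}},
    {"n": 1500, "shares": {"Obama": 50.0, "Romney": 47.0}},
]

print(poll_average(polls))
```

The weighting means a single small outlier poll moves the average far less than it would move a one-poll snapshot, which is why aggregates tend to be steadier than any individual survey.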

At this election in the US, some of the more traditional pollsters proved to be out of step, while online polling, including Ipsos, proved to be among the most accurate. Polling is changing as demographics shift and access to landlines, mobile phones and the internet evolves.

Everyone needs to be considering how to proceed in this new world. Google was successful this time around, but they, like everyone else, will need to stay on their toes to keep up.

Emily Hunt, director of insights, Edelman Berland

The big surprise for me was the swing to online polls proving far more accurate than they have been in the past – in particular Google’s polls. I wouldn’t go so far as to say that the telephone poll is dead, but this shift away from telephone polls as the gold standard of accuracy has very interesting implications going forward – and not just for political polling, but also for corporate clients.

As an industry, it seems like the time to embrace a more method-neutral approach, and to capture data wherever and however our respondents are willing to communicate with us. We usually say these sorts of things about digital and mobile research methods, but these days it seems wiser to remember that social media polls are the right choice for one audience, traditional online surveys for another and telephone for yet another. It is how we view the data at the back end and perform the analysis that is key.

Wayne Goodreau, research director, Invoke Solutions

I think this past election has to be a big wake-up call for pollsters tied to traditional means of gathering data. Results were all over the place. Romney himself was so convinced he was going to win, because of how the polls were moving, that he didn’t even write a concession speech.

Many feel a main reason for this is that those tied to automatic dialling (and thus to landlines) were experiencing a Republican slant in their data collection because many voters, especially young voters, live a “cell-only” life.

However, moving to cell phones is, at best, expensive and, at worst, impossible for some (especially in cases of automatic dialling). Google’s survey tool, in my opinion, is perfect for this. When it first came out, it was immediately apparent to me that the company’s surveys would be more useful as a polling application because of its strength at putting quick-and-dirty surveys to a captive audience at relatively low cost. This is also one of the reasons I don’t know if it can move beyond a simple polling application.

What does this mean for political polling plans? I think it represents an opportunity, but I don’t know if we will see a complete shift to this methodology as a replacement for traditional methods anytime soon. Political polling is a very heated environment, and implementing a complete shift to online would be a difficult manoeuvre.

Even in traditional research, you still get the occasional concern over a bias against those without computers. In politics, this protest will be even louder. I think, though, that maybe pollsters could start considering a hybrid model that combines traditional methods of polling with methods such as Google’s surveys to paint a more complete picture.

Simon Chadwick, managing partner, Cambiar

There’s a wonderful irony in the fact that Google Consumer Surveys was the second most accurate “pollster” (out of those doing five or more polls in the final weeks) in the US election, and it is this: it all comes down to sampling.

When GCS was launched onto the market, one of the arguments that Google advanced in favour of its new platform was that it was a much more robust sampling technique than either online panels or phone (with or without random digit dial). Arguably, they have been proved right.

This result also goes to support Google’s assertion that their attention to the respondent, most notably in limiting the number of questions asked and providing something of real value as an incentive (access to the premium site they were trying to get to in the first place), is key to producing a sound sample and accurate results.

The failure of phone and online panels is due to a combination of skewed sample bases with dire response rates. By opening up the sample frame to the entire internet and by making the respondent experience a pleasant one, Google have actually restored research to the standards that were preached and taught decades ago – but which were lost in the avalanche of new technology.

I think we can expect to see a lot of Google lookalikes spring up in the next four years – and maybe a few polling firms outsourcing their polls to Google itself. Expect fewer robocalls (they did not perform well). Maybe we should also expect to see a lot more multi-mode polling as well. Finally, don’t count out mobile – especially if Google itself introduces its mobile platform before the next election.

George Terhanian, North American president and group chief strategy officer, Toluna

Conformity in pre-election polling is not necessarily healthy for the enterprise – 2012 was a splendid year for almost every organisation that tried to forecast the outcome, especially the ones that relied on online methodologies.

What’s interesting is that both online and telephone surveys produced forecasts within a remarkably narrow band (no greater than three percentage points for each candidate). This is because the recent trend in pre-election polling is to produce forecasts that look like all the others, and analysts such as Nate Silver are part of the reason for this.

The “poll of polls” methodology they employ puts pressure on organisations that produce outliers to alter their methodologies so that their forecasts conform to the others. But will the success of Google’s and the other online election forecasts in 2012 serve as the final nail in the coffin of telephone research? Probably not. A great election forecast is necessary evidence of a methodology’s accuracy, but it is not altogether sufficient given that pre-election polling is only a small part of public opinion research and, more generally, market research.

The question to consider here is not whether online research will replace telephone or face-to-face pre-election polling. Rather, it’s whether DIY research will one day obviate the need for research buyers to rely on pollsters, or even market research agencies.

Companies such as Google, Survey Monkey, and Toluna have already placed their bets by offering strong DIY products, but these are still quite primitive compared to more mature, non-DIY approaches. Consequently, the DIY methods tend to produce less accurate information across a variety of topics. However, this may change in the near future if organisations that offer DIY survey systems are able to refine their methodologies to improve accuracy without raising costs or reducing speed.

In the run-up to the next US presidential election, we may see hundreds of final forecasts – not only from mainstays and upstarts, but also from individual consumers who throw their own hats into the ring. The democratisation of market research is sure to continue.
