It is fair to say that 2015 was a difficult year for pollsters. Despite some successes (correctly calling the Scottish National party surge in Scotland and Jeremy Corbyn’s election as leader of the Labour party), they got the one that mattered most wrong. They called the general election as a dead heat, only for David Cameron to be returned as prime minister of a majority Conservative government.

In response, the British Polling Council set up an inquiry to analyse what went wrong and to recommend how things can be improved in future. That inquiry, led by Patrick Sturgis of the University of Southampton, published its initial findings this week ahead of a more detailed report later this spring. The inquiry’s initial conclusions suggest that inadequate sampling (i.e. not reaching the right people in the first place) was to blame for what happened. Other theories were largely ruled out, including ‘late swing’ (voters switching to the Conservatives at the last minute), a failure to deal correctly with turnout, and even the cynical manipulation of survey results to fit in with the crowd, as Dan Hodges has suggested was the case.

So where does the polling industry stand now? Here are a few things that you need to know:

1) Political polling is a very small part of what market research companies do and we may see less of it in future

Political polling rarely makes money. At least, it typically does not make money for those conducting the public polls we see in newspapers or on television. Ben Page of Ipsos MORI was candid about this earlier this week. The political polls you see in the public domain are usually done as quickly and as cheaply as possible for media companies that pay next to nothing for them. Of course, it should go without saying that market research companies are in the business of getting things right however much they are paid. When things go wrong (as they did in May), the first reaction would usually be to invest in finding out what went wrong and fixing the problem. However, with so little obvious return on that investment, we may find that some companies decide not to bother with voting intention polls in future, especially if they conclude that such polls cannot easily be done well.

2) Voting intention polls are hard (and getting harder). There are no easy answers

In this context, it is worth remembering what voting intention polls are and why they are so difficult to do. A voting intention poll seeks to understand the hypothetical future behaviour of voters at an election that, in reality, may or may not be happening any time soon. This is different to simply asking people what they think about David Cameron. Several variables make this problematic. Perhaps the main one is that pollsters need to build a reliable picture of what the voting public looks like amid shifting party loyalties and the increasing reluctance of people to take part in opinion polls.

Indeed, one of the key initial findings of the polling inquiry is that Conservative voters appear less likely to take part in polls than Labour voters. Or, looking at things another way, politically engaged voters on the left are much easier to get hold of than, say, voters over the age of 70, who skew Conservative. Perhaps this led to an overstating of the Liberal Democrat vote share in 2010 and then the Labour vote share in 2015, as this group of ‘engaged leftists’ switched sides. Why this problem has become so pronounced over time, and what to do about it, is likely to be a key question the inquiry seeks to answer in its final report.

3) Polls understating the Tories are nothing new

One of the most striking findings of the inquiry, outlined below in a slide produced by Will Jennings, is that none of this is new. Polls have consistently underestimated the eventual Conservative vote share over time. This shows that the failure of pollsters to accurately reflect the Conservative vote share in 2015 should be seen as an existing problem that is getting worse rather than something unique to last May – although that of course does not fully explain why May was wrong to the extent that it was. (This does incidentally mean that, however badly Labour is doing at the moment, the chances are that its ‘real position’ is actually worse).


4) Different pollsters will have different solutions – and that’s OK

The initial findings of the BPC inquiry suggested that there is ‘no silver bullet’ that can fix what went wrong last May. This is just as well because the chances are that pollsters will disagree on what went wrong and how to fix it. Some think that online surveys are the future, some will stick with telephone, some will adjust how they deal with turnout (as ComRes have already started doing) and others will look at how they deal with the oversampling of the ‘politically engaged’. This is fine. Each pollster is entitled to come up with a different solution and time will tell who is right and who is wrong. In future, it would be better if polling was treated as something of an academic pursuit for ‘the truth’, perhaps funded by universities, rather than an artificial (and largely meaningless) commercial pursuit by companies for the title of ‘gold standard’. But perhaps that is too ambitious.

5) Polling will be taken less seriously from now on

Given what happened last May, politicians who face bad poll numbers have an easy response. I suspect that in the coming months and years we will see many a politician on TV claiming that we should ignore polling that does not suit them because ‘we all know the polls can be wrong’. This is an inevitable consequence of what happened. In reality, for polls to be truly trusted again, the pollsters will need to call a general election right. Ironically, we may find that politics comes to their aid. If the Conservatives continue to dominate British politics, the 2020 general election may simply be about getting the scale of the Conservative victory right, as opposed to calling the winner. The media will likely be more forgiving in that scenario, as long as the pollsters call the fundamental result correctly.

6) Look beyond voting intention figures for the real story

Opinion polls are still useful. Perhaps in future the general ‘vibe’ coming from public opinion should be taken as seriously as voting intention polls that claim to have ‘the answer’. For example, if we had looked more closely at which party the public trusted on the economy, or at whether David Cameron or Ed Miliband was preferred as prime minister, then we might have understood which way 2015 was going better than we did by looking at the voting intention polls. In fact, Mike Smithson of Politicalbetting.com has produced a chart showing that the ratings of party leaders might be the best way to understand who will win a general election. This is unlikely to ever be the whole story (and cannot give us a complete picture of what the parliamentary arithmetic will look like after a general election), but perhaps such ratings should carry more weight than they previously have.

7) There may be trouble ahead

It is fair to say that the last thing the pollsters need after 2015 is to call another high-profile race wrong. However, analysis conducted by ComRes has suggested that there is a very real prospect of that happening at the coming European Union referendum. Before Christmas, ComRes conducted an online and telephone poll simultaneously looking at the EU referendum question and found a strong lead for ‘Remain’ on the telephone but a close contest online.


[Chart: ComRes experiment, EU referendum poll conducted online and by telephone. Source: ComRes]

Other pollsters have shown the same pattern. If it continues, and pollsters vary in the survey mode they use, then someone is going to call the referendum wrong. Perhaps polling will face another inquiry sooner than we think.

8) Looking forward – using data better

So, as we wait for the final report from the BPC inquiry, it is fair to say that pollsters face many challenges. They will need to decide whether measuring voting intention is possible at all, or indeed the best way of taking the nation’s political temperature in future. Perhaps, like Gallup in the US, some will decide to focus on other measures, such as leader satisfaction or economic confidence, rather than voting intention. Those that continue to publish voting intention polls will need to decide how to fix what went wrong in 2015 and have the confidence to pursue their own approach even if others disagree. Finally, perhaps we will all have to learn to use data better. We may need to take a ‘big data’ approach to politics that consults many sources, such as voting intention polls, leader ratings and real election results, to truly understand what is going on. Perhaps our biggest problem is demanding the certainty of ‘the answer’ from voting intention polls that are becoming harder to do and may not even have it.

———————————

Keiran Pedley is a polling and elections expert at GfK. He tweets about polling and politics at @KeiranPedley