Can we really trust political polls to predict the outcome of an election or referendum?

The campaign for Scottish independence took a dramatic turn two weeks before polling day when a poll from YouGov showed the Yes campaign ahead for the first time. The result wiped billions of pounds off the stock market, the pound dipped, and political leaders rushed to Edinburgh to announce sweeping new powers for the Scottish parliament.

How the polls have shifted (via YouGov)

Yet questions continue to be asked about how accurate these polls really are. The spectre of the 1992 general election, when a projected Labour poll lead of seven points became a Conservative victory, still haunts pollsters to this day.

Speaking to The Drum, Anthony Wells, associate director of YouGov, says: “We are broadly confident but it's trickier than it would be for a general election poll." He explains that polling companies work by refining their methodology over a number of elections and learning from their mistakes.

However, as the Scottish referendum is a one-off event with no previous results to use as a comparison, pollsters are “working from first principles” in trying to model the electorate.

The issue of modelling led to a very public dispute earlier this year when the president of YouGov, Peter Kellner, accused competing polling company Survation of having "too many passionate, pro-independence Nats" in its sample. In reply, Survation's director of research, Patrick Briône, said that Kellner's charges were "unprecedented in the industry".

Wells tells The Drum there are two big concerns about the current polling numbers. The first is that surveys generally fail to reach the most “marginalised bits of society”: “People who are not engaged with politics are under-represented in polls. They don’t open the door.”

Usually, he adds, this does not matter in polling terms because this group generally doesn’t vote. But with turnout in the independence referendum expected to be 80 to 90 per cent, people will be voting in a “significantly different way” from those sampled by the pollsters, which could negate all of the carefully crafted predictions.

The second big issue the associate director raised was the “enthusiasm problem”. Yes voters, he says, tended to be vocal and happy to participate in online or telephone surveys. However, there is doubt about the number of “shy noes” who simply refuse to participate in surveys or claim to be undecided. Wells says the fact that all of the pollsters, despite using different methods, had broadly similar results gave him confidence that this was being allowed for in the final predictions.

Asked about the 1992 debacle, Wells tells The Drum that opinion on pollsters’ accuracy depends on predicting the correct result. In 1997 pollsters predicted Labour would win the general election but “overestimated how many votes they would win”; even so, Wells says, there was no real criticism of the polling organisations afterwards: “The polls said Labour landslide and it was a Labour landslide.”

He adds: "In that case no one really noticed." However, in a tight race small differences can lead to negative perceptions: “If we predict Yes will win 49.9 per cent of the vote but they win with 50.1 per cent, the headlines will say ‘idiot pollsters get it wrong again’,” he says.

Given the amount of money financial markets can win or lose based on a poll, we ask Wells whether his company gets calls asking for advance notice of results. “We’ve had lots of people from banks and investment companies calling us to commission private polls,” Wells says, “and we have to tell them we are doing all the polling we can.”

He continues: "It’s a bit like turning up on the day of the World Cup final and asking for a ticket; these things are booked in months in advance.”

Peter Kellner, president of YouGov, recently wrote: “In the final analysis, all pollsters assemble the best sample they can and seek to extrapolate as accurately as possible from the people they reach to the voting population as a whole. It is not, and never will be, an exact science. We all rely on our judgement. And in a close race, quite separate from the factors listed above, we always risk being caught out by random sampling fluctuations: none of us can repeal the laws of probability.”
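Kellner’s point about random sampling fluctuations can be made concrete with the textbook margin-of-error calculation for a simple random sample (an illustrative sketch, not YouGov’s actual weighting methodology; the figures below are hypothetical):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion from a simple random sample.

    p: observed share (e.g. 0.5 for 50 per cent)
    n: sample size
    z: z-score for the confidence level (1.96 for ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical poll of 1,000 respondents showing a 50/50 race:
moe = margin_of_error(0.5, 1000)
print(f"±{moe * 100:.1f} points")  # ±3.1 points
```

With the race this close, a swing of roughly three points in either direction sits entirely within normal sampling noise, which is why no pollster can rule out being “caught out” even with a perfectly drawn sample.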

Much rests on the result of the independence referendum: the future of the UK and the fate of the prime minister could both be called into question. Spare a thought, though, for the pollsters, trying to model an electorate that votes for Labour in Westminster elections and the SNP in Scottish ones, while having to account for 16 and 17-year-olds voting for the first time in British history.

The referendum has also revealed another challenge facing the traditional polling industry. Facebook has published statistics showing that the Yes campaign page on Facebook has more than 258,000 likes while the No campaign’s has just over 182,000. Twitter has also been full of self-created polls asking people to retweet for yes or favourite for no.

This may be the first campaign in which social media has more influence than traditional media. It also suggests that the future of opinion sampling may rely more on what people are saying online than on the carefully crafted work of the pollsters.
