Whenever a poll goes up that shows bad news for someone you get the same sort of comments on social media. As I write this piece in May 2017 comments like these generally come from Jeremy Corbyn supporters, but that’s just the political weather at this moment in time. When the polls show Labour ahead you get almost exactly the same comments from Conservative supporters; when UKIP are doing badly you get them from UKIP supporters; when the Lib Dems are trailing you get them from Lib Dem supporters.

There are elements of opinion polling that are counter-intuitive and many of these myths will sound perfectly convincing to people who aren’t versed in how polls work. This post isn’t aimed at the hardcore conspiracists who are beyond persuasion – if you are truly convinced that polls are all a malevolent plot of some sort there is nothing I’ll be able to do to convince you. Neither is it really aimed at those who already know such arguments are nonsense. It is aimed at those people who don’t really want to believe what the polls are saying, who see lots of people on social media offering comforting-sounding reasons why you can ignore them, but who are thinking, “Is that really true, or is it rather too convenient an excuse for waving away an uncomfortable truth…”

1) They only asked 1000 people out of 40 million. That’s not enough

This question has been around for as long as polling has. George Gallup, the trailblazer of modern polling, used to answer it by saying that it wasn’t necessary to eat a whole bowl of soup to know whether or not it was too salty: provided it had been stirred, a single spoonful was enough. The mention of stirring wasn’t just Gallup being poetic; it’s vital. Taking a single spoonful from the top of a bowl of soup might not work (that could be the spot where someone just salted it), but stirring the soup means that spoonful is representative of the whole bowl.

What matters is not the size of the sample, but its representativeness. You could have a huge sample that was completely meaningless. Imagine, for example, that you did a poll of 1,000,000 over-65s. It would indeed be a huge sample, but it would be very skewed towards the Tories and Brexit. What makes a poll meaningful or not is whether it is representative of the country. Does it have the correct proportions of men and women? Old and young? Middle class and working class? Graduates and non-graduates? If the sample reflects British society as a whole in all these ways, then it should reflect it in terms of political opinion too. A poll of 1,000 people is quite enough to get a representative sample.
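The "spoonful" intuition can be made concrete with the standard margin-of-error formula for a proportion. A minimal sketch in Python (this assumes pure simple random sampling, which real polls only approximate with their quotas and weights):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    measured from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Note the population size (~40 million adults) doesn't appear at all:
# once the sample is representative, only the sample size n matters.
print(round(margin_of_error(1000) * 100, 1))  # ~3.1 points
print(round(margin_of_error(2000) * 100, 1))  # ~2.2 points
```

Note the diminishing returns: doubling the sample from 1,000 to 2,000 only shaves about a point off the margin of error, which is part of why samples of around 1,000 are so common.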

The classic example of this came at the very birth of modern polling. In the 1936 US Presidential election a magazine called the Literary Digest did a survey of over two million people, drawn from magazine subscribers, telephone directories and so forth. It showed that Alf Landon would win the Presidential election. The then newcomer George Gallup did a far, far smaller poll, properly sampled by state, age, gender and so on, and correctly showed a landslide for Roosevelt. A poll with a sample skewed towards people wealthy enough to have phones and magazines in Depression-era America was worthless, despite having two million respondents.

2) Who do they ask? I’ve never been asked to take part in a poll!

Sometimes this is worked up to “…and neither has anyone I’ve met”, which does raise the question of whether the first thing these people do upon being introduced to a new person is to ask if MORI have ever rung them. That aside, it’s a reasonable question. If you’ve never been polled and the polls seem to disagree with your experience, where do all these answers come from?

The simple answer is that pollsters obtain their samples either by dialling randomly generated telephone numbers or by contacting people who are members of internet panels. Back when polls were mostly conducted by telephone the reason you had never been polled was simple maths – there were about forty million adults in Britain, there were about fifty or so polls of voting intention of a thousand people conducted each year. Therefore in any given year you had about a 0.1% chance of being invited to take part in a poll.
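That "simple maths" is just invitations divided by population, sketched here using the rough figures from the text:

```python
adults = 40_000_000      # rough number of British adults
polls_per_year = 50      # rough number of voting-intention polls per year
sample_size = 1_000      # typical poll sample

invitations = polls_per_year * sample_size
chance = invitations / adults    # chance of being invited in a given year
print(f"{chance:.3%}")           # 0.125% - about one chance in 800
```

At that rate the expected wait between invitations is on the order of 800 years, so never having been phone-polled is entirely unremarkable.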

These days most opinion polls are conducted using online panels, but even if you are a member of a panel, your chances of being invited to a political poll are still relatively low. Most panels have tens of thousands of people (or for the better known companies, hundreds of thousands of people) and 95% of surveys are about commercial stuff like brands, pensions, grocery shopping and so on. You could still be waiting some time to be invited to a political one.

3) But nobody I know is voting for X!

We tend to know and socialise with people who are quite like ourselves. Our social circles will tend to be people who live in the same sort of area as us, probably people who have a similar sort of social status, a similar age. You probably have a fair amount in common with your friends or they wouldn’t be your friends. Hence people we know are more likely than the average person to agree with us (and even when they don’t, they won’t necessarily tell us; not everyone relishes a political argument). On social media it’s even worse – a large number of studies have shown that we tend to follow more people we agree with, producing self-reinforcing bubbles of opinion.

During the Labour leadership contest almost every one of my friends who is a member of the Labour party was voting for Liz Kendall. Yet the reality was that they were all from a tiny minority of 4.5% – it’s just that the Labour party members I knew all happened to be Blairite professionals working in politics in central London. Luckily I had proper polling data that was genuinely reflective of the whole of the Labour party, so I knew that Jeremy Corbyn was in fact in the lead.

In contrast to the typical friendship group, opinion poll samples are designed to reflect the whole population and avoid those traps. They will have the correct balance of people from all across the country, the correct age range, the correct balance of social class and past vote, and so on. Perhaps there are people out there who, by some freak coincidence, have a circle of acquaintances who form a perfectly representative sample of the whole British public, but I doubt there are very many.

4) Pollsters deliberately don’t ask Labour/Conservative supporters

In so far as there is any rationale behind this belief, it’s normally based upon the perception that someone said they were going to vote for X in a poll and wasn’t asked again. As we’ve seen above, it’s far more likely that the reason for this is simply that it’s relatively rare to be invited to a political poll at all. If you’ve been asked once, the chances are you’re not going to be asked again soon, whatever answers you gave.

Under the British Polling Council rules polling companies are required to publish the details of their samples – who was interviewed, what the sample was weighted by and so on. These days almost every company uses some form of political sampling or weighting to ensure that their samples are politically representative. Hence, in reality, pollsters deliberately include a specific proportion of 2015 Labour supporters in their polls – generally the proportion who did actually vote Labour in 2015. Pollsters are required to report these figures in their tables, or to provide them on request. So if you look at last weekend’s Opinium poll you’ll find that 31% of respondents who voted in 2015 voted Labour, the proportion that actually did; if you look at the ICM poll you’ll find that 31% of the people who voted at the last election say they voted Labour; and so on with every other company.
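Past-vote weighting of this kind is simple in principle. Here is a minimal illustrative sketch with made-up raw numbers (using the 31% Labour share mentioned above; real pollsters weight by several variables simultaneously, not past vote alone):

```python
# Hypothetical raw sample of 1,000 respondents, broken down by 2015 vote
raw_sample = {"Con": 330, "Lab": 380, "Other": 290}

# Target shares of the 2015 vote (illustrative; Labour's 31% is from the text)
target_shares = {"Con": 0.38, "Lab": 0.31, "Other": 0.31}

total = sum(raw_sample.values())
# Each respondent's weight = (true share of their group) / (share in raw sample)
weights = {party: target_shares[party] / (raw_sample[party] / total)
           for party in raw_sample}

# 2015 Labour voters are over-represented in this raw sample,
# so each one counts for less than a full response
for party, weight in sorted(weights.items()):
    print(f"{party}: weight {weight:.2f}")
```

Here each raw 2015 Labour voter counts for about 0.82 of a response, bringing the group back down to its true 31% share; the under-represented groups are weighted up correspondingly.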

5) Pollsters are biased, and fix their figures

Again, this is an accusation as old as polling itself – if you don’t like the message, say the person making it is biased. It’s made easier by the fact that a lot of people working in political polling do have a background in politics, so if you want someone to build a conspiracy theory upon, you don’t need to look far. Over the years I think we’ve been accused of being biased towards and against every party at one time or another – when Labour were usually ahead in the polls YouGov used to be accused of bias because Peter Kellner was President; when the Conservatives were ahead, different people accused us of being biased because Stephen Shakespeare was the CEO. The reality is, of course, that polling companies are made up of lots of people with diverse political views (which is, in fact, a great benefit when writing questions – you can get the opinion of colleagues with views different from your own when making sure things are fair and balanced).

The idea that polling companies would bias their results to a particular party doesn’t really chime with the economics of the business or the self-interest of companies and those who run them. Because political polls are by far the most visible output of a market research company there is a common misapprehension that it brings in lots of money. It does not. It brings in very little money and is often done as a loss-leader by companies in order to advertise their wares to the commercial companies that spend serious money doing research on brand perceptions, buying decisions and other consumer surveys. Voting intention polls are one of the very few measures of opinion that get checked against reality – it is done almost entirely as a way of the company (a) getting their name known and (b) demonstrating that their samples can accurately measure public opinion and predict behaviour. Getting elections wrong, however, risks a huge financial cost to market research companies through reputational damage and, therefore, huge financial cost to those running them. It would be downright perverse to deliberately get those polls wrong.

6) Polls always get it wrong

If the idea that polling companies would ruin themselves by deliberately getting things wrong is absurd, the idea that polls can get it wrong through poor design is sadly true: polls obviously can get it wrong. Famously they did so at the 2015 general election. Some polls also got Brexit wrong, though the picture is more mixed than some seem to think (most of the campaign polls on Brexit actually showed Leave ahead). Polls tend to get it right a lot more often than not, though – even in recent years, when their record is supposed to have been so bad, the polls were broadly accurate on the London mayoral election, the Scottish Parliamentary election, the Welsh Assembly election and both of the Labour party leadership elections.

Nevertheless, it is obviously true to say that polls can be wrong. So what’s the likelihood that this election will be one of those occasions? Following the errors of the 2015 general election the British Polling Council and Market Research Society set up an independent inquiry into the polling error and what caused it, under the leadership of Professor Pat Sturgis of the University of Southampton. The full report is here, and if you want to understand how polling works and what can go wrong, it is worth putting aside some time to read. The extremely short version, however, is that the polls in 2015 weren’t getting samples that were representative enough of the general public – the people who agreed to take part in a phone poll or join an internet panel weren’t quite normal: they were too interested in politics, too engaged, too likely to vote.

Since then polling companies have made changes to try and address that problem. Different companies have taken different approaches. The most significant though are a mix of adding new controls on samples by education and interest in politics and changes to turnout models. We obviously won’t know until the election has finished whether these have worked or not.

So in that context, how does one judge current polls? Well, there are two things worth noting. The first is that while polls have sometimes been wrong in the past, their error has not been evenly distributed. They have not been just as likely to underestimate Labour as they have been to overestimate Labour: polling error has almost always overstated Labour support. If the polls don’t get it right, then all previous experience suggests it will be because they have shown Labour support as too *high*. Theoretically polls could have tried too hard to correct the problems of 2015 and be overstating Conservative support, but given the scale of the error in 2015 and the fact that some companies have made fairly modest adjustments, that seems unlikely to be the case across the board.

The second is the degree of error. When polls are wrong, they are only so wrong. Even in the elections where the polls got it most wrong, like 1992 and 2015, their errors were nowhere near the size of the Conservative party’s current lead.

The short version is: yes, the polls could be wrong, but even the very worst polls have not been wrong enough to cancel out the size of lead the Tories currently have, and when the polls have been that wrong, it has always been by putting Labour too high.

So, if you aren’t the sort to go in for conspiracy theories, what comfort can I offer if the polls aren’t currently showing the results you’d like them to? Well, first the polls are only ever a snapshot of current opinion. They do not predict what will happen next week or next month, so there is usually plenty of time for them to change. Secondly, for political parties polls generally contain the seeds of their salvation, dismissing them misses the chance to find out why people aren’t voting for you, what you need to change in order to win. And finally, if all else fails, remember that public opinion and polls will eventually change, they always do. Exactly twenty years ago the polls were showing an utterly dominant Labour party almost annihilating a moribund Tory party – the pendulum will likely swing given enough time, the wheel will turn, another party will be on the up, and you’ll see Conservative party supporters on social media trying to dismiss their awful polling figures using exactly the same myths.