Public opinion surveys are everywhere.

Mainlined by news and politics junkies, polls are dismissed and disparaged even as they're obsessed over and dissected. They are the wallpaper of our election cycles and, arguably, the thoroughfares that help guide public discourse.

As Canadians prepare to cast a ballot in the 2015 federal election, competing voter-preference polls will pepper the airwaves, each claiming to be a representative snapshot of Canadian public opinion.

Here's a look at how those competing surveys come to life.

Whose opinion is it, anyway?

There are two main avenues pollsters use to reach Canadians: the telephone and the Internet. Which route you take, and how you drive it, will influence who responds to your survey.

Live telephone calls, in which an interviewer walks respondents through a series of questions, remain the "gold standard" of polling, says Paul Adams, a former political reporter and pollster who now teaches journalism at Carleton University in Ottawa.

Live calls, however, are also time-consuming and expensive, making them increasingly rare — at least when it comes to the "horse race" numbers in the news.

The rise of caller ID, call screening and cellphones has also tarnished the old gold standard.

It used to be pollsters could expect about a 20 per cent response rate to live calls, says Adams, meaning 20 poll respondents for every 100 calls made. Surveys done for the federal government, which are posted publicly with complete methodology, show response rates for some large national telephone polls as low as eight per cent.

Another factor: who's picking up the phone? A single mother with three kids is less likely to have time for a phone survey than a retiree. What about the unemployed versus an executive working 60-hour weeks?

Nonetheless, randomized live phone polls "produce strikingly accurate results — even when response rates for those surveys are as low as 10 or 20 per cent," says Jon Krosnick, a U.S. expert on polling and director of Stanford University's political psychology research group.

The other method of telephone survey is known as IVR — Interactive Voice Response.

A recorded, automated call is more likely to get a hang-up than a live caller, and once it reaches a respondent it can ask only a handful of questions before wearing out the person's patience. But for a fraction of the price per call, pollsters using IVR can make tens of thousands of calls and build response samples significantly larger than live surveys.

Land lines or mobile?

Whether it uses live calls or IVR, a polling company that isn't dialling cellphone numbers as well as land lines is missing out on a fully representative sample. Increasing numbers of Canadians, especially younger people, have only mobile phones.

Pollsters use an algorithm to randomly dial both land lines and cellphones — "There are actually no telephone lists that are required," says Nik Nanos of Nanos Research — although cellphone users, who pay for air time, are usually less responsive.
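
Random digit dialling, the no-list approach Nanos describes, can be sketched in a few lines. This is an illustrative simplification, not any firm's actual system; the prefixes shown are hypothetical placeholders:

```python
import random

def random_digit_dial(prefixes, n):
    """Generate n ten-digit numbers from known area-code and exchange
    prefixes, so unlisted and cellphone numbers can be reached with no list."""
    numbers = []
    for _ in range(n):
        prefix = random.choice(prefixes)   # e.g. "613-555" (hypothetical)
        line = random.randint(0, 9999)     # last four digits chosen at random
        numbers.append(f"{prefix}-{line:04d}")
    return numbers

sample = random_digit_dial(["613-555", "416-555"], 5)
```

Because the last digits are generated rather than pulled from a directory, every working line in a prefix block, listed or not, has an equal chance of being dialled.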

The proportion of cell respondents included in any telephone survey is up to the polling company; Nanos Research usually includes 25 per cent.

Casting a wider Net?

The other main route to survey respondents is the Internet.

Access to high-speed broadband is limited in many rural or remote regions of the country, and there remains an income, education and generational divide in who is web-savvy, says Adams, which suggests web-based polls have sample limitations.

There are also two very different ways of getting respondents to complete an online survey: self-selected panels and random recruitment.

Most online pollsters create large pools or panels of potential respondents, using an "opt-in" method such as offering entry in a prize lottery to those who sign up.

Online news readers last week might have seen a banner ad asking, "Will you vote Trudeau next time?" The ad went on to say polling firm Angus Reid wants to know, and asked respondents to sign up for a survey.

Pollsters may pad their numbers by buying lists of email addresses — from retailers, for instance. From a panel that might include more than 100,000 potential respondents, the polling company then selects a much smaller subset of respondents for any given survey.

By opting in, panellists may provide pollsters with a detailed profile (such as age, gender, income, employment, locale, and even banking and shopping preferences) that can then be used to build samples for particular surveys. This is a great tool for doing targeted market research but is problematic when trying to survey public opinion at large.

"Most of the companies doing Internet surveys claim that they can do sleight of hand with statistics to fix the fact that they don't actually have random samples. And they can't," says Krosnick. "There's actually no way to do it."

Because of the self-selected nature of the panel itself, surveys based on opt-in pools are not supposed to provide margins of error, according to the Marketing Research and Intelligence Association, which represents the industry in Canada.
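
The margin of error the industry association is talking about comes from a standard formula that assumes a simple random sample. A minimal sketch, which is why the figure is meaningless when respondents select themselves:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error at 95 per cent confidence for a simple random
    sample of size n; the formula has no meaning for opt-in panels."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1000)  # about 0.031, i.e. plus or minus 3.1 points
```

The formula depends only on sample size and random selection; nothing in it can correct for a pool of people who chose to be there.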

Randomly recruited panels, by contrast, may use a variety of methods to get a proper probability sample of respondents to online surveys, including randomized IVR phone calls as the initial point of contact.

Truly randomized online surveys are considered comparable in quality to live telephone polls, says Krosnick. In fact, studies have shown that people answering questions online are more likely to pause, reflect and respond more accurately than on the telephone.

Raw data, now what?

All pollsters "weight" their sample data in an effort to make it reflective of the public at large. The simplest example is male-female balance. Canada's population is 50.4 per cent female, according to census data. A national poll with more men than women would have to be weighted accordingly.

Pollsters routinely weight for age (younger people tend to be undersampled), gender and region, but the more variables in the weighting mix, the murkier the poll. Online surveys tend to over-represent younger, more urban, wealthier and more educated people, which also requires weighting.
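
The gender example above boils down to simple ratios: each group's share of the population divided by its share of the sample. A rough sketch, using made-up sample counts alongside the census figure cited earlier:

```python
def post_stratify(sample_counts, population_shares):
    """Compute one weight per group: population share divided by sample share."""
    total = sum(sample_counts.values())
    return {g: population_shares[g] / (sample_counts[g] / total)
            for g in sample_counts}

# A hypothetical 1,000-person sample with 600 men and 400 women,
# weighted to census shares of 49.6 and 50.4 per cent:
w = post_stratify({"male": 600, "female": 400},
                  {"male": 0.496, "female": 0.504})
# each man counts for about 0.83 of a response, each woman about 1.26
```

Multiply every response by its group's weight and the weighted sample matches the census split, though the further a weight drifts from 1.0, the more a handful of respondents end up speaking for an entire demographic.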

"The target for any researcher is not to weight, or to do the minimal amount of weighting," says Nanos.

An increasingly troublesome issue for political polls is relating broad public opinion to the much narrower electorate.

With anywhere from 40 to 60 per cent of eligible voters failing to cast a ballot in any given election, pollsters must attempt to divine whose party preferences actually matter on election day.
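
In practice that divining often means a "likely voter" screen: respondents are asked how likely they are to vote, and only those above some cutoff are counted. A hypothetical sketch; the 0-10 scale and the cutoff of 8 are illustrative assumptions, not any pollster's published model:

```python
def likely_voters(respondents, threshold=8):
    """Keep respondents who rate their chance of voting at or above
    the threshold on a self-reported 0-10 scale (a hypothetical screen)."""
    return [r for r in respondents if r["vote_likelihood"] >= threshold]

panel = [{"party": "A", "vote_likelihood": 9},
         {"party": "B", "vote_likelihood": 3},
         {"party": "B", "vote_likelihood": 10}]
screened = likely_voters(panel)  # the unlikely voter is dropped
```

Where the cutoff sits is a judgment call, and because people routinely overstate their intention to vote, it is one of the places competing polls can diverge from the same raw data.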

Who pays the piper?

Most media polls these days, especially before the election campaign starts, are done for free by polling companies as a promotional or marketing exercise. The days of wealthy media companies paying for political polling are almost gone — which may, in part, explain the shift toward lower-cost methodologies and less reliable results.

"A lot of it is like asking the dog to fetch the stick you're going to beat it with," pollster Frank Graves of Ekos Research says of offering freebie political surveys.

Largely overlooked, says Carleton's Adams, is that the paying clients of polling companies tend to be corporations and government, which raises the potential for "massive" conflicts of interest.

Accurate political polling can embellish a pollster's reputation "but that's not their only interest," says Adams. "They have other interests and they have more direct commercial interests."

Just as news consumers want to know if an academic's chair was endowed by a big corporate interest, or whether a journalist was paid handsomely for industry appearances, he says, so pollsters should be revealing their client base.

So do I trust political polls or not?

Pollsters will howl, but research scientists without a dog in the hunt are adamant that only truly randomized surveys predictably and repeatedly measure public opinion accurately.

Most polls don't do that, but go ahead and sample them — and feel free to pour on a few grains of salt.