People like me are always digging into polling data to see which methods have worked well in the past at forecasting election results. The problem is that a number of polling methods and ideas that worked well in prior years are now in opposition to each other. Who will be right?

Here's a list of the top five polling match-ups of 2012.

1). Traditional polling averages v skewed polling averages

I learned a powerful lesson eight years ago: polling averages work. Since 2000, the averages have correctly predicted all but five state presidential contests. Over the last few years, they have projected every Senate winner, save a few. The state polling averages say Obama is going to win.

There is a crowd, however, that believes the polls contain too many Democrats. They look at the polling data and see the same, or an even higher, percentage of respondents in states like Ohio self-identifying as Democrats as the polls found in 2008 – a year of record-high enthusiasm for Democrats.

My personal opinion is that the polling averages are likely correct. I witnessed Democrats making the "skewed" argument in 2004, when polls showed "too many Republicans". The averages won, and George W Bush served another term. We'll see, though, whether I end up eating my words.

2). State v national polling

I've spoken about this one ad nauseam. The national polls indicate that Mitt Romney and Barack Obama are in a dead heat. The state polls have Obama ahead in the key swing states of Ohio and Wisconsin and, therefore, ahead in the electoral college. You might argue that Romney could run up the score in states where there are fewer state polls, but as Sean Trende pointed out, the math doesn't work.

In the past, when state and national polling split, as in 1996 and 2000, it was state polling that won the battle. This year, however, the national polls tend to be conducted by pollsters with longer track records. My guess is that the state polling is going to be right again, although I don't feel particularly confident on this one.

3). Likely v registered voters

President Obama would probably have this election sewn up if all registered voters cast a ballot. The issue is that not all registered voters will actually do so. The task pollsters face is to try to narrow their polling samples to only those who will vote. They use different methods in hopes of modeling the actual electorate as closely as possible.

The results have been mixed. Steve Singiser has compared likely and registered voter samples in state polls and found likely voter polls no more accurate. Nationally, likely voter samples were no better than registered voter ones at predicting the outcome in 2000. They did better in 2004 and again in the 2010 midterms.

The disparity we're seeing between state and national polling may simply be that the likely voter screens are, for whatever reason, too tight nationally, while just right in the state surveys. Either way, I'll lean towards the likely voter screens being closer to the truth – at least in the state data.

4). Live v automated polls

I've heard some version of this argument for years: you can't trust automated polling (or "robo-polls") because they are cheap and anyone – including a dog – can answer them. Live interviewer polls, on the other hand, are done by trained professionals – plus the campaigns use them, so they must be correct.

I've been skeptical of these hypotheses for as long as I can remember. I look at the track record and see that both are equally accurate, despite automated polls having lower response rates (and the dog issue). Perhaps it's because people are more honest with a machine than with a person, or because the lower response rate acts as a de facto likely voter screen.

In recent years, a new wrinkle has emerged: cellphones. Automated polls can't call cellphones. The cellphone population tends to be overwhelmingly Democratic compared to the landline population. Some automated pollsters, like Rasmussen and SurveyUSA, try to supplement their landline samples with online panels or other electronic methods, but I'm not sure this works too well.

Automated polls overall (save Public Policy Polling and SurveyUSA) have leaned toward Romney compared to live-interview polls. If they're wrong, it will just confirm the conventional wisdom. If they're right, we'll all need to have a conversation.

5). Public Policy Polling v Rasmussen

Democrats and Republicans each have a polling agency they cling to.

Democrats have Public Policy Polling (PPP) – an openly Democratic-affiliated pollster. PPP did well in both 2008 and 2010 and has shown a knack for accurately surveying difficult local contests. They have, however, been accused of bias because of the groups that sponsor some of their polls, as well as the questions in some of their surveys.

Republicans have Rasmussen Reports: officially, it's nonpartisan, but its owner goes on Republican cruises (among other signs he's not exactly non-partisan). After a disastrous 2000, Rasmussen came back and was among the most accurate pollsters in 2004 and 2006. Its 2008 data leaned a little to the right, and its 2010 midterm polls were far too Republican.

This year, PPP and Rasmussen have vastly different state data. PPP has Obama ahead in Colorado, Florida, New Hampshire, Ohio and Virginia. Rasmussen has Romney leading in all those states. They can't both be right.

A PPP victory would demonstrate that you can have an openly partisan affiliation and still be trusted to deliver accurate results. It might also signal that automated surveys can be just as good as live interview ones, provided they have good weighting and callbacks. A Rasmussen win would send a lot of insider DC types into a tailspin, wondering what the heck happened.

Conclusion

Pundits and pollsters have been sticking their necks out. Some are going to be right and some wrong. All I know is that there's a lot on the line come election day – and not just for the candidates.