OPINION

: : : : : : : : : :

While visiting another blog (hey, we do that sometimes; how do you think we find such great stories?), I came across a comment from another poster who made a logically false claim about political polling. He said that “If you get good representation among the respondents (regarding, age, gender, race, education, home etc) you can get accurate results with less than 2000 respondents regardless of how large the real population is.” (The site was Mudflats. If you are interested in Alaskan politics – and who isn’t going to be these next two months? – this is the go-to guy. But don’t worry, we’ll keep you up-to-date at The Zoo with any developments in the Sarah Palin Show.) Well, those who know me can confirm that I do not believe political polls are a valid application of statistics and, at the end, I asked any statisticians out there to explain to me why I was wrong. Surprisingly (to me), a few posts later a statistician told me that I was correct not to believe polls. I shall post a response of his, with a link to his explanation (warning, it gets pretty heady in there) at the end of my comments.

Every poll that you hear about comes with a “Margin of Error”. While I admit that I do not know a lot about statistics, I know that there is a formula they use based on the number of people sampled, without any regard for the size of the population sampled, to determine this Margin of Error. Now, as I understand it, the Margin of Error is only half the story: it comes paired with a confidence level, usually 95%, that almost never gets reported on the air. So a 3% Margin of Error does not mean that Obama’s numbers or McCain’s numbers are guaranteed to be within three points of the truth. It means that about 19 times out of 20, a sample drawn this way would land within three points of the real figure, and about 1 time in 20 it would miss by more than that, a complete aberration. And there would be no real way to know which poll is the bad one unless the results were wildly different from the ones released the day before. (TV Announcer: “In a stunning development, Republican Presidential nominee John McCain’s support among blacks in America rose to 82%, giving him a 37-point lead over Democratic Presidential nominee Barack Obama.”) They just don’t report those poll results. They sit on them and do another round of polls to see if those results look like the ones from two days before. Then they say, “Whew, it was that 1-out-of-20 chance that we would get a bad sampling. That must be the third time in three months that this has happened. Who’d have guessed it?” Hmmm, with a poll coming out nearly every day, you would expect a bad sample every few weeks. Funny how quietly those ones disappear.
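That formula, by the way, is short enough to sketch in a few lines of Python. This is a simplified version (the function name is mine, and it assumes the usual 95% confidence level and a worst-case 50/50 split), and you can see for yourself that the size of the country never enters into it:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from n respondents.

    Notice what is NOT in this formula: the size of the population.
    Only the sample size n matters, as long as the population is much
    larger than the sample.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A poll of about 1,000 people gives roughly a 3-point margin,
# whether you are polling Alaska or the whole country.
print(round(100 * margin_of_error(1000), 1))  # about 3.1 points
```

Cutting that margin in half requires quadrupling the sample, which is why nearly every national poll stops somewhere around 1,000 respondents.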

Statistical analysis of a “population” assumes some commonality among that population so that any 900 or so samples could “represent” the entire population. But this is false thinking when applied to political polling. You cannot say that the American people are all “more or less” the same, and that you could ask questions of any 900 people and get a good idea of what the entire country is thinking, because you could easily be getting 900 different opinions to start with. This kind of analysis might work when analyzing probabilities of events (such as car accidents or heart attacks), because those things either did or did not happen. But you cannot properly apply this rationale to analyzing how people think, because those thoughts do not necessarily stay static. They could change before the pollster finishes collecting the data. And for now, we won’t even talk about the fact that they are basing the published results on a probability that a certain thing has happened. (As I said, I don’t know this stuff, but I heard from someone who does.)
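To make that “opinions drift while the pollster is still dialing” point concrete, here is a small, purely hypothetical simulation in Python. The daily support numbers are invented for illustration; the point is that a poll fielded over several days publishes an average of a moving target, not a snapshot of the day it is released:

```python
import random

random.seed(42)

# Hypothetical illustration: a candidate's true support erodes while
# the pollster is still in the field, so the published number is an
# average over several days of a moving target.
true_support = [0.50, 0.49, 0.47, 0.44]    # assumed daily drift
responses = []
for day_support in true_support:
    for _ in range(225):                    # 4 days x 225 calls = 900 people
        responses.append(random.random() < day_support)

poll_estimate = sum(responses) / len(responses)
print(f"published poll: {poll_estimate:.1%}, "
      f"support on release day: {true_support[-1]:.1%}")
```

The published figure sits somewhere near the multi-day average, several points above where the race actually stands when the anchors read it on the air.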

Taking a poll of what is clearly an uninformed electorate is no way to determine public policy, or even what the public thinks. The only way polls could be considered valid is if it were known that everyone in the country knew all the facts before announcing their opinion/decision. Do you really think that the average American is “informed” on the issues? Take a look at the people you know in your neighborhood, your local stores, even your place of work. Now, you’re here reading this blog, so you must have some interest in what is going on in the world. But do all of your friends, neighbors, and co-workers stay as informed as you? And yet, if you’ve never been called by a pollster (more on that later), there’s a chance that someone else you know might have been. Do you think he’s the right guy to be asking which direction this country should be going? I know I wouldn’t trust some of the people I know. (Don’t worry, it isn’t any of you. 🙂 )

For these reasons, I believe that political polls are nothing but a waste of time and money. Our tax money should never be used to fund such polls, as desired results can be all but engineered through the mere framing of the questions. The easiest, and most relevant, examples would be “Are you ‘Pro-Life’? Or are you ‘Anti-Choice’?” “Are you ‘Pro-Death’? Or are you ‘Pro-Choice’?” How would you respond to a pollster who referred to those who share your beliefs with the “offensive” term when asking you a question? If you call yourself “Pro-Choice”, and the pollster asks you if you are “Pro-Life or Pro-Death?”, how are you going to answer the rest of his questions, knowing he’s “one of them”? I think it’s a fair question, and it works the other way around, too.

Of course, there’s a nice little cottage industry for polling, and organizations like Gallup and Zogby (among others, and I’m just picking two of the more famous ones, simply as examples of the kind of groups I’m talking about) have staked their reputations on claiming their polls are useful. But it is easy to demonstrate that pollsters can’t be getting an accurate assessment of political feelings in this country, because they do not call cell phones. And young people often do not have land lines the way the grown-ups do, so they probably never get called by pollsters. How are their feelings measured in the poll’s public results? It makes no sense that pollsters could ignore the amazing increase in voter registration among younger voters. Yes, they did that four years ago and a lot of new younger voters didn’t vote. But this time, there’s a guy who’s closer to their own ages, with kids younger than them (kid sisters), so there’s a connection that may not be quantifiable. Without a valid way to be certain your polling sample matches the population you need to have polled, your results have no meaning in the real world.

If political polling were exposed for the fraudulent representation of public opinion that it logically must be, the TV talking heads wouldn’t know what to think or say. Much of their political analysis and reporting is based on what pollsters tell them people think. For that reason, they will never give them up, nor will they teach people why they shouldn’t waste their time answering political polls. A thought occurred to me the other day. I was watching MSNBC (which should surprise no one), and they talked about the results of “an NBC/Wall Street Journal” poll. And I thought to myself: Rupert Murdoch owns the Wall Street Journal. He also owns Fox News Channel (also known as FNC, which stands for a lot of more accurate descriptions than “news channel”), which has Bill O’Reilly regularly blasting NBC (sometimes for things their parent company, GE, did in the weapons industry, as if NBC had a say in any of that) for being “ultra-liberal”. (For the record, besides being unsure what “ultra-liberal” means, if it means what I hear people like him say it means, I see no actual evidence of it in my viewing.) So, if NBC is “ultra-liberal”, and the Wall Street Journal is “ultra-conservative”, then what kind of questions do they ask people? Do they use words like “pro-choice” and “anti-choice” or “pro-life” and “pro-death”? It makes a difference. (I realize that the specific questions they ask may not have anything to do with abortion; that was just an example, what some folks would call “a fer instance”.) My point is simply this: How would the questions be framed in a poll conducted by two groups with (allegedly) ideologically opposing views?

How would the media talking heads know what the American people think about an issue? Well, I’m no journalist, nor am I a police detective, and I’m not even a private dick (though some say I’m a public one), but I would start with asking each Member of Congress about the issues on which their constituents are writing them. Sure, there are 435 Representatives (when it is full), and 100 Senators (when that body is full, and when has it been lately?), but that’s no reason not to pick up the phone and ask them. I’m not talking about how the pollsters should collect their data, I’m talking about how the media that commissioned those polls can learn what the public is concerned about without using the polling organizations, whose methodology for collecting data cannot possibly prove accurate. I’m not saying that you can base percentages on how many letters a Member of Congress gets, because some are form letters, inspired by activist organizations (both left and right), and some may be disproportionately from one town or district faced with a problem, where everyone was encouraged to write their Congressman for help. But at least you can learn whether the wars in Iraq are still important to voters, or whether concerns about jobs and the economy have overtaken them. (Why is the price of gas so high? Answer: Phil Gramm, John McCain’s “former” economic advisor, and the one touted as a possible Secretary of the Treasury in a McCain Badministration. If that isn’t enough to keep you up at night worrying about how your family will get through the next four years, I’m not sure what will. And that was not a typo.)

Pollster John Zogby once told Jon Stewart that to get those 900 people that they sample (and this was the specific number Jon used in his question), they have to call about 7,000 to 8,000 people. This means they are only polling people willing to talk to pollsters (barely one in eight of those they call), not Americans as a whole, which is another flaw in their thinking. I find no logical reason to believe that a random sampling of 900 people willing to talk to pollsters can possibly tell you with any accuracy what a nation of 300,000,000 is thinking. It is absurd to pretend that it can. Whatever world these results apply to (and being a sci-fi fan, I know that there exists an alternate reality in which these results are 100% correct), it ain’t this one.
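That one-in-eight response rate is not just a piece of trivia; it can bias the result no matter how many people you call. Here is a hypothetical sketch in Python (the answer rates and the 50/50 split are invented for illustration) showing what happens when willingness to talk to pollsters correlates with candidate preference:

```python
import random

random.seed(0)

# Hypothetical sketch: suppose supporters of candidate A answer a
# pollster's call 8% of the time, but supporters of candidate B answer
# 14% of the time. True support is 50/50, yet the people actually
# reached lean B, and calling more people does not fix it.
def run_poll(calls):
    reached = []
    for _ in range(calls):
        prefers_a = random.random() < 0.50           # true split: 50/50
        answer_rate = 0.08 if prefers_a else 0.14    # assumed willingness
        if random.random() < answer_rate:
            reached.append(prefers_a)
    return sum(reached) / len(reached)

# About 8,000 calls yields roughly 900 respondents, as Zogby described.
print(f"measured support for A: {run_poll(8000):.1%}")
```

In this toy setup the race is dead even, yet the poll lands somewhere in the mid-30s for candidate A, and dialing 80,000 numbers instead of 8,000 would only make the wrong answer more precise.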

As promised, here is the comment that confirmed my suspicions about why those political polls that tell us the race is neck-and-neck between Senators Obama and McCain are a bunch of bullshit. (And remember how they kept trying to convince us that Hillary and Barack were “too-close-to-call”, and then later, they all but admitted that they were pretty sure a long time beforehand how it would come down?) Thank you, Bob Gladd, for your help. I encourage people to visit the link, even if you don’t speak Statistics.

I am a trained statistician, though not in polling, more in science, financial risk modeling, and health care. Here’s a link to one of my snarky short essays on probability distribution problems.

I’m sure that Bob Gladd would appreciate the interest in his work.