Monday, October 6, 2008

Are We Winning?

You take a poll by calling people on the phone, and asking them whom they plan to vote for, A or B. While you’re at it, ask them which party they associate with, R, D, or I. When you’ve got enough answers, go check the historical record to see how the area you’re polling usually turns out. It’s a pretty good bet that folks will turn out in a similar manner this year.


Now pull the right number of responses from each of the three stacks (you did sort the responses by party, right?) and treat them like ballots. This should give you a pretty good approximation. In most states, the turnout for one party is seldom more than 5% higher than the turnout for the other, discounting the occasional rout.
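The sort-and-reweight procedure above can be sketched in a few lines of Python. Rather than physically pulling responses from three stacks, this version gives each response a weight so that its party's share of the "ballots" matches the assumed turnout split. All the numbers here (party shares, candidate preferences) are invented for illustration, not real poll data:

```python
# Sketch of party-weighted polling: each response is weighted so that
# its party's total share of the tally matches the turnout model.
from collections import Counter

def weighted_result(responses, turnout_model):
    """responses: list of (party, candidate); turnout_model: party -> share."""
    by_party = Counter(party for party, _ in responses)
    tally = Counter()
    for party, candidate in responses:
        # Spread the party's target turnout share evenly over
        # that party's respondents.
        tally[candidate] += turnout_model[party] / by_party[party]
    total = sum(tally.values())
    return {c: round(100 * v / total, 1) for c, v in tally.items()}

# Hypothetical sample: Ds break heavily for A, Rs for B, Is split evenly.
sample = ([("D", "A")] * 45 + [("D", "B")] * 5 +
          [("R", "A")] * 5 + [("R", "B")] * 45 +
          [("I", "A")] * 25 + [("I", "B")] * 25)

# Assumed turnout: D 34%, R 33%, I 33% (made up for the example).
print(weighted_result(sample, {"D": 0.34, "R": 0.33, "I": 0.33}))
# → {'A': 50.4, 'B': 49.6}
```

The whole game is in that `turnout_model` argument: the raw responses never change, but the weights decide the headline number.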

Classical Values observes:

Voter models are the essence of political polls. You take a sample of a few hundred or a few thousand people and predict how that sample can reflect 10s-100s of millions of people. If you are off by even a small fraction in your assumptions the bottom line could be off by 5, 10 or 20% (despite an MoE claim of a few points).

We have a perfect example of this in two Colorado Polls out recently. The first poll was commented on by our Reader MerlinOS2:

PPP just released a poll in Colorado which puts Obama up +7

Now the issue here is that the party split was

Dem 40
Rep 36
Ind 24

However, August voter registration numbers, per the spreadsheet available from the Secretary of State, show the registration breakdown is

Dem 30.6
Rep 34.8
Ind 34.5

Details on the poll in question can be found here. Just this week American Research Group (ARG) also released a poll for Colorado (which, strangely, is not used in the RCP poll of polls). Its voter model was Dem 32%, Rep 35% and Ind 33%, very close to the ACTUAL voter registration levels noted by MerlinOS2. The result: McCain 48%, Obama 45% - a McCain lead of +3%. (Note: this poll also shows McCain tied with women)

These polls were taken at basically the same time in the same state. But we can see how the voter model can really change the bottom line (a 10% difference).
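You can see the mechanism directly by running one fixed set of responses through both of the party splits quoted above. The response data below is invented (it is not either poll's actual sample), so the exact margins are illustrative only; the point is that nothing changes except the turnout model, yet the headline margin moves by several points:

```python
# Same raw responses, two different turnout models: the PPP split and
# the registration split quoted above. Response data is made up.
from collections import Counter

def weighted_result(responses, model):
    by_party = Counter(party for party, _ in responses)
    tally = Counter()
    for party, cand in responses:
        tally[cand] += model[party] / by_party[party]
    total = sum(tally.values())
    return {c: 100 * v / total for c, v in tally.items()}

# Hypothetical sample of 300: 100 respondents per party.
sample = ([("Dem", "Obama")] * 88 + [("Dem", "McCain")] * 12 +
          [("Rep", "Obama")] * 12 + [("Rep", "McCain")] * 88 +
          [("Ind", "Obama")] * 55 + [("Ind", "McCain")] * 45)

ppp_model = {"Dem": 0.40, "Rep": 0.36, "Ind": 0.24}    # PPP's split
reg_model = {"Dem": 0.306, "Rep": 0.348, "Ind": 0.345}  # registration split

for name, model in [("PPP split", ppp_model), ("Registration", reg_model)]:
    r = weighted_result(sample, model)
    print(name, "margin:", round(r["Obama"] - r["McCain"], 1))
```

With these made-up responses, the PPP-style split yields roughly a five-point Obama lead while the registration-based split yields a near tie: same phone calls, different answer.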

If the results you get differ from what your client had in mind, just send a bunch of Rs to Siberia, and have a bunch of Ds immigrate from wherever they normally hang out. If the Is are breaking the wrong way, just jigger the Ds upward, or the Rs down, until you aren't bothered by them so much.

Bottom line: when a poll comes out, treat it like any other campaign ad unless you personally want to look at their methodology and maybe see if you can correct the work they did. Any poll that does not publish its methodology should be ignored.

