Do you think polls are: a) valuable; b) useless; c) none of the above
Posted at 3:13 PM on November 9, 2006 by Bill Wareham
The Star Tribune had an interesting article this morning analyzing the accuracy of polls that seemed to show Mike Hatch with a slight to moderate edge over Tim Pawlenty in the days leading up to the election.
Of course, the only poll that matters proved the others "wrong." Though as the Strib correctly points out, the pre-election polls generally showed the race within the margin of sampling error, in effect a toss-up, and even the unanimity with which they showed Hatch ahead couldn't erase the sampling error factor.
Which isn't to say we shouldn't cast a critical eye on the polls. I tend to think they're interesting insofar as they provide a slightly blurry snapshot of the campaign. And I think the public has become much more educated about accounting for the margin of error.
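For readers curious what "margin of sampling error" actually translates to, here's a minimal sketch of the standard formula for a simple random sample at 95 percent confidence. The sample sizes below are illustrative, not the actual sizes of the polls discussed here:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of sampling error for a simple random sample.

    n: sample size
    p: observed proportion (0.5 is the worst case, and the usual convention)
    z: critical value (1.96 corresponds to 95% confidence)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical statewide poll of 625 respondents:
print(round(margin_of_error(625) * 100, 1))   # about 3.9 percentage points

# Quadrupling the sample only halves the error:
print(round(margin_of_error(2500) * 100, 1))  # about 2.0 percentage points
```

The takeaway is the same one the Strib made: when two candidates are a point or two apart and the margin of error is around four points, the poll is telling you "toss-up," not "Hatch leads."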
But with the proliferation of polls out there now, I wondered how much we added to the mix with the polls we commissioned in partnership with the St. Paul Pioneer Press. On balance, I think they were valuable bits of reporting, but it's something I will continue to evaluate as we head into the next election cycle.
Then there are the exit polls. We purchased state data from Edison/Mitofsky, the research firm that does the national exit polling on Election Day. That data is supposed to help us in two ways: 1) provide us some predictive information about the outcome for use after the polls close, but before all the returns are in, and 2) provide some demographic and analytical data that helps us understand why the electorate did what it did.
On Tuesday, the first two waves of data, sent around 4 p.m. and 8:30 p.m., showed Hatch with a significant lead. It wasn't until the final data set, which we received around 11 p.m., that the numbers showed Pawlenty and Hatch locked in a virtual tie. But of course by then we knew that from the returns we were seeing, so the quick predictive power of the poll was pretty minimal. On the other hand, the analytical data proved helpful. Among other things, it showed the depth of feeling on Iraq and less emphasis on social issues than we measured two years ago.
By the way, I should explain why I'm being vague about the actual numbers from the poll. It's because we only paid for rights to use them in our on-air reporting. Online rights would've more than doubled the costs and we made the decision that it wasn't worth it for data with a somewhat limited shelf life.
Anyway, if you have any thoughts about the value of polls, let me know.