by David Safier
If you read today's Political Notebook in the Star, you might remember it began with the headline:
The last few sentences give more information about the poll, saying it was done by and for conservatives:
The poll was commissioned by the conservative American Action Forum and conducted by Ayers and McHenry Associates Inc. last week.
The company has done several polls for Republican politicians and organizations.
The Capitol Report's Yellow Sheet offers a careful analysis of the statewide poll the Giffords/Kelly results are drawn from, and that analysis casts doubt on the poll's accuracy.
First, it gives some information on the firm that commissioned the poll, American Action Forum, as well as the pollsters, Ayres McHenry.
American Action Forum is led by Douglas Holtz-Eakin, who was McCain’s chief economic policy advisor during his 2008 presidential bid. The group’s CEO is former Minnesota Sen. Norm Coleman and its board of directors includes Jeb Bush and Tom Ridge.
Similarly, Ayres McHenry is commonly viewed as a Republican polling firm. Its clients include the RNC, NRCC, NRSC, RGA and a host of Republican politicians from the southeast.
That doesn't mean the poll isn't accurate, but let me put it this way. If I were talking to a surgeon who needed one more gall bladder surgery to finish paying off his yacht, I might get a second opinion before going under the knife.
The pollsters' script asking about the candidates appears unbiased, according to the Yellow Sheet, and those questions are placed near the beginning. But later questions indicate this was constructed as a push poll.
At one point, the respondents were asked for the biggest reason they would not vote for the Dem incumbent “that you do not hear discussed often by the media.” Later, they were asked which statement is more true: The Dem’s time in Congress provides the needed experience and power to give the district a strong voice in D.C., or the time in Congress means the Dem “cares more about staying in power” than listening to constituents.

The poll also gauges voters’ thoughts on whether a list of characteristics best describes the Republican or Dem, though the characteristics tested are slanted to favor the Republican: new ideas to change the country for the better, fight to keep taxes low, “do the right thing to fix” the health care reform law and work to “get government spending under control.”

The polling verges on push-polling, the anonymous pollster said, when it asks respondents to assess “some criticisms that some people might make” against the Dem and how much less likely the criticism would be to make them vote for the incumbent. For instance, voters in all three districts were told the incumbent “voted for ObamaCare.” Voters in CD5 and CD8 were told the incumbent voted for bailouts, “which could cost taxpayers $1 trillion,” and voted for card check. Similarly, voters in CD1 and CD8 were told their Dem incumbent opposes S1070.
Let's take the next logical step in deciding whether the poll numbers are meant to be accurate and objective. If a polling company is commissioned to do a push poll for a Republican group, you might suspect it also skewed the sample to over-represent Republicans so their candidates look stronger than they actually are.
Here's a fairly subtle point one observer made about a way the poll could have been slanted. The pollsters contacted only 400 people instead of the usual 600, which widens the margin of error from roughly plus-or-minus 4 points to about plus-or-minus 5. To compensate for the small sample, the pollster "weighted" the results. Weighting is a standard polling procedure, but in the hands of someone with an agenda, it would be very easy to "weight" the results toward Republicans.
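To see why weighting matters, here is a minimal sketch of the mechanics, using made-up numbers (none of these figures come from the actual poll). Each party group's responses are re-scaled to a target party mix; pick a mix that overstates Republicans, and the Republican candidate's headline number goes up:

```python
# Hypothetical illustration of party weighting in a 400-person poll.
# The raw counts and support shares below are invented for the example.

raw = {  # party: (respondent count, share supporting the GOP candidate)
    "GOP": (120, 0.85),
    "Dem": (160, 0.10),
    "Other": (120, 0.45),
}

def weighted_gop_support(raw, target_mix):
    """Re-weight each party group's responses to a target party mix."""
    return sum(target_mix[party] * share for party, (count, share) in raw.items())

# Weighting to actual registration figures (also illustrative)...
registration = {"GOP": 0.33, "Dem": 0.38, "Other": 0.29}
# ...versus a mix that over-represents Republicans.
skewed = {"GOP": 0.38, "Dem": 0.30, "Other": 0.32}

print(round(weighted_gop_support(raw, registration), 3))  # about 0.449
print(round(weighted_gop_support(raw, skewed), 3))        # about 0.497
```

Same raw interviews, nearly a five-point swing in the headline number, just from the choice of target mix.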
As an example of poor sampling, let's look at the party breakdown in CD1. It looks like the pollsters rejiggered the numbers to overstate the Republicans' advantage by as much as 11 points:
For example, in CD1, 35 percent said they were Republicans and 29 percent said they were Dems, while voter registration figures from last month show the actual breakdown is 33.2 percent GOP and 38.2 percent Dem.
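The arithmetic behind that 11-point figure, using the CD1 numbers quoted above:

```python
# CD1 party shares, in percent: the poll's sample vs. last month's registration
poll_gop, poll_dem = 35.0, 29.0
reg_gop, reg_dem = 33.2, 38.2

poll_margin = poll_gop - poll_dem          # GOP +6 in the poll's sample
reg_margin = reg_gop - reg_dem             # GOP -5 in actual registration
swing = round(poll_margin - reg_margin, 1) # 11-point swing toward the GOP
print(swing)  # 11.0
```

In other words, the sample shows a district that leans Republican by six points, while the registration rolls show one that leans Democratic by five.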
When you look at the poll alongside the Yellow Sheet analysis, you have to consider the source and take the numbers with a few grains of salt. The poll may be accurate. But it's at least as likely the pollsters came up with the results their employers wanted, which were then fed to the press.