by David Safier
As you know if you've been reading this blog, I've had several bones to pick with the Goldwater Institute's studies comparing public and private school students in their knowledge of civics, tolerance of others and feelings about the schools they attend. (My criticism of the Civics study is here, and the criticism of the other two is combined in this post.) But here's one objection that never occurred to me until two faithful readers emailed me material that puts the polls themselves into question. (Hat tip to todd and to Eli Blake.)
It's possible the polling company, Strategic Vision LLC, simply made up the numbers in the surveys it gave to Matthew Ladner at G.I. which formed the basis of the three studies.
Nate Silver at FiveThirtyEight believes Strategic Vision LLC concocts fraudulent surveys on a regular basis. In case you haven't heard of Silver: if you're looking for the smartest guy in the room, more specifically, in any room where statistics are the topic, Ladner and I would be hanging around in dark corners looking confused while Silver stood in the spotlight with other stat heads gathered around, listening to what he had to say.
In one of his many posts on Strategic Vision (I list all I could find at the end of the post), Silver looks at a survey Strategic Vision conducted in Oklahoma which asked the same 10 civics questions G.I. asked students in Arizona. It got equally dismal results, indicating that Oklahoma students, like those in Arizona, know very little about civics. Ladner was involved in setting up the Oklahoma survey and wrote about the results on the Oklahoma Council of Public Affairs website.
Funny thing, though. Oklahoma state Rep. Ed Cannaday asked the same 10 questions of high school students using the same basic methodology, and his students appeared to be about 3 times more knowledgeable than the ones Strategic Vision surveyed. Cannaday, by the way, is a former teacher and school principal, so he knows a bit about education.
[Cannaday] arranged to have all the seniors in the 10 secondary schools in his district take the Strategic Vision/OCPA survey. Cannaday tried to replicate the Strategic Vision survey to the greatest extent possible. The same exact questions were used, and as in the case of the original survey, the answers were open-ended rather than multiple choice. The survey was administered to a total of 325 seniors, including special education students.
Ladner has responded to the allegations that Strategic Vision made up the numbers, saying he will look into them. "If I got snookered," he wrote, "I’ll own up to it, but the jury is still out."
As it happens, about 3 weeks ago, I asked Ladner if I could see copies of the actual Arizona survey results, and he was kind enough to fax me the two surveys used for all three Arizona studies — one survey of public school students and another survey of private school students.
I'll do some of my own analysis of questionable aspects of the surveys after the jump. And at the end of this post, you'll find a long list of links from a number of sites concerning Strategic Vision's questionable integrity.
I'll leave it to Silver to do the subtle statistical analysis, which, I must admit, goes way over my head. But here's some of my own brute-strength analysis of the students' answers to the survey questions.
On the survey, the students were asked 10 civics questions such as "What are the two parts of the U.S. Congress?" and "Who was the first President?" The questions are open-ended, not multiple choice, so the students had no prompts as to the right answers. They either gave their best answer or said "I don't know."
One question asks, "What are the two major parties in the United States?" Before I give you the wrong answers the surveys received, or said they received, I want you to think of plausible wrong answers to the question.
Did you come up with "Communist and Democrat" or "Communist and Republican" as the two major parties in the U.S.? Neither did I. But the survey says 8% of private school students — 108 students — gave one of those two answers. The only other wrong answer on the private school survey was "Green and Socialist": 1% (7 students). No answers said any of the 4 possible combinations of "Green/Socialist and Republican/Democrat," although those seem to be more likely wrong answers. And there is no "other" in the responses, indicating the students either made one of those 3 wrong choices (9%), got the right answer (60%) or didn't know (31%). To me, that doesn't smell quite right.
The public school students' responses aren't broken down by the wrong answers. For some reason, the survey simply lists 12% as "Other." Why break down one set of answers and not the other? Again, it smells a bit fishy.
Interestingly, the Oklahoma survey cited by Nate Silver listed 10% as responding "Republican and Communist" to that question. Odd that Communists keep cropping up in high school students' answers. Commies aren't being used for political target practice much these days. Socialists and Nazis, yes, Communists, no.
"What are the two parts of the U.S. Congress?" another question asked. Both private and public school students replied either "The Senate and the House" or "Don't know." What, no "Supreme Court" or "President" as an answer from any of the 2700 students surveyed? It doesn't seem likely to me.
"How many Justices are on the Supreme Court?" Every private school student guessed 6, 8, 10, 15 or "Don't know." Why not 7, 9 or 11-14? And you mean to tell me, not one private school student knew the right answer is 9? Really? Public school students had a more compact cluster of numbers — 7, 8, 9, 10, 12 or "Don't know" — meaning a few of them (10%) came up with 9. With 1,350 students in each group, I would expect to see every reasonable number represented in both samples, and at least a few of the kids in private school would have known there are 9 Supreme Court Justices.
For "We elect a U.S. Senator for how many years?" not a single student in either group answered with an odd number. Both groups said 2, 4, 6, 8, 10 or "Don't know." That implies that all students understand Senators serve for an even number of years. It's possible, I guess, but if the students are as clueless as the survey indicates, someone certainly would have guessed 5 or 7, you would think.
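The intuition behind the last two paragraphs can be checked with simple binomial arithmetic. As a rough sketch (the guess rates here are my assumptions, not figures from the survey): if even 1% of 1,350 clueless students would blurt out a particular plausible answer, the odds of that answer showing up zero times in the results are minuscule.

```python
# Rough plausibility check: if each of n students independently gives a
# particular answer with probability p, how likely is it that the answer
# never shows up at all? The guess rates below are illustrative
# assumptions, not figures from the survey.

def prob_zero(n: int, p: float) -> float:
    """Probability that an answer given at rate p appears 0 times in n responses."""
    return (1 - p) ** n

n = 1350  # students per group, per the survey
for p in (0.005, 0.01, 0.02):
    print(f"guess rate {p:.1%}: chance of zero occurrences = {prob_zero(n, p):.2e}")
```

Even at a half-percent guess rate, the chance of a clean zero is about one in a thousand; at 1% it is roughly one in a million. So "not one student out of 1,350 guessed 9 Justices" is exactly the kind of result that sets off Silver's fraud alarms.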
Open-ended questions with a limited number of answers like this stretch the plausibility of this survey to the breaking point for me. It would be interesting to see if Cannaday's test results had a similarly limited number of wrong answers.
Number crunchers among you will probably delight in watching Silver's mind at work as he deconstructs the data on these and other Strategic Vision LLC surveys in some of the links below. And for anyone interested, there's plenty of information you can understand without knowing statistics.
Links about Strategic Vision LLC and the Civics survey.
Did the Dog Eat the Data? (Pollster.com, 4/23/09)
Mourning Constitutional (Oklahoma Council of Public Affairs, 9/1/09)
AAPOR Raises Objections to Actions by Atlanta-Based Strategic Vision LLC (American Association for Public Opinion Research, 9/23/09)
AAPOR "Raises Objections" to Strategic Vision's Non-Disclosure (Pollster.com, 9/23/09)
A Few More Questions for a Sketchy Pollster (FiveThirtyEight, 9/24/09)
Strategic Vision Polls Exhibit Unusual Patterns, Possibly Indicating Fraud (FiveThirtyEight, 9/25/09)
My thoughts on the Strategic Vision Controversy (Public Policy Polling, 9/25/09)
Embattled pollster defends methods (Politico 9/25/09)
Strategic Vision: Time for Transparency (Pollster.com, 9/26/09)
Are Oklahoma Students Really This Dumb? Or Is Strategic Vision Really This Stupid? (FiveThirtyEight, 9/26/09)
Comparison Study: Unusual Patterns in Strategic Vision Polling Data Remain Unexplained (FiveThirtyEight, 9/26/09)
Seen Through Sharper Statistical Lens, Anomalies in Strategic Vision Polling Remain (FiveThirtyEight, 10/5/09) [New addition]
Real Oklahoma Students Ace Citizenship Exam; Strategic Vision Survey Was Likely Fabricated (FiveThirtyEight, 11/8/09)
Civic Knowledge Polling Controversy (Jay P. Greene's Blog, 11/12/09)
Eli, your points are well taken, but the one about some students saying they would go to college and others saying they would graduate from college is right on the money. If that’s an open ended question, I can’t imagine anyone saying they plan to go to college and not graduate.
Questions like that sound like adults trying, badly, to concoct probable answers from students, not like students actually answering questions.
Here’s another issue:
According to the survey results, the private school students were much more likely to know (43%-27%) that George Washington was the first president.
Leaving aside the fact that it’s more like 96% of all high school students actually know that,
My kids learned that in preschool. And then again in Kindergarten. And then again in first grade.
By second grade they knew who the first president was.
So the point is this: What proportion of private high school students also went to kindergarten at a private school? Not very many, I’d venture. So if they are that much more knowledgeable in high school about who the first President was, do we assume the private high school is spending time teaching that George Washington was the first president? If so, then what else are they teaching in the private high school that should have been learned in kindergarten? That ‘A’ comes before ‘B’?
Looking at the entire survey, not just the questions, adds even more evidence.
For example, when asked about such matters as whether their school treats everyone well, you get a very nice statistical distribution. However, I know from having conducted surveys on opinion issues like this in the past that a significantly higher number of people (and I’d venture even of schoolkids) are likely to answer at the extremes (i.e., ‘strongly agree’ or ‘strongly disagree’) than the low numbers shown by the survey. Even on questions that should provoke a strong opinion, all the data seems to fit a nice, normal distribution when it should more likely be bimodal. For example, I’d think even the G.I. would question whether only 2% of public school students give their teachers an ‘F.’ I could name more than that percentage of my kids’ classmates who want to blame the teachers for all their problems. But hey, once again, look at the nice normal distribution there, on a question that should very likely not be normal.
Even more strongly, look at the public school survey question on what the students want to do after high school. Note that vocational school is listed separately from the military. So if you look at just those who plan to attend college, 27% say they expect to attend college but not finish, compared with 39% (30% + 9%) who do plan to finish. This may fit the actual numbers (after all, a lot of people do flunk out of college), but I don’t think I’ve yet met a single young person who enters college intending not to graduate! Yeah, that’s the ticket: run up a lot of student loan debt and flunk! And they are claiming that that is the INTENT of 27/66 = 41% of public school students who DO plan to go to college?
Clearly THAT is a fabricated number. The person who did it looked at the approximate number of kids who actually do flunk out of college (or leave early due to a family emergency or other reason) and extrapolated backward, as if that's what they were planning in high school.
Duly noted, duly added (I just missed it the first time). But I left out the editorializing about it being the best link, considering the source of the endorsement. Besides, critiquing stat analysis is way above my pay grade.
Hey- You skipped the best one!
OK, that’s a biased opinion.
David: I saw your post on 538. The information from Dr. Ladner is new to me. I wrote a short report with historical comparison, and the results from my students. It’s listed in the 538 comments.
I have wrong answers. Though not as many respondents as Mr. Cannaday, I have the written sheets from all 41 of my students, so I can show all the responses. The distribution I have is very different from what’s shown in the charts. I will have all my responses, correct and incorrect, typed up and double-checked, by Wednesday. Then we can talk about comparison.
If you’re interested, can you email me? (firstname.lastname@example.org) I can’t find a direct link for you.
I thought this thing stunk to high heaven from the moment I read it and found the results simply unbelievable.
While I have yet to see any evidence to suggest Ladner or the GI were in cahoots with Strategic Vision LLC to purposely skew the data, I also find Ladner’s ‘we are the victims here’ schtick to be absolutely unbelievable.
The GI pays for studies and surveys they know are going to come out the way they want them to. Want to know what the impact of a tax increase will be on the Arizona economy? Commission the ‘limited government’ Beacon Hill Institute to give us the answer. Want a study on how Arizona should change its tax code? Commission Arthur Laffer to do a study. Want to compare public schools to private schools in civics? Commission Strategic Vision LLC, whose president is a right-wing Republican. Of course they are going to tell the GI what they want to hear, so no one there is even going to question the outcomes. They release these ‘studies’ out into the world, where they are then used to justify all sorts of policies. I suspect this is the real goal, and the truth be damned.
Very nice. I had read Nate Silver’s posts about Strategic Vision faking poll results, but I hadn’t put together that they are who does the polling for GI. This just gets better and better.