Fooled Gold? Another look at the G.I. Civics Test

by David Safier

As you know if you've been reading this blog, I've had several bones to pick with the Goldwater Institute's studies comparing public and private school students in their knowledge of civics, tolerance of others and feelings about the schools they attend. (My criticism of the Civics study is here, and my criticisms of the other two are combined in this post.) But here's one objection that never occurred to me until two faithful readers emailed me material that puts the polls themselves into question. (Hat tip to todd and to Eli Blake.)

It's possible the polling company, Strategic Vision LLC, simply made up the numbers in the surveys it gave to Matthew Ladner at G.I. which formed the basis of the three studies.

Nate Silver at FiveThirtyEight believes Strategic Vision LLC concocts fraudulent surveys on a regular basis. In case you haven't heard of Silver: if you're looking for the smartest guy in the room — more specifically, in any room where statistics are the topic — Ladner and I would be hanging around in dark corners looking confused, and Silver would be standing in the spotlight with other stat heads gathered around listening to what he had to say.

In one of his many posts on Strategic Vision (I list all I could find at the end of the post), Silver looks at a survey Strategic Vision conducted in Oklahoma which asked the same 10 civics questions G.I. asked students in Arizona. It got equally dismal results, indicating that Oklahoma students, like those in Arizona, know very little about civics. Ladner was involved in setting up the Oklahoma survey and wrote about the results on the Oklahoma Council of Public Affairs website.

Funny thing, though. Oklahoma state Rep. Ed Cannaday asked the same 10 questions of high school students using the same basic methodology, and his students appeared to be about 3 times more knowledgeable than the ones Strategic Vision surveyed. Cannaday, by the way, is a former teacher and school principal, so he knows a bit about education.

[Cannaday] arranged to have all the seniors in the 10 secondary schools in his district take the Strategic Vision/OCPA survey. Cannaday tried to replicate the Strategic Vision survey to the greatest extent possible. The same exact questions were used, and as in the case of the original survey, the answers were open-ended rather than multiple choice. The survey was administered to a total of 325 seniors, including special education students.

Ladner has responded to the allegations that Strategic Vision made up the numbers, saying he will look into them. "If I got snookered," he wrote, "I'll own up to it, but the jury is still out."

As it happens, about 3 weeks ago, I asked Ladner if I could see copies of the actual Arizona survey results, and he was kind enough to fax me the two surveys used for all three Arizona studies — one survey of public school students and another survey of private school students.

I'll do some of my own analysis of questionable aspects of the surveys after the jump. And at the end of this post, you'll find a long list of links from a number of sites concerning Strategic Vision's questionable integrity.

I'll leave it to Silver to do the subtle statistical analysis, which, I must admit, goes way over my head. But here's some of my own brute-strength analysis of the students' answers to the survey questions.

On the survey, the students were asked 10 civics questions such as, "What are the two parts of the U.S. Congress?" and "Who was the first President?" The questions are open-ended, not multiple choice, so the students had no prompts as to the right answers. They either gave their best answer or said "I don't know."

One question asks, "What are the two major parties in the United States?" Before I give you the wrong answers the surveys received, or said they received, I want you to think of plausible wrong answers to the question.

Did you come up with "Communist and Democrat" or "Communist and Republican" as the two major parties in the U.S.? Neither did I. But the survey says 8% of private school students — 108 students — gave one of those two answers. The only other wrong answer on the private school survey was "Green and Socialist": 1% (7 students). Not one student gave any of the four possible combinations of "Green/Socialist" and "Republican/Democrat," although those seem like more plausible wrong answers. And there is no "other" category in the responses, indicating the students either made one of those three wrong choices (9%), got the right answer (60%) or didn't know (31%). To me, that doesn't smell quite right.
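For what it's worth, the raw arithmetic in those figures is at least internally consistent. A quick check, using only the numbers quoted here and the 1,350-per-group sample size mentioned below:

```python
# If 8% of the private school sample is 108 students, the implied
# sample size matches the 1,350 students per group cited in this post.
sample_size = round(108 / 0.08)
print(sample_size)  # → 1350

# The reported shares for the question should account for everyone:
wrong, right, dont_know = 9, 60, 31  # percentages from the survey
print(wrong + right + dont_know)     # → 100
```

So the percentages hang together; it's the suspiciously short list of wrong answers, not the totals, that smells off.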

The public school students' responses aren't broken down by the wrong answers. For some reason, the survey simply lists 12% as "Other." Why break down one set of answers and not the other? Again, it smells a bit fishy.

Interestingly, the Oklahoma survey cited by Nate Silver listed 10% as responding "Republican and Communist" to that question. Odd that Communists keep cropping up in high school students' answers. Commies aren't being used for political target practice much these days. Socialists and Nazis, yes, Communists, no.

"What are the two parts of the U.S. Congress?" another question asked. Both private and public school students replied either "The Senate and the House" or "Don't know." What, no "Supreme Court" or "President" as an answer from any of the 2700 students surveyed? It doesn't seem likely to me.

"How many Justices are on the Supreme Court?" Every private school student guessed 6, 8, 10, 15 or "Don't know." Why not 7, 9 or 11-14? And you mean to tell me, not one private school students knew the right answer is 9? Really? Public school students had a more compact cluster of numbers — 7, 8, 9, 10, 12 or "Don't know" — meaning a few of them (10%) came up with 9. With 1,350 students in each group, I would expect to see every reasonable number represented in both samples, and at least a few of the kids in private school would have known there are 9 Supreme Court Justices.

For "We elect a U.S. Senator for how many years?" not a single student in either group answered with an odd number. Both groups said 2, 4, 6, 8, 10 or "Don't know." That implies that all students understand Senators serve for an even number of years. It's possible, I guess, but if the students are as clueless as the survey indicates, someone certainly would have guessed 5 or 7, you would think.

Open-ended questions with a limited number of answers like this stretch the plausibility of this survey to the breaking point for me. It would be interesting to see if Cannaday's test results had a similarly limited number of wrong answers.

Number crunchers among you will probably delight in watching Silver's mind at work as he deconstructs the data on these and other Strategic Vision LLC surveys in some of the links below. And for anyone interested, there's plenty of information you can understand without knowing statistics.

Links about Strategic Vision LLC and the Civics survey.

Did the Dog Eat the Data? (, 4/23/09)

Mourning Constitutional (Oklahoma Council of Public Affairs, 9/1/09)

AAPOR Raises Objections to Actions by Atlanta-Based Strategic Vision LLC (American Association for Public Opinion Research, 9/23/09)

AAPOR "Raises Objections" to Strategic Vision's Non-Disclosure (, 9/23/09)

A Few More Questions for a Sketchy Pollster (FiveThirtyEight, 9/24/09)

Strategic Vision Polls Exhibit Unusual Patterns, Possibly Indicating Fraud (FiveThirtyEight, 9/25/09)

My thoughts on the Strategic Vision Controversy (Public Policy Polling, 9/25/09)

Embattled pollster defends methods (Politico 9/25/09)

Strategic Vision: Time for Transparency (, 9/26/09)

Are Oklahoma Students Really This Dumb? Or Is Strategic Vision Really This Stupid? (FiveThirtyEight, 9/26/09)

Comparison Study: Unusual Patterns in Strategic Vision Polling Data Remain Unexplained (FiveThirtyEight, 9/26/09)

Seen Through Sharper Statistical Lens, Anomalies in Strategic Vision Polling Remain (FiveThirtyEight, 10/5/09) [New addition]

Real Oklahoma Students Ace Citizenship Exam; Strategic Vision Survey Was Likely Fabricated (FiveThirtyEight, 11/8/09)

Civic Knowledge Polling Controversy (Jay P. Greene's Blog, 11/12/09)