by David Safier
Man, Tom Horne has spent a lot of Ed Supe time and Dept of Ed money trying to knock down Tucson's Ethnic Studies programs. Who knows how much time he's wasted and how many miles he's logged traveling to Tucson just to hold press conferences trashing the program.
And now he's wasted DOE money and time creating a study by Robert Franciosi, PhD, Deputy Assoc. Supe of Research and Evaluation. The title: "The Effect of Tucson Unified Ethnic ("Raza") Studies on Student Achievement." His finding? The program has no effect on achievement in reading, writing and math.
Clearly, the only reason Horne ordered the study is his rage over proponents of the program claiming it raises student achievement. He'll show them! Damn the expense! Damn the time taken away from other research! Tom Horne is not the kind of guy who lets "brown people" (to quote Bruce Ash) get away with that without a fight!
Of course, raising achievement in reading, writing and math isn't the prime reason for the program. If student achievement went down, that would be significant, but if it stayed the same, as the study claims, the program may still have other important effects on its students.
Here's a copy of the study. It takes a group of Hispanic students from the Raza Studies program and a comparable group of Hispanic students from the same and other schools and looks at their AIMS scores. It only looks at students who took AIMS a second time, usually because they failed the first time or sometimes (possibly) because they wanted to raise their scores.
Since students take AIMS as sophomores and don't start Ethnic Studies programs until they're juniors, only the retakers can be compared. That's a pretty shaky way to study the overall achievement of students in the program, since it focuses on the students with the lowest achievement, not the average or above average achievers in the program. According to the study, 50-58% of the students passed AIMS the first time, so they weren't considered as subjects. Who knows? Maybe the more motivated and/or capable students in Raza Studies took a huge leap in achievement. We'll never know. Limiting your study to one skewed segment of the population — even if your control group is similarly skewed — makes for a very weak study with questionable results. As a matter of fact, it's hardly worth the bother of releasing a study whose results are so constricted due to the narrow slice of the population studied.
Another problem: I believe all the AIMS tests used in the study were taken during students' junior years, either in the fall or in the spring. A Raza Studies student would only have been in the program a few months in the fall and less than a year the next spring. That's not much time to see an effect. But we'll never know if there was significant achievement growth as seniors when they'd been in the program longer, because that's not part of the study.
And while I'm being picky, some Raza Studies students in the study were only taking one course in the program. If the study teased out the information a bit, separating students by the number of Raza Studies courses taken, it might have found that the scores differed depending on the level of involvement in the program.
(An Aside: You would expect that in a respectable research paper, the final paragraphs, evaluating the validity of the study as a whole, would mention these possible weaknesses. But this study has no final evaluation, period. Correct me if I'm wrong, but shouldn't any decent study have a few paragraphs of conclusions and analysis at the end?)
The study compared both the average scores of the Raza Studies students and the control group as well as the amount of growth in the scores of the two groups. Average scores were pretty much the same for both groups. And the study says that the difference in growth between the two groups was statistically insignificant. Therefore, there was no growth attributable to the Ethnic Studies program.
Maybe so. When you do your statistical regressions and arrive at your T-values (the meaning of which I've forgotten from my stat courses), the differences probably aren't statistically significant. But looking at the tables, I saw a clear trend. The Ethnic Studies students grew more than the control group across the board. It was only one point more growth in each area, but only in one case out of 12 measurements did the control group have greater growth in their scores than the Ethnic Studies group. That consistency in greater growth for Raza Studies students could have been mentioned, even if it wasn't significant. Careful, objective researchers do that kind of thing. No mention here.
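For the statistically inclined: the point that a consistent *direction* of difference can itself be informative is what a simple sign test captures. As a back-of-the-envelope sketch (using my count of 11 of 12 comparisons favoring the Raza Studies group, taken from my reading of the tables, not from the study's raw data), here's how lopsided that split is if you assume no real difference between the groups:

```python
from math import comb

# Hypothetical illustration: 12 growth comparisons, and in 11 of them
# the Ethnic Studies group grew more than the control group. These
# counts come from the post's reading of the study's tables.
n_comparisons = 12
favor_ethnic_studies = 11

def sign_test_p(successes, n):
    """Two-sided sign test: if each comparison were a fair coin flip
    (no true difference between groups), how likely is a split this
    lopsided or worse?"""
    k = max(successes, n - successes)
    one_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * one_tail)

p = sign_test_p(favor_ethnic_studies, n_comparisons)
print(f"two-sided sign-test p-value: {p:.4f}")  # ~0.006
```

Of course the 12 measurements aren't independent (they come from overlapping groups of students), so this isn't a legitimate hypothesis test of the study's data. It just illustrates why an 11-to-1 split in one direction is the kind of pattern a careful researcher would at least flag.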
Sorry, Dr. Franciosi. Your study may not have shown significantly greater growth by students in Ethnic Studies, but for a number of reasons, it doesn't show with any degree of certainty there was no growth. You've created a weak study with weaker conclusions, hardly worthy of seeing the light of day.
However, it served its primary purpose: a DOE news release putting down Raza Studies that was picked up by The Star and probably others. Here's what Horne says about the study:
". . . this scientific study establishes that differences between ethnic-study students and other Hispanic students are not statistically significant."
"This scientific study establishes . . ." Wrong. "A flawed study indicates the possibility that . . ." is much closer to the truth. But the truth doesn't serve Horne's ends nearly as well.
UPDATE: Commenter davewave64, who apparently knows more about the AIMS numbers than I do, confirms what I said and adds this:
One thing that you missed is the measurement used in the statistical calculations. Robert used the students' scale scores (a continuous variable) rather than the performance rating (an ordinal variable) of Falls Far Below, Approaches, Meets, Exceeds. While the scale score can be pressed through the statistical meat-grinder, it is irrelevant to the student, unless it is above the passing cut-point. What Robert failed to report is that students taking Ethnic Studies classes have a lower initial passing rate on the AIMS, but end up with an equal or higher passing rate by graduation. While the percentages may not be "statistically significant" you can bet the outcomes are "significant" for students who end up graduating because they become more engaged in their studies.
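The commenter's distinction between continuous scale scores and the pass/fail cut-point is easy to see with a toy example. The scores below are made up for illustration (and the cut-point of 500 is an assumption, not the actual AIMS cut score); the point is that two groups can have the *same* mean scale score while very different fractions of their students clear the bar that actually matters for graduation:

```python
# Hypothetical AIMS-style scale scores; 500 is an assumed passing
# cut-point, not the real one. Same mean, different passing rates.
CUT_POINT = 500

group_a = [470, 515, 515]   # mean 500, two of three pass
group_b = [510, 495, 495]   # mean 500, one of three passes

def pass_rate(scores, cut=CUT_POINT):
    """Fraction of students at or above the passing cut-point."""
    return sum(s >= cut for s in scores) / len(scores)

for name, scores in [("A", group_a), ("B", group_b)]:
    mean = sum(scores) / len(scores)
    print(f"group {name}: mean={mean:.0f}, pass rate={pass_rate(scores):.0%}")
```

A regression on the scale scores would see these two groups as identical; a student who needs to pass AIMS to graduate would not.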