Those D.C. test scores

by David Safier

This morning I posted about a Nicholas Kristof article praising D.C. School Chancellor Michelle Rhee's "reform" agenda. I noted his statement, "Test results showed more educational gains last year [2008] than in the previous four years put together," and said I would look into possible reasons for the gains. If they're legit, great. But I wanted to see for myself.

Look I did. And I found we have to attach a few qualifications to those higher scores.

The most important qualification: D.C. began using a new standardized test in 2006, one specific to D.C. That means, first, the scores can't be compared with results from the previous test used in past years, and second, they can't be compared with national results.

The new test had students give short responses, while the earlier test was multiple choice. That's probably an improvement, but it meant no one knew how to help students perform well on the new format. In the first year, the scores went down, probably for that reason.

The results during Rhee's first year as Chancellor, 2008, were for the third year of the new test. That gave the entire D.C. district time to learn the ins and outs of the test and figure out how to best "prepare" the students. Translation: teachers learned how to teach to the test. That would almost certainly give the scores a bump.

But did teachers prepare students to take the test on Rhee's watch? You bet they did, with a vengeance!

Rhee had schools conduct three pretests early this year [the 2007-8 school year] to gauge student progress. Students spent several hours a week taking practice tests. Teachers . . . analyzed the data and retaught material that students got wrong. Before the DC-CAS was administered in the spring, principals were required to devise a plan on how teachers would prepare for it.

If teachers used the three pretests and the weekly practice tests as diagnostic tools, that could have improved students' learning. But if they mainly used the tests to find areas where students did poorly and narrowly focused on helping the students do better, that's going to raise scores without necessarily raising achievement. Most likely, teachers did a little bit of both. So which had more to do with the bump in scores, an increase in student learning or an increase in test taking skills? It's hard to tell. No, it's impossible to tell.

And let's not forget that taking practice tests over and over and seeing their results will make students better at taking that test, regardless of what else happens in class.

Rhee began as Chancellor in 2007, so the jump in scores we're talking about happened at the end of her first year. She didn't have much time to implement a whole lot of change. Maybe she scared people into teaching better. Or maybe she scared them into taking the test more seriously. Again, it may be a little bit of both.

This is not an indictment of Rhee. She may be improving D.C. schools. But I wouldn't put too much stock in the rise in test scores as an accurate indicator of progress made on her watch.