It is one of the more remarkable press releases you’ll see.
It’s from the Department. And it’s about SIG.
First, the background.
At SIG’s onset, I went on the record predicting it would end as a monumental, possibly historic, waste of precious resources—investing billions of dollars in dysfunctional schools embedded in dysfunctional districts against the clear lessons of decades of research and experience. SIG was surely the morbid apotheosis of the turnaround craze.
So, of course, I’ve been yearning for real data showing how the program is doing. But I’m not the only one. Many others, far less rabid than I, have been pestering the Department for SIG student achievement data.
Late last year, the Department famously partook in the fine Beltway tradition of “Friday-night trash-dumping,” releasing a smidge of really bad news about SIG’s progress on the Friday before Thanksgiving Week. It showed that about 40 percent of participating schools had actually gotten worse. As for the rest, we were told nebulously that they made either “single-digit” or “double-digit” gains.
No school-level results; just aggregate numbers in the form of a bar chart.
And old numbers at that! Changes in performance from 2009–10 to 2010–11. And this was late November 2012.
Egad.
Eagerly have we been waiting for more.
So on Tuesday, there’s an announcement about SIG—a release in which the Department praises itself for its continued “commitment to transparency.” (I’m not making this up.)
I hoped that this would finally be the release of data showing, SIG school by SIG school, how proficiency rates had changed since the SIG interventions—data to show us whether the billions spent on the turnaround craze were put to good use.
No such luck.
The first product is “leading-indicator” data…from three years ago. Some of it is marginally interesting. There are SIG school graduation rates and daily attendance rates. But they are from three years ago.
Moreover, there are no comparisons to each school’s pre-SIG numbers or to those of non-SIG schools—just some numbers absent context. Not exactly illuminating.
But the document also contains rehash—numbers we’ve seen before, like the percentage of schools choosing each SIG model…three years ago.
And it includes input-y reporting stuff like “number of states with timely submissions.” (FWIW: 36)
And then there are state summaries: three-year-old data broken down by state.
If you think I’m going too far, note that Ed Week’s superb Alyson Klein is equally incredulous, though she maintains steely reporter impartiality. You really ought to read her very good piece on this.
2012–13 test data for every state is right around the corner, meaning the Department could publicly produce for each SIG school its 2009–10, 2010–11, 2011–12, and 2012–13 test scores. That simple. And they could do the same right now, just without the 2012–13 data.
So why release repackaged three-year-old information?
Why tell us about the timeliness of data submissions?
Why the Friday-night trash-dump?
Why the nebulous and aggregate “single-digit” and “double-digit” gains talk?
If it looks like a smoke screen, smells like a smoke screen, and acts like a smoke screen…
Maybe, just maybe—and I know this is going to sound really cynical—just maybe SIG results aren’t all that hot.
But at this point I’m not sure if we’ll ever know.