Pedro Reyes and Joy C. Phillips, University of Texas at Austin
August 2001
A lot is going on in Houston by way of school reform, and there's plenty of interest in whether the various initiatives are succeeding. A number of reform efforts are loosely clustered in the Houston Annenberg Challenge, which has been underway since 1997 with substantial ($60 million) five-year funding from Annenberg and local matchers. Eighty-eight schools in Houston and five smaller nearby districts receive direct support, and a number of other ventures (e.g., professional development, institutes for teachers) are also underwritten by the Annenberg Challenge; some of these have already grown into larger initiatives with support from elsewhere.

To its credit, the Houston Annenberg program invited a research team led by the University of Texas's Pedro Reyes to conduct a "formative" evaluation. This report, dated August 2001 but issued in December, is called the "year two summary report," but in fact it reviews the Houston Annenberg program through its fourth year (2000-2001). The researchers claim to have found laudable progress, both the soft kind (e.g., teacher satisfaction and parent involvement) and measurable test-score growth by Annenberg-funded schools. The press release says, "Our Year Two research finds that Annenberg-funded schools have made progress-in the case of Beacon schools quite considerable progress-raising achievement levels for their students," and "Minority students are making even bigger gains." ("Beacon" schools are one of three subsets of participating schools.)

The problem is this: the Annenberg schools were hand-picked. In the case of the praised "Beacon" schools (11 of them, in 5 districts, which entered the program first and were concluding their fourth year when this study was done), they were selected "because they have already demonstrated the capacity to engage in school reform." One might even say they were cherry-picked. This report contains reasonably strong achievement data from the Beacon schools, but it also shows that, for the most part, they were doing better than the Houston average when they entered the program, and they were still doing better, by about the same margins, three years later. For example, in middle school math, the Beacon Annenberg schools (the reader is not told how many of these there are, nor how many are in Houston proper) surpassed the HISD average on the statewide TAAS test by 8 points in 1997 and by 7 points in 2001. The Beacon schools made greater gains than HISD in high school reading, but the other two categories of Annenberg high schools slipped in reading between 1997 and 2001 (as did HISD as a whole). As for gap-closing, two of the three Annenberg school categories made worthy gains for minority and poor kids (relative to white and middle-class youngsters), but here we're given no comparison data for the district as a whole.

In general, it's difficult to know what to make of the Annenberg schools' progress. The schools were hand-picked. They were doing better than the citywide average when they entered the program. There's no real control group other than the district as a whole. And much else was happening in Houston during this period, so we can scarcely tell what was caused by Annenberg and what may have been shaped by other influences. My beef isn't with the program; it's with this approach to program evaluation. I do not doubt that the adults involved in the Houston Annenberg Challenge feel good about it. But is the program the reason their students are learning more?
This study doesn't really shed much light on that. If you'd like to see it, surf to http://www.utexas.edu/projects/annenberg/index.html. Hard copies may be ordered by contacting The Houston Annenberg Challenge, First City Tower, 1001 Fannin, Suite 2210, Houston, TX 77002-6709; 713-658-1881.