If you feel amused or provoked by anything you read in the Education Gadfly, write us at [email protected]. From time to time, we publish correspondence that we think might interest other readers, such as the following letter.
Ray Domanico's November 14th Gadfly column criticized Standard & Poor's Michigan and Pennsylvania school district evaluations, available on the S&P School Evaluation Services Web site. Mr. Domanico may judge a "simplified rating" of school district performance to be of more value than a comprehensive analysis, but most parents, taxpayers, school administrators, educators, and policymakers would disagree.
Mr. Domanico asserts that S&P's School Evaluation Services' "bland" reports have not simplified data with easily understood analysis. In Michigan, we find them forceful and forthright. For example, in the opening sentence of its report on one of our troubled school systems, S&P evenhandedly concludes that students in the district achieve below-average results, yet operating expenditures are among the highest in the state. The comparison of each school system's performance with that of a group of demographically similar districts can be an exceptionally powerful catalyst for change. Such a comparison does not give policymakers and administrators opportunities to hide behind excuses; rather, it strips excuses away by showcasing districts with similar demographics that perform better with fewer resources. This is not "editorial embroidery"; it is well-placed analysis never before made available in Michigan on such a comprehensive scale.
Mr. Domanico's claim that this kind of analysis might "reinforce low expectations" is as baffling as the factual errors in his column, beginning with the statement that there is nothing on the S&P Web site that is not available on Web sites managed by the state. To the contrary, nowhere else in Michigan are there any calculations of each district's return on resources. The return on resources concept is a hallmark of the School Evaluation Services analysis. Only S&P's SES Web site publishes a summary indicator combining each district's scores and participation rates on the state test, ACT, SAT, and AP, together with dropout and graduation rates. In fact, a calculation of our state's test participation rates did not exist until Standard & Poor's analyzed and compared a variety of databases from several sources. For the first time, through SES, we present academic data disaggregated by race, ethnicity, gender, socioeconomic group, and special education status. The S&P Web site includes numerous benchmarks and trend data that pertain to student results, spending, return on resources, the learning environment, finances, taxes, debt, and demographics. This reporting system on school results exceeds even the most well-regarded state report cards.
All of the information described above is available at the school district level; some of it is available at the school building level in Michigan. The Michigan site does not yet include building-level financial data because this information is still being refined. Once it is, the data will be incorporated into S&P's analysis to examine each individual school's return on resources, an analytical finding that is hardly "behind the times." Until then, district-wide data most certainly warrant a thorough analysis because important policy determinations and resource allocations are made at the district level.
Mr. Domanico misses a critical point when he implies that S&P merely rehashes existing state data and reports. Besides providing a valuable analytical framework, S&P's evaluations provide value because S&P is not a vested party in the state's education system. The firm's impartiality is precisely what makes its analysis so widely accepted and usable by various stakeholders. It is that same impartiality and reputation upon which investors rely when reviewing Standard & Poor's municipal and corporate credit ratings. S&P makes fair assessments without jeopardizing the business relationships of its affiliated companies.
In just its first six months, S&P's School Evaluation Services Web site has received more than 260,000 visitor sessions and more than nine million page hits, more than half of them from parents. Mr. Domanico got it backwards when he suggested that states need to ask tough questions about the value S&P adds to school reform efforts. Rather, S&P is asking parents and policymakers to ask tough questions about the value added by the state's public school systems, and at least in Michigan, they are asking those questions.
Sincerely,
Madhu R. Anderson
Director
Center for Educational Performance and Information
Lansing, Michigan