Last August in The Gadfly (see http://www.edexcellence.net/gadfly/issue.cfm?issue=93#1209), I reviewed the results of the New York City summer school program for 2001, the second time that the giant school system had attempted to corral more than 300,000 kids to return during the hot months for remediation or enrichment or Regents test preparation.
Of that large number, some 72,000 children in grades 3-8 had been ordered to go to summer school because of their academic deficiencies. Of that group, 8,000 did not show up. The summer remedial program was part of the Board of Education's attempt to end social promotion by directing help to those children who had fallen behind in math and reading. The results, as reported by the Board of Education at summer's end, were discouraging. Most who attended summer school failed their end-of-course exams in reading and math, but were promoted anyway. Two-thirds showed little or no improvement in math, and nearly 60 percent failed to improve in reading.
Average reading scores actually dropped for eighth graders, both in 2000 and again in 2001. Further, nearly three-quarters of the eighth graders scored in the lowest level of performance in reading and math after their summer of remediation.
After reviewing these dismal statistics, I recommended that the Board of Education try to learn something from the summer program about "what works" and what doesn't. It seemed to me that the school system ought to be able to use upcoming studies to figure out which methods were most effective, which curricula were most effective, and what kinds of teachers were most effective, especially with the lowest-performing children. It also seemed to be a good chance to learn about the effectiveness of private vendors.
Well, as it happened, the Board of Education had indeed contracted out an evaluation that aimed to answer some of these questions, and that evaluation has now been completed. But the evaluation, conducted by Metis Associates along with the Institute for Education and Social Policy at New York University, raises more questions than it answers. It begins by telling the reader that 374,411 students from kindergarten through high school "registered" for summer school, and "269,620 (72%) actually attended at least one day." For the balance of the study, the reader is asked to think of students who attended for as little as a single day as participants in the summer program. It would have been truly useful to know whether test results were better for those who attended almost every day, as compared to those who attended for only one or two or three days, and whether the students' amount of learning is related to their "time on task," but no such comparison is made.
The report shows that 16,000 teachers and 900 administrators worked in the program, but since we don't know how many children attended regularly, we know nothing about class size and whether it had any relationship to student achievement. The study should have been designed to shed some light on that important issue.
The report tells us that, of the forty units that administered summer programs (thirty-three community school districts and seven high school superintendencies), only twelve actually had clear goals and objectives for those programs.
The report finds that administrators and teachers, on the whole, were very satisfied with the program. But in light of the results released last summer by the Board of Education, with so many children failing to make discernible academic progress, this satisfaction seems unwarranted, to say the least. The best measure of the program's effectiveness must be whether children learned, not whether the providers were satisfied with their efforts.
We also learn that the summer schools were allowed to use the curriculum of their choice, and many different ones were in use. Nothing is said about whether some curricula were more effective than others in helping kids master math and/or reading.
We learn that some of the classes were taught with curricula supplied by private vendors, but an effort to compare the effectiveness of the private vendors was "inconclusive."
We learn nothing about the effectiveness of methods employed in different classes, nothing about whether teachers stressed phonics or whole language, this math program or that one, and whether some pedagogies functioned better than others or made no difference.
One question the report did examine was whether teacher certification bore any relationship to student achievement; the answer is that it did not. The report does not mention that the state and city have just issued mandates to exclude uncertified teachers from the public school classrooms of New York City beginning in 2003, mandates that will exacerbate current shortages and will probably be impossible to implement.
The report recommends that the summer program be continued and also that more evaluations be funded.
With the current budgetary pressures on the public schools, the summer program demands a far more discerning review. To the observant reader, even this friendly evaluation shows that the program has been run with minimal planning, with few goals, with inadequate results for students, with scant evidence of "what works," with no demonstrably effective curricula, but with enormous satisfaction on the part of providers. What's wrong with this picture?
"Summer School Draws More Critics," by Abby Goodnough, The New York Times, March 27, 2002