Cleveland: Work to do, but signs of hope
Much work has been done to transform Cleveland schools, with much more still to be done. We take a look at progress so far.
No one said it would be easy. Two years ago, Cleveland Mayor Frank Jackson, along with the city’s business, philanthropic, and education leaders, came to Columbus and asked Governor Kasich and the General Assembly to help them with legislation to reform the city’s long-struggling school system. The result, the “Cleveland Plan,” has drawn attention from around the state and across the nation.
The effort held the promise of allowing Cleveland to emerge from the bottom of the national heap in student achievement. The summer legislative victory in Columbus was followed by a successful levy campaign in fall 2012, and the school district was off and running, busily implementing the components of the plan.
Reform plans, if they’re actually going to work, change the way a school district does business—and as anyone who follows education reform knows, that’s hard to do. It should come as no surprise, then, that Cleveland Schools CEO Eric Gordon’s implementation of the plan has come under fire. Let’s take a look at some of the most recent challenges.
Impatience
Rising expectations are essential for a struggling school district trying to improve its academic performance, but when the improvement plan requires additional local support from the community through a property-tax levy, those expectations extend beyond the schools to every corner of the community. As reported by the Cleveland Plain Dealer, test scores in Cleveland’s investment schools (the lowest-performing schools “targeted for extra attention for improvement”) have been mixed and have not shown any clear improvement. Even though this is only the first major checkpoint since the levy’s passage, the apparent lack of progress will leave some scratching their heads and wondering if this is another failed attempt at reform.
It’s important that we reserve judgment, however, and give the schools a little time to improve. This year’s and next year’s state-assessment results will help us analyze the effectiveness of the plan. That analysis should focus both on schools’ performance index (the measure of how well students are doing on the state assessment) and their value-added scores (the measure of whether students are meeting or exceeding their individual expectations, based upon past performance). If both of these measures show progress, the overall school ratings will increase—even if it’s a slow ascent. There’s no doubt the challenge is great, as Cleveland has routinely been one of the lowest-scoring cities in the nation on the National Assessment of Educational Progress (NAEP). The good news, though, is that progress matters like never before. The decision by Mayor Jackson and Gordon to limit the levy to a specific period of time, even placing a countdown clock on the CMSD website, and to report back to the community on results means that never again will student achievement in the city be ignored.
Student-based budgeting
On top of the lack of clear academic progress, some of the systemic changes are proving to be even more of a challenge. Cleveland’s reform plan shifted the district’s organizational structure from a traditional model to a portfolio model. Portfolio models, advocated for by the Center on Reinventing Public Education (and others), are built on seven key components. The component garnering the most attention (mostly negative) in Cleveland is pupil-based funding for schools. (Additional details on pupil-based funding can be found here.)
While pupil- or student-based funding systems are designed to allocate resources based upon the needs of the students a school is serving, they can result in less funding for school buildings with declining enrollments. The approach provides transparency as to exactly how much money each building is getting, but teachers and parents with students in buildings receiving less than the year before will take little solace in this openness. Despite the ongoing controversy and increasing public concern, Eric Gordon deserves a tip of the cap for tackling the issue head on. Gordon recently sent a letter to education stakeholders in Cleveland explaining student-based budgeting and why it’s important.
In the letter, Gordon explained that student-based budgeting puts more control of school budgets into the hands of principals. Not only does it give them the responsibility for managing the budget, but it also empowers them to identify those things in the budget that would most help the students in the school make academic progress and to provide support to the staff of the school so they can help the students achieve. Patrick O’Donnell recently wrote about what this could mean for Cleveland principals. This type of principal and building autonomy is often absent in a district-controlled system, where everyone has a piece of the pie and the responsibility doesn’t rest on any one individual’s shoulders. When things don’t work well and a school struggles, superintendents can blame a principal, principals can blame teachers, and teachers can blame the bloated central administrative office.
Despite some growing pains, Cleveland’s new portfolio system offers hope that school leaders will have the autonomy necessary to drive student success with clear accountability if the school struggles.
Declining enrollment
The need for additional budget cuts isn’t related to student-based budgeting but is instead a result of an expected decline in enrollment. The district’s budget next year calls for 1,500 fewer students and a loss of $14 million, and the trend is expected to continue.
In the same letter in which Gordon explained the importance of student-based budgeting, he also talked about the importance of working “together in the coming months to retain our students and attract others back.” A new website created by the Cleveland Transformation Alliance will make it easier for parents to acquire high-quality information about their educational choices. If the district is successful in its recruitment efforts, Gordon promises that the additional funds generated by the increased enrollment will go directly toward school budgets. This represents a fundamentally different mindset. Gordon acknowledges the district is losing students, but instead of making excuses, he implies that the district will need to compete to win those students back.
This is the exact sort of competitive response that has occurred elsewhere and holds the most promise for the long-term value of school-choice programs. It’s not always easy for a district superintendent to publicly acknowledge the need to compete for students, but the fact that Gordon has suggests that CMSD has reached a turning point in its way of doing business. If he is successful and creates schools that are academically strong, attractive to parents, and safe, that will help to right the ship. And Cleveland families will be the direct beneficiaries.
The House Education Committee tucked two provisions into the Mid-Biennium Review bill that would alter the state’s calculation of student progress. They both relate to the value-added model (VAM), the state’s method for computing a school or district’s impact on student-learning progress over time.
Value added is a statistical model that uses student-level data, collected over time, to isolate a school’s contribution to learning. This calculation is a noble and necessary undertaking, given what research has shown, time and again, about the significant influence of out-of-school factors on students’ educational success (e.g., parents, tutoring, private art and music lessons, faith-based education, etc.).
If the objective is to gain a clearer view of the true effectiveness of a school—its educators and their approach to curriculum, behavior, scheduling, and so forth—we want to minimize the influence of those out-of-school factors. This clearer view of school performance matters for both high-wealth schools, which can skate by on the backs of upper-middle-class parents, and low-wealth schools, which can be handicapped in an accountability system based on raw proficiency measures.
I believe—and yes, to a certain extent, based on faith—that the state is moving in the right direction with its approach to value added.[1] But in my view, the House is making two missteps in its proposed changes to VAM. The following sections describe the provisions and explain why the state legislature should remove them as the bill heads to the Senate.
Provision 1: Changes value added from a three-year to one-year calculation
The amendment reads,
The overall score under the value-added progress dimension of a school district or building, for which the department shall use up to three years of value-added data as from the most recent school year available.
The amendment’s language would actually revert the state to its one-year value-added calculation, which it used from 2005–06 through 2011–12. Beginning in 2012–13, however, the state switched to a calculation that used students’ assessments over three years to compute a value-added score.
The about-face is concerning, and here’s why. Recall that value-added scores are estimates of a school’s impact on growth, with a degree of uncertainty related to that estimate. A school’s point estimate is the average student gain on math and reading assessments and is reported in Normal Curve Equivalent units (NCEs). Generally speaking, the larger the sample size, the smaller the degree of uncertainty around that estimate. Battelle for Kids writes in its guidebook, “Uncertainty around growth estimates is greater when the sample size is small.”
Thus, a one-year calculation—with a smaller sample size—increases the uncertainty around the estimate of a school’s value-added score—hardly a desirable property. The following charts display how the uncertainty around the VAM estimates changed, depending on whether Ohio used the one-year (Chart 1) versus three-year (Chart 2) calculation. Notice the wider confidence intervals—the blue lines that indicate the range of plausible VAM scores for a particular school—in the one- versus the three-year computations. Because of the wider range of plausible values, there is more uncertainty about the true value-added “effects” of schools.[2]
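The sample-size point can be made concrete with a back-of-envelope sketch. This is an illustration only, not the proprietary SAS model Ohio actually uses, and the cohort sizes and standard deviation below are made-up numbers: the standard error of an average gain shrinks with the square root of the number of student scores, so pooling roughly three years of data narrows the confidence interval by a factor of about the square root of three.

```python
import math

# Illustrative only -- not the proprietary SAS value-added model.
# The standard error of a mean gain scales as sd / sqrt(n), so a
# three-year calculation pooling ~3x the student scores cuts the
# 95% confidence-interval width by a factor of sqrt(3) ~ 1.73.

def ci_width(n_students, sd=12.0):
    """Width of a 95% CI around a mean gain of n_students scores."""
    se = sd / math.sqrt(n_students)   # standard error of the mean
    return 2 * 1.96 * se              # estimate +/- 1.96 * SE

one_year = ci_width(100)    # hypothetical: one cohort of 100 students
three_year = ci_width(300)  # hypothetical: three pooled cohorts
print(round(one_year / three_year, 2))  # 1.73
```

Nothing about the school changes between the two calculations; only the amount of data does, which is why the three-year interval is tighter.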
A concrete example might be valuable at this juncture. In 2011–12, KIPP Journey Academy—a high-poverty, high-value-added charter school—received a point estimate of 2.34 NCEs with a range of plausible values of 1.11 to 3.57 (a range of 2.46 NCEs). Remember: this estimate used just one year of VAM data. Then, in 2012–13, KIPP received an estimate of 3.53 NCEs with a plausible range of 2.99 to 4.07 (a range of 1.08 NCEs). By using a three-year average, the model was able to shrink the range of plausible values and to zero in on its estimate of the true impact of the school.
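The KIPP figures can be checked against the standard 95% confidence-interval arithmetic (point estimate plus or minus 1.96 times the standard error); here the standard errors are simply back-solved from the interval widths reported above.

```python
# Recovering KIPP Journey's 95% confidence intervals from the
# article's figures. CI = point estimate +/- 1.96 * SE; the SEs are
# back-solved from the reported widths (all values in NCE units).

def ci_95(point_estimate, standard_error):
    """Return (lower, upper) bounds of a 95% confidence interval."""
    margin = 1.96 * standard_error
    return (point_estimate - margin, point_estimate + margin)

# 2011-12: one year of data
se_one = (3.57 - 1.11) / (2 * 1.96)    # ~0.63 NCEs
lo1, hi1 = ci_95(2.34, se_one)         # (1.11, 3.57)

# 2012-13: three years of data
se_three = (4.07 - 2.99) / (2 * 1.96)  # ~0.28 NCEs
lo3, hi3 = ci_95(3.53, se_three)       # (2.99, 4.07)

print(round(hi1 - lo1, 2), round(hi3 - lo3, 2))  # 2.46 1.08
```

The shrink from a 2.46-NCE range to a 1.08-NCE range is exactly the “zeroing in” that the three-year average buys.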
However, a one-year estimate could be preferable if a school’s educators are substantially different in the most recent school year versus a couple years back. If this were the case, it might be unfair to hold a different group of educators accountable for the impact of their predecessors, either receiving undue credit or blame. Unless this phenomenon occurs frequently in practice, it seems preferable to seek greater precision in schools’ value-added estimates.
Chart 1: Greater uncertainty around VAM estimates under one-year calculation, Ohio schools 2011–12
Chart 2: Less uncertainty around VAM estimates under three-year calculation, Ohio schools 2012–13
Source: Ohio Department of Education. Notes: These charts display the value-added estimates (green dots) and the 95% confidence intervals—the range of plausible values—around each estimate for schools in Ohio (point estimate +/- 1.96 times the standard error). The top of each blue line represents the upper 95% CI and the bottom represents the lower 95% CI. Point estimates are the average learning gains for a school, in Normal Curve Equivalent units, averaged across grades (including grades 4–8) in both math and reading. For Chart 2, there may have been schools that had been open for just one or two years and hence did not have the full three years of value-added data.
Provision 2: Excludes some transfer students from value-added calculations
The Mid-Biennium Review bill adds the following amendment to state law:
For calculating the metric prescribed by division (B)(1)(e) of this section [overall value-added for a school or district], the department shall use assessment scores for only those students to whom the district or building has administered the assessments prescribed by section 3301.0710 of the Revised Code for each of the two most recent consecutive school years.
In effect, this provision would exclude a school’s incoming transfer students, be they transfers from another school or those making a structural move (i.e., from elementary to middle school). (Schools are allowed to exclude, for accountability purposes, the test results of mid-year transfers.) This raises both the mathematical issue of sample size in VAM calculations and, more importantly, a philosophical objection.
First, the mathematical: the “two-consecutive-years” clause would certainly decrease the sample sizes for a school’s value-added calculation. As discussed above, smaller sample sizes can increase the uncertainty around a school’s VAM estimate. Under this provision, a grade 6–8 middle school—a widely used grade configuration—would now have just two, instead of three, grade levels of available data. Seventh- and eighth-grade students would be included in the school’s VAM computation, but sixth graders would now be excluded, since they weren’t educated in that building for fifth grade. For grade 7–8 middle schools across Ohio—there were 149 of them in 2012–13—the state would have just one grade level, eighth grade, to calculate the school’s VAM.
Perhaps more fundamental than the statistical issue is a philosophical one. Some may argue that schools should not be accountable for “new” transfer students. But this argument falters as a matter of principle: if we actually think that schools should be helping all children to learn—not just the school’s long-time students—what is the compelling reason for excluding transfer students? A school is responsible—and accountable for—the students who walk through its doors.
* * *
Value added is not a “magic model” for computing a school’s impact on learning—there isn’t one out there—but the state has made improvements to value added in recent years. The House’s amendments should be reconsidered in light of how they could negatively alter the value-added calculations of schools and districts. Moreover, the second provision simply ignores the basic principle that all kids count.
[1] The value-added model that Ohio and several other states use is considered proprietary by SAS, the company that runs the statistical model.
[2] The bill does not clearly articulate whether the “subgroup” value-added scores would be a one- or three-year calculation. Presently, the plan is to calculate them as three-year averages once enough years of data are available. (The state used a one-year calculation in 2012–13, since it was the first year for subgroup VAMs.)
We invite you to check out our new Ohio Gadfly daily news blog posts, rounding up the most relevant education news stories from around the state and serving them up with a side of Fordham-style commentary by yours truly.
Here’s a taste of what we were commenting on last week:
Want more? Check out the Ohio Gadfly Daily every weekday around noon, or follow us on Twitter.
Elsewhere in this issue, I write at length about last week’s event on Ohio’s Third Grade Reading Guarantee and what it means to students in Columbus. As you’ll read, the mantra was “all hands on deck,” even while hosts, presenters, and audience members alike betrayed a worrying language of “reading is hard” and “tests are icky” that could easily undo a ton of great work.
And it didn’t stop at the door of the event.
Case in point: the Columbus Dispatch’s coverage of this event, which comprised two subtly different stories by the same journalist.
So maybe this is just perception, or maybe I’m being too sensitive, but the messaging concerns me. This is an important effort that must succeed and must continue to succeed for year upon year.
Last week, I attended a forum at the Columbus Metropolitan Club, hosted by our friends at KidsOhio.org, which showcased efforts in the city of Columbus to meet the challenge of Ohio’s Third Grade Reading Guarantee. The district’s work thus far is impressive: multiple citywide family literacy events held over the last four months, recruitment of “literacy-buddy” volunteers for in-school service, extensive training for reading interventionists, and even mustering the support of school-bus drivers to encourage reading every day. Is all of this effort going to make every third grader pass the reading test before the start of fourth grade? No. Is it going to improve upon the 48 percent passing rate achieved in the district last fall? Yes—and when it does, one long-standing barrier to achievement in my hometown schools will be overcome for hundreds of children.
And as for the mighty Columbus Metropolitan Library, voted more than once the number-one library system of its size in the country? Well, they’re trying really hard. Panelist Alison Circle noted several times that she and her staff are “out of their comfort zone” in an effort of this type. Nevertheless, they should be applauded for supplying books, recruiting volunteers, and making sure that schools and families know their doors are open to all in support of this “all-hands-on-deck moment” in our community.
It is fitting that attendees seemed most impressed with the stories told—of Columbus superintendent Dan Good’s mother joining him at a family literacy event and bringing his cherished childhood books to show to children; of a bus driver who recommends books to riders; of children who are eager to share that they just learned the word “cute” starts with “C” and not “Q.” But those signs of progress and hope are tiny and fragile and will require much sustained work to keep from falling backward, given the challenge the district faces.
And therein lies a cautionary tale. Just as words and stories of encouragement and support are absorbed by children and can fortify them, so too do children absorb words of test anxiety and the jokes of adults, with just the opposite effect.
All attendees at the event were given practice third-grade reading test booklets and we were told to complete them; our hostess even took time to give us all the answers later to check our work. I was the only person at my table to get beyond question four, let alone finish the test. I did finish it, and I got a 14/14 according to my grader. I say this not to brag but to caution against talk of “icky tests” and “you weren’t expecting to do this today, were you?” These types of sentiments passed on to budding readers could very easily undo all of the work that has been done already. Words of doubt didn’t stop at my table, either. The session itself included talk of “failing,” mistaken reporting that it’s one-and-done on testing, and even a tiny joke about using “books as weapons” on the bus. Such slip-ups indicate to me that while hands are on deck, minds may well be elsewhere. These must be avoided.
Children see, hear, and understand so much. It is imperative to keep to the message: reading is vital, reading is awesome, and all the tools needed are here and waiting for children and their families to overcome this one barrier right now and forever. And all these hands are ready to help—not because some law says we must but because we truly believe that this is important work that can be accomplished. We can’t let our guard down for a moment on this because we already know where that will get us. We already own that.
As the Columbus Dispatch editors put it so succinctly back in March, “Extraordinary effort will have to become the new normal.” So banish the “icky-test” talk, muster every available bus driver and grandma and dad, and get in there to fight.
Most of us are aware by now that Franklin Regional High School, near Pittsburgh, was recently the site of a terrible act of violence. That district also happens to be my home district. There, I had the good fortune to learn under the tutelage of many superb educators. The tragic consequences of the human condition struck home for me, as I’m sure they have for the families of Chardon, Columbine, Sandy Hook, and, just last week, for the parents and students of Liberty Elementary in Columbus.
Yet I also caught a glimpse, through the news feeds, of humankind at its finest and bravest: Principal Sam King—a good man whom I remember from my high-school days—helping to disarm the assailant and young men and women casting themselves into harm’s way to save each other’s lives. The light of men shone through, even in the darkest moment. My prayers and best wishes go out to my alma mater.