It would be an understatement to say that the 2015–16 school year was one of transition. Indeed, over the past twelve months, we lived through the implementation of the third state assessment in three years, the rollout of Ohio’s revised sponsor evaluation, and the introduction of a new state superintendent at the Ohio Department of Education (ODE). Change is reverberating throughout the system, and change is hard. As Charles Kettering once said, “The world hates change, yet it is the only thing that has brought progress.”
Charles Kettering was right. Lest we lose sight of the endgame, it is important to remember that the developments of the last twelve months have their roots in policy decisions designed to improve Ohio’s academic standards overall and its charter school sector—one that many viewed as rife with poorly performing schools and controlled by special interests—in particular.
Toward that end, in 2015–16 Ohio implemented assessments developed by the ODE and the American Institutes for Research (AIR). The AIR tests are the third assessment administered in Ohio’s public schools in three years, following the Ohio Achievement Assessments in 2013–14 and the politically charged and ultimately doomed PARCC tests in 2014–15. At the same time, the State Board of Education raised the standards for what it means for students to be proficient. Educators deserve a gold medal for dealing with the challenges of this messy transition.
Not unexpectedly, proficiency outcomes plunged across the state in 2015–16. When asked to comment on the new tests, Dayton Early College Academy deputy superintendent David Taylor responded, “We’re trying to project for the future and establish something that will help our kids be successful when they go into the real world and into a professional environment. Whether they’re going to college or career, kids have to have a certain skill set.” Although scores were low statewide, the higher standards should serve to better prepare young people for their futures and provide parents and taxpayers a truer picture of children’s educational outcomes.
Also in 2015–16, the ODE rolled out its reworked sponsor evaluation. Readers may recall that the ODE evaluated sponsors in 2014, but the results were rescinded due to a scandal—e-school results were not considered when judging the academic performance of each sponsor’s portfolio. It was a shame on several counts, as otherwise the evaluation was smart and rigorous.
The ODE recently released the results of its revised sponsor evaluation, including new ratings for all of the state’s charter school sponsors. Under the current rating system, sponsors are evaluated in three areas—compliance, quality practice, and school academic outcomes—and receive overall ratings of exemplary, effective, ineffective, or poor. Of the sixty-five Buckeye State sponsors evaluated, five (including Fordham) were rated effective, thirty-nine ineffective, and twenty-one poor. Incentives are built into the system for sponsors rated effective or exemplary (for instance, only having to be evaluated on the quality practice component every three years); however, sponsors rated ineffective are prohibited from sponsoring new schools, and sponsors rated poor have their sponsorship revoked.
[[{"fid":"117650","view_mode":"default","fields":{"format":"default"},"type":"media","link_text":null,"attributes":{"height":"543","width":"1458","style":"width: 600px; height: 223px;","class":"media-element file-default"}}]]
Evaluating sponsors is a key step toward accountability and quality control, especially in Ohio, where the charter sector has been beset with performance challenges. Indeed, the point of implementing the evaluation was twofold. First, the existence of the evaluation system and its rubric for ratings is meant to prod sponsors to focus on the academic outcomes of the charter schools in their portfolios. Second, the evaluation system is designed to help sponsors improve their own work, which would result in stronger oversight (without micromanagement) of schools and an improved charter sector. Results-driven accountability is important, as is continually improving one’s practice.
What happens next is also important. The ODE has time to improve its sponsor evaluation system before the next cycle, and it should take that opportunity seriously by strengthening both the framework and the process. Let us offer a few ideas.
First, the academic component should be revised to more accurately capture whether schools are making a difference for their students. Largely as a function of current state policy, Ohio charters are mostly located in economically challenged communities. As we’ve long known and are reminded of each year when state report cards on schools and districts are released, academic outcomes correlate closely with demographics. So we need to look at the gains that schools are (or are not) making, as well as their present achievement. In communities where children are well below grade level, the extent and velocity of growth matter enormously. Make no mistake: proficiency is also important. But schools whose pupils consistently make well over a year of achievement growth within a single school year are doing what they’re supposed to: helping kids catch up and preparing them for the future.
Achievement and growth must both be given their due when evaluating Ohio schools and the entities that sponsor them. Fortunately, Ohio will soon unveil a modified school accountability plan under the federal Every Student Succeeds Act (ESSA); this would be a perfect opportunity to rebalance school report cards so that they place appropriate weight on student growth over time, for all public schools and sponsors alike.
And because dropout-recovery charters are graded on a different scale than other kinds of charters, their sponsors may receive artificially high ratings on the academic portion of the sponsor evaluation. That needs fine-tuning, too.
The compliance component of the sponsor evaluation system also needs attention. The current version examines compliance with “all laws and rules,” a list of 319 provisions applicable to Ohio’s charter schools, many of which don’t apply to individual sponsors (a sponsor with no e-schools in its portfolio, for example, has no use for the rules governing such schools). Yet all Ohio sponsors were forced to gather and draft more than a hundred documents and memos, many of them duplicative, for each of their schools over a thirty-day period. A better approach would be to determine what applies and what matters most, then examine compliance against those provisions. For example, current item 209 (“the school displays a U.S. flag, not less than five feet in length, when school is in session”) is less important than whether the school has a safety plan (that is, how to deal with armed intruders). The ODE should check compliance with the most critical regulations on a regular basis while spot-checking the more picayune ones. Another option would be to review a sample of the required documents each year, much as an auditor randomly reviews transactions. The current compliance regimen is hugely burdensome and, in many cases, offers very little payoff.
The sponsor evaluation is critically important and reflects continued progress in Ohio’s efforts to improve charter school outcomes. But it’s also important to get it right if it’s indeed going to improve sponsor practice and, in turn, the charter sector. In its current form, it measures how well a sponsor responded to rubric questions and whether there were enough staff on hand to upload documents. It needs to move quickly to version 2.0 if it is to be a credible and effective instrument in the long term.
You can download the full Fordham Sponsorship Annual Report here.