The Ohio Department of Education is expected to release report cards for the 2016-17 school year by the end of this week. Like an annual checkup with a physician, these report cards offer valuable information on the academic health of Buckeye schools and students.
As many Ohioans know, state leaders have overhauled the assessment and report card system in recent years. To their credit, they’ve implemented more demanding state exams that now offer a clearer picture of student proficiency than former assessments did. The report cards themselves are much different from those in years past; they now include various A-F components that consider not only traditional measures like proficiency and graduation rates, but also pupils’ growth over time and their readiness for college or career. While Ohio legislators still need to do considerable work to help report cards function properly—we’ll be releasing several recommendations for tweaking them next month—the stability in state assessment policies and in key pieces of the school grading system is praiseworthy.
What are we keeping an eye out for when report cards drop? Here are three things:
Will the use of multi-year averages help to stabilize value-added ratings?
In recent years, one of the more concerning developments has been the instability of the value-added ratings—a critical gauge of student growth over time. Last year, a fair number of districts experienced dramatic swings in their value-added ratings relative to the prior year; for example, Dayton Public Schools went from an F to an A on this measure in the course of a single academic year. Some of the volatility we saw was likely due to the recent use of single-year value-added data, rather than multi-year averages, which help to smooth random fluctuations. (The one-year results were used as Ohio transitioned state exams.) Now that Ohio has implemented two years of the same assessment, value-added ratings based on multi-year averages should return. Multi-year averaging is a good practice that should yield clearer views of school effectiveness and hopefully bolster public confidence in this important accountability measure.
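As a back-of-the-envelope illustration—not the state’s actual value-added model, and using made-up numbers—averaging a school’s annual growth estimates over multiple years dampens the effect of any single noisy year:

```python
# Hypothetical annual growth estimates for one district (invented values,
# not real Ohio data). A single bad or lucky year swings the one-year
# number far more than the multi-year average.
annual_growth = {2015: -2.1, 2016: 3.4, 2017: 0.8}

def multi_year_average(growth_by_year, years):
    """Average the growth estimates for the given years."""
    values = [growth_by_year[y] for y in years]
    return sum(values) / len(values)

two_year = multi_year_average(annual_growth, [2016, 2017])    # 2.1
three_year = multi_year_average(annual_growth, [2015, 2016, 2017])  # 0.7
print(two_year, three_year)
```

The one-year figures here range from -2.1 to 3.4, while the averaged figures sit in a much narrower band—the same smoothing effect the state should see as multi-year value-added ratings return.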
Will Ohio’s charter schools again outperform district schools in the Big Eight?
Last year, brick-and-mortar charters fared modestly better than comparable urban district schools on value-added results. My analysis found that 29 percent of Buckeye charters received an A or B on value-added compared to 19 percent of Big Eight district schools. Conversely, a smaller proportion of charters were rated D or F compared to district-run schools (52 versus 68 percent). While one year of data cannot be deemed conclusive evidence that Ohio’s charter sector is on the upswing after the enactment of major reforms in fall 2015, we took it as a hopeful sign of improvement. We’ll be on the lookout to see whether the charter advantage persists with the addition of a second year of value-added data. As usual, we’ll again dissect the performance of Ohio’s urban charters in comparison to their urban district counterparts. And yes, this is our friendly reminder that it is misleading to compare urban charter performance to schools statewide—especially on proficiency-based measures! Which leads us to…
Will nearly all high-poverty schools get slammed on most report card components?
There used to be a TV show called Early Edition where the main character received the next day’s newspaper a day early. This meant that he knew for certain things like the winning lottery numbers and where and when tragedies would strike. Alas, high-poverty schools don’t need an early edition to predict their results on the vast majority of Ohio’s report card components. In all likelihood, they won’t be pretty. For as long as Ohio has issued report cards, high-poverty schools have struggled mightily when judged on the basis of “status” measures such as proficiency rates, performance indexes, graduation rates, and now college admissions test results. Research consistently indicates that disadvantaged pupils tend to start out behind their peers, which places high-poverty schools at a greater risk of receiving Ds and Fs on measures that don’t control for prior achievement—not necessarily due to ineffective instruction but because their students have such big gaps to close.
A quick quiz: Which is the one report card component that isn’t closely correlated with pupil background characteristics? Answer: The state’s value-added measure, which accounts for students’ prior achievement levels and hence evaluates school performance more evenhandedly by looking at student growth over time. However, this element is not given enough emphasis on the overall report card, leaving the public with the distorted view that virtually all high-poverty schools are failures—even those that are making an extraordinary impact on pupils’ academic trajectories.
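To see why status and growth measures can tell opposite stories, consider a toy comparison with invented numbers (a simple gain score standing in for the state’s far more sophisticated value-added model):

```python
# Hypothetical schools with made-up average scale scores in two consecutive
# years. "Status" judges only the current score against a cut point; "growth"
# asks how far students moved from where they started.
schools = {
    "high-poverty school": {"prior": 620, "current": 650},
    "affluent school": {"prior": 700, "current": 710},
}

PROFICIENCY_CUTOFF = 700  # assumed cut score, for illustration only

for name, s in schools.items():
    status = "proficient" if s["current"] >= PROFICIENCY_CUTOFF else "not proficient"
    growth = s["current"] - s["prior"]  # crude gain score, not actual value-added
    print(f"{name}: status = {status}, growth = {growth} points")
```

On the status measure, the high-poverty school fails and the affluent school passes; on the growth measure, the high-poverty school gained three times as much ground. That gap between the two lenses is exactly why proficiency-only comparisons shortchange high-poverty schools making real progress.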
* * *
Stay tuned to the Ohio Gadfly Daily as report cards are released. We’ll be looking into these—and other questions—and we hope you’ll turn to us as a source for objective, fair analyses of the report card data. And remember: You can also follow us on Twitter (@ohiogadfly) and Facebook for commentary and quick facts as report cards come out.