NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
When Governor Mike DeWine ordered the closure of Ohio schools last March to contain the spread of a deadly new virus, it was clear that Covid-19 would reshape the educational trajectory of many children in the state. Nearly a year into the pandemic, we now have clear evidence of its impact on student achievement.
In a new report, we analyze results from the fall administration of Ohio’s Third-Grade English Language Arts (ELA) assessment to document how the pandemic has affected student learning. Unlike other state exams, which usually take place in the spring, this test is given several times per year and thus provides an early glimpse of how Ohio students are faring. Despite the disruptions caused by Covid-19, more than 80 percent of current Ohio third graders showed up to take the test in late October and early November of 2020.
Specifically, we compared the performance of this year’s third-grade students on the fall ELA assessment to the performance of third graders in the fall of 2019. Importantly, we used detailed demographic records to account for lower participation rates among certain student subgroups in fall 2020, and to make the data more directly comparable over time. Unlike online, district-specific diagnostic tests, which raise concerns that parents may assist their children on the exams, the Ohio fall ELA assessment is administered on site and under the supervision of proctors.
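For readers curious about the mechanics, the sketch below illustrates one standard way such an adjustment can be made, post-stratification reweighting, in which this year’s test takers are weighted so that the subgroup composition matches last year’s. This is only an illustrative sketch under assumed inputs, not the authors’ actual method, and the data frame columns (`score`, `econ_disadvantaged`, `race`) are hypothetical.

```python
# Illustrative sketch only: reweight fall 2020 test takers so subgroup shares
# match fall 2019, then compute a weighted mean score. Column names are
# hypothetical and not taken from the report.
import pandas as pd

def demographically_adjusted_mean(scores_2019: pd.DataFrame,
                                  scores_2020: pd.DataFrame,
                                  group_cols=("econ_disadvantaged", "race")) -> float:
    """Weighted mean of 2020 scores, post-stratified to 2019 subgroup shares."""
    group_cols = list(group_cols)

    # Share of each demographic subgroup among test takers in each year.
    shares_2019 = scores_2019.groupby(group_cols).size() / len(scores_2019)
    shares_2020 = scores_2020.groupby(group_cols).size() / len(scores_2020)

    # Weight each 2020 student by how under- or over-represented their subgroup
    # is relative to 2019 participation.
    weights = (shares_2019 / shares_2020).rename("weight").reset_index()
    weighted = scores_2020.merge(weights, on=group_cols, how="left")

    return (weighted["score"] * weighted["weight"]).sum() / weighted["weight"].sum()
```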
Not surprisingly, the results show that student learning has suffered as a result of the pandemic. Third-grade ELA scores were considerably lower this past fall than they were the year prior. The achievement decline equates to about one-third of a year’s worth of lost learning for the average Ohio third grader. The decline was more pronounced among economically disadvantaged children, and it was especially large for Black students. Indeed, Black students’ scores declined by nearly 50 percent more than those of their White classmates, representing approximately half a year’s worth of learning. These learning losses are especially alarming given the achievement gaps that already separated less advantaged children from their peers before the pandemic.
Governor DeWine let individual districts decide how to reopen this past fall, and we find evidence that the decisions local officials made mattered. Although districts that reopened for five days of in-person learning this year also saw substantial declines in third-grade ELA scores, the drop was bigger in districts where learning remained completely virtual at the start of the year. (Districts that used some form of “hybrid” learning saw declines that fell somewhere between those of in-person and fully virtual districts.)
It is important to recognize that school closure is not the only reason student achievement took a hit during the pandemic. In addition to the challenge of adjusting to new and unfamiliar modes of instruction, some children also had loved ones fall seriously ill or even die due to complications from the virus. Many families have also faced financial hardships as unemployment reached record levels. Indeed, we find evidence that rising unemployment contributed to the drop in student performance, with larger test score declines in the parts of the state that experienced the sharpest job losses. Overall, our preliminary calculations suggest that Covid-related unemployment explains approximately one-third of the decrease in average test scores statewide.
In some ways, Ohio is unusually well positioned to begin helping kids make up lost ground. Few other states administer tests in the fall, so we have earlier indicators than most about the magnitude of these learning disruptions and a clear sense about which students (and districts) have seen achievement fall the most. It is important that policymakers begin making plans to remediate these impacts sooner rather than later, a conversation that seems especially timely as districts prepare to receive substantial new federal funding under the relief bill passed by Congress in December.
The Ohio General Assembly has pending legislation that would suspend other state tests this spring. However, our analysis highlights the value of state assessments for tracking the trajectory of student learning—to ensure that our remediation efforts are working. One Ohio state legislator, rightly noting that we already knew that students fell behind academically, asked “Why do we need to put them in a testing situation to find that out?” The answer is that these tests reveal precisely which students have been impacted most severely, so that resources and other efforts can be targeted most effectively.
For example, we suspect that, prior to reading this article or our full report, few policymakers knew how much more the achievement of Black students declined, or that test scores fell substantially more in areas of the state that experienced larger increases in unemployment. The data available thus far cover just one subject and grade, so the full cycle of state exams this spring will provide a much more complete picture of how the pandemic has affected student learning in Ohio and help our state develop a targeted academic recovery strategy. We also hope to conduct further analyses that help us identify districts that were particularly effective at overcoming their circumstances, so that Ohio educators can learn from their best practices.
Of course, it makes little sense to punish school districts for the test score declines or to hold them accountable for a global pandemic that was clearly beyond their control. Ohio lawmakers were right to pause the state’s school accountability system last spring, and we agree with their decision in December to continue this pause through the current academic year. However, even in the absence of accountability requirements or the publication of detailed report cards, state exams contain invaluable information—akin to a set of “vital signs” that can help monitor the health of student learning in Ohio.
Having made the case for state exams, we should also remember to look beyond test scores. In many ways, young people have borne the brunt of the pandemic. Emergency room visits related to children’s mental health are up considerably compared to previous years, and it is likely that extended school closures have allowed abuse to go undetected and unreported. With more limited access to school meals, many children have also gone hungry, even as a lack of physical activity during school closures may have caused significant weight gain. Test scores don’t directly capture any of these impacts, and it is essential that our recovery strategy is mindful of the numerous hardships Ohio children have faced over the past year.
Now is also the time for policymakers to begin planning for the next academic year. It seems clear that vaccines will not be available for most children by then. Under current safety guidelines—which require six feet of distancing and limit school bus capacity to one child per seat—many districts may be forced to remain in hybrid mode into the next school year, almost surely causing learning losses and achievement gaps to grow further. It is important to reexamine whether these precautions will still make sense once adults have been vaccinated and, if so, be proactive in finding ways to protect students from further disruption to their learning.
We commend the Ohio Department of Education for moving quickly to share these data with the public and for working with us to make this study possible. We know of no other state education agency that put this information in the hands of educators, administrators, and policymakers so quickly. Ohio is well poised to stay ahead of the pack in making the best of a tough situation, if we can use these data and what we learn from this spring’s state assessments to take action on behalf of students.
Vladimir Kogan is Associate Professor in The Ohio State University’s Department of Political Science and (by courtesy) the John Glenn College of Public Affairs. Stéphane Lavertu is Professor in The Ohio State University’s John Glenn College of Public Affairs. The opinions and recommendations presented in this editorial are those of the authors and do not necessarily represent policy positions or views of the John Glenn College of Public Affairs, the Department of Political Science, or The Ohio State University.