NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
The Covid-19 pandemic caused unprecedented disruptions to teaching and learning in Ohio, including school closures, sudden changes to instructional delivery, economic hardship, and social isolation. In January 2021, we released a report that examined the impact of the pandemic on Ohio students’ achievement, as measured by the fall 2020 administration of Ohio’s third grade English language arts (ELA) exam. That study documented significant declines in achievement relative to student performance on the same test prior to the pandemic—especially among traditionally disadvantaged students and students whose districts opted for remote instruction.
Ohio’s public officials—from Governor DeWine to school administrators—sought to stem the achievement decline by expediting a return to in-person learning and implementing learning recovery plans. For example, many districts expanded their summer school offerings and supported students by providing computers and Wi-Fi, tracking down those chronically absent, and coordinating a variety of social services. Importantly, and in spite of significant resistance, Ohio’s governor and legislators were determined to administer state tests in the spring of 2021 in order to monitor student learning, identify student populations that continued to struggle, and determine which educational and administrative practices work.
Our new report, examining student performance on Ohio state tests from this past spring, demonstrates the tremendous insights state assessments provide about the educational progress of Ohio students. Unlike data released in other states in recent weeks, our analysis carefully accounts for the lower-than-usual rates of student test participation. Because disadvantaged students who tend to be low-achieving were those least likely to take state exams this past spring, naïve estimates that fail to account for low participation rates significantly understate learning declines. Our analysis addresses this problem. We also take advantage of Ohio’s rich data to document differences in learning impacts across student subgroups and assess the consequences of different district educational delivery models.
Thus, once again, Ohio sets itself apart with its commitment to bringing the very best evidence to bear on state and local education decision-making. Specifically, our report uses Ohio’s detailed student-level data to do the following:
- Examine whether schools have been able to make up ground lost between March and November of 2020 by estimating student learning growth from November 2020 to April 2021 on third-grade ELA exams. This analysis helps isolate the learning disruptions that have taken place since the beginning of the school year, separating their impacts from those of closures and other disruptions in the spring of 2020.
- Estimate the total impact of the pandemic—from March 2020 to April 2021—in grades 5–8 and high school in both ELA and math. This analysis focuses on how much progress students have made in achievement over this entire time span and examines differences in learning disruption across student populations and modes of instruction.
Below, we highlight some major findings from our analysis. We encourage readers to consult the report for a more complete set of results.
1. Achievement declines continued during the 2020–21 academic year.
As we reported previously, the fall 2020 third-grade ELA exams indicated that Ohio third graders fell behind due to Covid-related disruptions from spring to fall of 2020. The spring 2021 tests reveal that these third graders did not catch up during the remainder of the 2020–21 school year. Instead, they continued to lose ground compared to previous cohorts of students.
Specifically, in ELA, third-grade students learned roughly 20 percent less on average between November 2020 and April 2021 as compared to students in prior years. Indeed, as of spring 2021, at least one third of the total pandemic-related achievement decline on third-grade ELA exams is due to decreased growth during the 2020–21 academic year. The remainder is due to declines that took place prior to the fall testing window (including but not limited to school closures in spring 2020).
We found similar overall impacts across all of the grades we examined. The total decline in student achievement in spring 2021 compared to prior years was roughly equivalent to between one-half and a whole year’s worth of learning in math and between one-third and one-half of a year’s worth of learning in ELA, depending on the grade level. In most grades, ELA proficiency rates decreased by about 8 percentage points and math proficiency decreased by approximately 15 percentage points.
2. Remote instruction is to blame for a significant portion of student learning declines.
Students in districts that spent the majority of the academic year using fully in-person instruction experienced smaller achievement declines than students in districts using either hybrid or virtual learning. These differences across modes of learning were somewhat more pronounced in lower grades compared to higher grades and in ELA compared to math. In ELA, districts with fully remote instruction for most of the year recorded test score declines 2–3 times larger, depending on the grade level, than districts that spent the majority of the year fully in person.
Our analysis of fall-to-spring achievement growth on the third-grade ELA test—as well as Ohio’s data on district mode of learning—confirms that this correlation between district mode of instruction and student achievement likely captures a real difference in the effectiveness of in-person and remote modes of instruction. We estimate that students had learning declines almost twice as large with remote instruction as compared to in-person instruction (-0.19 standard deviations instead of -0.10 standard deviations; see Figure 1).
Figure 1. Changes in fall-to-spring standardized test score growth on Ohio’s third-grade ELA exam in 2020–21 compared to pre-pandemic years, by district mode of instruction
Note: The figure presents the average fall-to-spring growth of normalized test scores in standard deviation units between pre-pandemic years (2018 and 2019) and 2021. These are regression estimates that compare changes in test scores over time for students who took the same exam in fall and spring of each year. Mode of instruction is determined based on weekly data submitted to the Ohio Department of Education for weeks between the fall and spring test administration windows. Due to changes in how mode of instruction was recorded over the course of the year, the “hybrid” category combines districts that offered fully hybrid instruction across all grades and districts that offered at least some in-person instruction for lower grades and remote instruction for older students.
3. Disadvantaged students were most negatively affected by the pandemic, further exacerbating achievement gaps.
Historically underserved student subgroups (measured by race, economic status, homelessness, disability, and English-learner status) generally experienced ELA test score declines that were 1.5–2 times larger than those of their peers. The analysis of fall-to-spring achievement growth on the third-grade ELA exam is particularly enlightening. For example, students who performed in the highest quartile of achievement in the fall learned as much between fall and spring of the 2020–21 school year as they did in years prior to the pandemic. Those in the bottom achievement quartile, however, learned about 25 percent less than usual. This difference is highly consequential because third-grade students in the bottom quartile typically experience two to three times more learning growth during the school year than students in the top achievement quartile.
4. Contrary to popular belief, students in later grades experienced the greatest learning declines relative to typical achievement growth rates.
In contrast to recent analyses examining achievement trends on district-administered assessments, we find little evidence that student achievement growth declined most in lower grades. Indeed, relative to typical achievement growth in each grade, students in upper-middle school and high school have fallen behind more than students in lower-middle and elementary grades. The greater decline in later grades is driven largely by steeper declines in math.
For example, between 2019 and 2021, fifth graders had test score declines of 0.34 standard deviations in math compared to prior cohorts, whereas students who took the high school geometry exam (usually in tenth grade) had test score declines of 0.29 standard deviations. At first glance, then, declines appear larger in earlier grades. However, younger students typically learn a lot more from year to year than older students. When we compare the raw test score declines to typical achievement growth in those grades, students in fifth grade fell behind the equivalent of just over half of a year of learning, whereas students who took the high school geometry test were behind the equivalent of a full year’s worth of learning. Other studies’ failure to account for differences in the amount of typical achievement growth across grade levels has led to considerable confusion and, perhaps, a failure to target interventions appropriately.
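The grade-level comparison above boils down to a simple normalization: divide each raw test score decline by the typical annual achievement growth for that grade and subject. The sketch below illustrates this arithmetic only; the typical-growth values are hypothetical, chosen so that the results match the "just over half a year" and "full year" figures quoted above, and are not taken from the report.

```python
# Illustrative sketch: expressing raw test score declines (in standard
# deviation units) as "years of learning lost" by dividing each decline
# by the typical annual achievement growth for that grade/subject.
# The typical-growth values here are hypothetical, picked only so the
# outputs line up with the figures cited in the text.

def years_of_learning_lost(decline_sd: float, typical_annual_growth_sd: float) -> float:
    """Convert a raw score decline into a fraction of a typical year's growth."""
    return decline_sd / typical_annual_growth_sd

# Grade 5 math: a larger raw decline (0.34 SD), but young students
# typically gain a lot each year (hypothetically 0.60 SD/year here).
grade5_math = years_of_learning_lost(0.34, 0.60)

# High school geometry: a smaller raw decline (0.29 SD), but older
# students typically gain much less per year (hypothetically 0.29 SD/year).
hs_geometry = years_of_learning_lost(0.29, 0.29)

print(f"Grade 5 math:  {grade5_math:.2f} years of learning lost")  # ~0.57
print(f"HS geometry:   {hs_geometry:.2f} years of learning lost")  # 1.00
```

Because the denominator shrinks so much in later grades, the smaller raw decline in geometry translates into a larger loss in years of learning, which is exactly why rankings based on raw score changes can mislead.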
Our findings—in conjunction with emerging research on the social-emotional impacts of distance learning and student isolation—make very clear that, typically, students learn much better when they can attend school in person. The academic literature also indicates that this is particularly true for low-achieving and disadvantaged students. The surge in Covid-19 cases due to the delta variant might tempt policymakers to rethink in-person instruction. Our analysis confirms that would be a bad idea for most Ohio students, and likely those throughout the nation. It is important to balance the serious near-term public health concerns posed by the more highly contagious form of the virus against the long-term impacts of learning disruptions—particularly since these disruptions have affected disadvantaged communities the most.
Our focus needs to be on opening schools as safely as possible—and making sure they can stay open. District leaders should make every effort to implement the layered mitigation strategies that we saw work so well during the last academic year and develop proactive testing and other approaches to minimize further disruptions from quarantines, which have already had a devastating effect on student learning. But more than simply prioritizing in-person instruction, we must ensure that students, especially those who have suffered the most learning loss, are getting the supports that they need to get back on track.
Vladimir Kogan is Associate Professor in The Ohio State University’s Department of Political Science and (by courtesy) the John Glenn College of Public Affairs. Stéphane Lavertu is Professor in The Ohio State University’s John Glenn College of Public Affairs. The opinions and recommendations presented in this editorial are those of the authors and do not necessarily represent policy positions or views of the John Glenn College of Public Affairs, the Department of Political Science, or The Ohio State University.