Over the last few years, the federal government has sent billions of dollars in emergency funding to states via several relief packages aimed at addressing the impacts of Covid-19. Ohio schools received more than $6 billion, with the largest influx arriving courtesy of the American Rescue Plan (ARP), the final federal relief package. In accordance with federal guidance, Ohio set aside 10 percent of these funds to pay for state-level initiatives. Although we have a general idea of how the state has been spending some of those relief dollars, there doesn’t appear to be an easily accessible website or downloadable document that offers a detailed accounting for the expenditures.
The same is true for districts and schools that are spending the remaining 90 percent of federal funding. The State Board of Education has discussed district expenditures at a few of their meetings. There are some news stories about initiatives and improvements that are being paid for. And several districts outline on their websites how they are using or plan to use the funds. But for the most part, this information is surface level—lists with vague and generalized phrases like “virtual tutoring” or “summer scholars” accompanied by a dollar amount—and districts rarely explain what their specific strategies are, how they align with overall strategic plans, or how they will likely impact students in the short- and long-term.
In some cases, this lack of information is a huge missed opportunity. Columbus City Schools, for instance, plans to implement “LETRS reading and training for elementary teachers [and] administrators.” But that phrasing alone isn’t enough to clue in the average parent or taxpayer that LETRS—Language Essentials for Teachers of Reading and Spelling—is an early literacy professional development program that’s rooted in the science of reading and has played a crucial role in the “reading miracle” happening in Mississippi. Considering that the district earned only one star on the early literacy component of the most recent state report cards, you’d think funding a promising literacy program would be something they’d want to shout from the rooftops. Instead, it’s buried in a list.
To some degree, the lack of detailed information about Covid relief spending in Ohio isn’t surprising. The federal packages that awarded these funds included very few guardrails, and though Ohio established a few of its own, they weren’t particularly stringent. At the end of the day, without strict public transparency rules to follow, it’s unlikely that most Ohio school bureaucracies will be readily transparent. But that doesn’t mean that districts and the state shouldn’t do a far better job of being open with the public. Indeed, a district in another state has shown us what’s possible.
In mid-December, The 74 published an in-depth look at Saint Paul Public Schools (SPPS) in Minnesota, where district officials are closely tracking and reporting their roughly $207 million in ARP spending. District leaders awarded federal funds to schools based on seven focus areas: systemic equity, positive school and district culture, effective and culturally responsive instruction, college and career readiness, program evaluation and resource allocation, family and community engagement, and safe schools. Within these focus areas, the district identified specific strategies aimed at addressing student outcomes impacted by the pandemic. Which schools are implementing each strategy, a list of activities associated with the strategy, and short-, medium-, and long-term target outcomes are all readily accessible on the district’s website. For example, in the college and career readiness focus area, the district has identified more than half a dozen strategies. Here’s a look at the information provided for one of them:
But SPPS didn’t stop there. The district’s budget dashboard also breaks down ARP funding by focus area and individual strategy. That means at any given time, anyone can visit the district’s website and click through an interactive tool that displays a brief description of what each strategy aims to accomplish, how much the district budgeted for it, and how much has been spent so far.
It’s important to note that based on what’s seen here, SPPS unfortunately didn’t establish any measurable, time-sensitive academic goals. But by identifying focus areas and strategies in such a specific way, the district did open the door for parents, advocates, and the general public to hold them accountable for results. For instance, as part of its systemic equity focus area, SPPS budgeted nearly $1.5 million for recruiting and retaining teachers of color. In a few years, if the district hasn’t diversified its teaching staff in a meaningful way, advocates will have the data to ask tough questions.
***
What’s happening in Saint Paul is a clear example of what’s possible when districts take transparency seriously. SPPS leaders could have settled for making a list of initiatives and costs, publishing it on their website, and then calling it a day. Instead, they identified focus areas aligned to the district’s strategic plan, broke them down into individual strategies, and then outlined what implementing those strategies would entail, how much it would cost, and how it could impact students. In doing so, they weren’t just fiscally transparent, but academically transparent, too. They clearly communicated to parents and families how they planned to address the pandemic’s impact on students, even if a focus on measurable academic achievement seems conspicuously absent. It remains to be seen whether the district’s efforts will indeed improve student outcomes. Initial data on several programs seems promising. But it matters that district leaders were so transparent about their spending.
As the excitement of a new year dwindles and Ohioans settle back into their familiar routines, policymakers and advocates are gearing up for yet another budget season. Governor DeWine is scheduled to release his proposed biennial state operating budget in just a few short weeks, and by June, a host of new legislative provisions will likely become law. Here’s a look at three education issues that deserve a lot of love and attention in the budget.
1. Early literacy
National and state-level data indicate that pandemic-related disruptions have had a negative impact on student learning. The good news is that achievement has rebounded in elementary and middle school English language arts. The bad news is that 208 of the state’s 607 districts, roughly one-third, earned only one or two stars out of five on the early literacy component of state report cards. Ample data show that reading proficiently by the end of third grade is a “make-or-break” benchmark for kids. For the tens of thousands of students attending these struggling schools, a difficult road lies ahead unless state and local leaders step up their efforts.
Fordham outlined several ways that state lawmakers could do so in a policy brief published last fall. One idea is to create a state reimbursement program for schools that purchase high-quality instructional materials. Such a program was implemented with great success in Louisiana, and could be fruitful here, as well. Ohio lawmakers could also take cues from Mississippi, which has significantly improved its early literacy outcomes. One of their key strategies was investing in literacy coaches, who provide varying levels of support to schools based on their needs. Ohio legislators could follow suit by authorizing the Ohio Department of Education to hire, train, and deploy reading coaches to districts with two consecutive years of low ratings on the early literacy component of state report cards.
2. Education-to-workforce pathways
Results from nationally representative surveys indicate that a large majority of teenagers wish their high schools provided more information about the post-secondary options available to them. Seventy-four percent said it’s important to have a career in mind before they graduate, but only 39 percent reported taking classes or participating in programs that allowed them to explore careers. And although many of the jobs that teenagers report being interested in have career and technical education (CTE) pathways associated with them, only 20 percent of high schoolers believe that CTE could lead to the career they want. The upshot? Teenagers both need and want more information about education-to-workforce pathways, as well as more opportunities to access them.
Engagement with employers and higher education is equally important. Ohio already has a solid foundation thanks to state laws and grants aimed at funding collaboration between businesses, education and training providers, and community leaders. There are a few promising partnerships that already exist and are definitely worth celebrating. But there’s still a long way to go. To ensure that education-to-workforce pathways reach their full potential, Ohio needs more state-led programs and initiatives aimed at gaining buy-in from the business community and higher education institutions, as well as fostering collaboration with K–12 schools.
There are several ways Ohio lawmakers could address the needs of students and ratchet up their engagement with businesses and higher education:
Increase industry engagement by offering additional incentives for employers to participate in career exploration and work-based learning opportunities for students. For instance, the Governor’s Workforce Board in Rhode Island oversees a Work Immersion program that helps employers train prospective workers through paid internships by reimbursing them at a rate of 50 or 75 percent for wages paid to eligible participants. And in Delaware, a collaboration between education, business, and government leaders has produced a statewide program that offers K–12 students the opportunity to complete a program of study aligned with an in-demand career.
Link students’ K–12 and higher education records with workforce outcomes, such as wages, career fields, and unemployment records. This would help state leaders, educational institutions, and employers better understand gaps in student readiness and collaborate to create data-driven solutions.
Collect and analyze high-quality data on industry-recognized credentials (IRCs), which have the potential to offer a plethora of benefits for high school students and employers alike. Right now, though, there is significant misalignment between the credentials employers demand and those that Ohio students actually earn. To fix this, state leaders need to identify which credentials are part of high-quality career pathways, which students have access to those pathways, and which credentials have the best return on investment.
Establish a statewide informational campaign that focuses on reaching out to students and families and ensuring that they know about all the CTE options available to them.
3. Teacher shortages
Over the last few years, Ohio districts have expressed growing concern over teacher shortages. Although the pandemic has played a role in these staffing struggles, it’s important to recognize that they existed long before Covid. It’s also important to recognize that shortages don’t exist everywhere. A recent paper published by Brown University’s Annenberg Institute for School Reform found that teacher staffing issues are “highly localized,” which makes it possible for teacher shortages and surpluses to exist simultaneously.
There are plenty of ways Ohio could fix its teacher pipeline issues. Fordham offered several ideas in another policy brief published last year. But to effectively address the most pressing problem—a lack of teachers in some districts, grade levels, and subject areas—state leaders need detailed and consistent data. And right now, they don’t have it. Ohio doesn’t collect and maintain statewide data on teacher vacancies in a single, easily accessible place, so it's difficult to pinpoint the size and scope of the problem and identify potential solutions.
Lawmakers could easily fix this issue in the budget. The best way to do so would be to take a page out of North Carolina’s book and require an annual report about the state of Ohio’s teaching profession. The report could include data on teacher vacancies, attrition, and mobility, all of which could be disaggregated by region, district, grade level, subject area, teacher experience level, and teacher demographics. Consistent access to such detailed data would allow researchers to track trends over time, and would help state and local leaders proactively identify problems and solutions.
***
For better or worse, Ohio does the bulk of its education policymaking through the biennial budget. The next few months will be chaotic and potentially controversial. But amidst all the hustle and bustle, there will be opportunities to get things done. In these moments, lawmakers would be wise to remember early literacy, education-to-workforce pathways, and teacher shortages.
Over the past year, one of the most heavily debated topics in Ohio education has been the retention provision of the Third Grade Reading Guarantee, a decade-old package of early literacy reforms. Under the retention policy, schools must hold back students (with limited exceptions) who are struggling to read at the end of third grade and provide them intensive literacy supports. This requirement aims to ensure that all children have foundational reading skills before they are asked to tackle more challenging material in the middle and upper grades.
Despite the sound rationale, critics have long decried the policy as being hurtful to retained students. Their claims are often based on anecdote and crude interpretations of data. In November, the State Board of Education—a body that has been hostile to retention—presented data showing that less than one in six retained students achieve the state’s reading proficiency target in subsequent years. Based on these numbers, board members argued that the policy “has not achieved the desired result” and passed a resolution asking the legislature to scrap the requirement (which lawmakers have, so far, not done).
Yet such a brazen condemnation of Ohio’s Reading Guarantee is hardly warranted based on these data. Instead, as the debate continues, policymakers should heed more credible evidence about the effectiveness of retention, including a brand-new study that examines Indiana’s third-grade retention policy, to which we return a few paragraphs hence.
Let’s first review some problems with using raw proficiency numbers to make judgments about retention.
For starters, retained students could be making good progress in later grades, but focusing only on their “proficiency”—a relatively high bar that roughly 40 percent of Ohio students fall short of—would overlook those gains. Obviously, ensuring that every student is a proficient reader is an important goal for schools. But progress toward proficiency matters, too. Perhaps a retained student is moving from the 2nd to 15th percentile by fifth grade. That type of growth should also be part of any evaluation of the retention policy. Moreover, the raw numbers lack any context that could help us understand the actual impact of retention. How do retained students perform relative to other low-achieving students who narrowly pass the reading requirement? Do they make more or less progress than their close counterparts? Answers to such questions would provide a clearer picture of whether retention is better for low-achieving students than the alternative of “socially promoting” them.
Unfortunately, a careful evaluation of Ohio’s third-grade retention policy has not yet been undertaken. That should certainly change. But there has been strong empirical work from Florida that uncovers positive effects of retention under its early literacy law (those findings are discussed in an earlier piece). A recent report published by the Annenberg Institute at Brown University also reveals positive impacts of third grade retention in Indiana.
The analysis was conducted by Cory Koedel of the University of Missouri and NaYoung Hwang of the University of New Hampshire. Akin to Ohio’s and Florida’s reading policy, Indiana requires third graders to achieve a certain target on state reading exams in order to be promoted to fourth grade. The policy went into effect in 2011–12 and the analysts examine data through 2016–17. Using a “regression discontinuity” approach, Koedel and Hwang compare the fourth through seventh grade outcomes of retained students to their peers who just barely passed Indiana’s promotional threshold. This methodology (also used in the aforementioned Florida study) provides strong causal evidence—almost as good as a “gold standard” experiment—about the effect of holding back low-achieving third graders.
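The intuition behind the regression-discontinuity design can be illustrated with a small simulation. The sketch below is hypothetical—it is not the Koedel and Hwang analysis, and all numbers (cutoff, effect size, bandwidth) are invented for illustration. It simulates a sharp cutoff where students scoring below a promotion threshold are retained, then estimates the "jump" in fourth-grade outcomes at the cutoff by fitting a line on each side:

```python
# Hypothetical sketch of a sharp regression-discontinuity estimate.
# Assumes (for simplicity) that retention is fully determined by
# falling below a promotion cutoff on the third-grade reading exam;
# the 0.3 SD "retention effect" is an invented illustrative number.
import numpy as np

rng = np.random.default_rng(0)

n = 10_000
cutoff = 0.0                      # promotion threshold (standardized score)
score3 = rng.normal(0, 1, n)      # third-grade score (the "running variable")
retained = score3 < cutoff        # students below the cutoff are held back

# Simulated fourth-grade outcome: varies smoothly with the third-grade
# score, plus a positive retention effect of 0.3 SD.
score4 = 0.6 * score3 + 0.3 * retained + rng.normal(0, 1, n)

# Local linear RD estimate: fit a separate line on each side of the
# cutoff within a narrow bandwidth, then compare predictions AT the cutoff.
bandwidth = 0.5
left = (score3 >= cutoff - bandwidth) & (score3 < cutoff)
right = (score3 >= cutoff) & (score3 <= cutoff + bandwidth)

fit_left = np.polyfit(score3[left], score4[left], 1)    # retained side
fit_right = np.polyfit(score3[right], score4[right], 1)  # promoted side

# The discontinuity at the cutoff is the estimated effect of retention.
rd_effect = np.polyval(fit_left, cutoff) - np.polyval(fit_right, cutoff)
print(f"Estimated retention effect at the cutoff: {rd_effect:.2f} SD")
```

The key idea, which the real study exploits, is that students just below and just above the threshold are essentially identical except for the retention decision, so the gap between the two fitted lines at the cutoff isolates the causal effect.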
Here are Indiana’s impressive results:
In fourth grade, retained students achieve much higher state exam scores—in both math and reading—than students who just barely passed the promotional threshold in third grade. The academic boost for retained students persists through seventh grade, though the magnitude of the impact somewhat fades over time.
The results are consistently positive across student groups, with the average Black, Hispanic, and White student experiencing gains from retention. Likewise, both economically disadvantaged and non-disadvantaged retained students post higher subsequent scores than marginally promoted peers.
The study finds no significant impacts of retention on disciplinary or attendance outcomes—a finding that helps to alleviate concerns that retention demotivates students or leads to negative behavior at school.
The authors conclude, “Taken on the whole, our findings of positive achievement effects of the Indiana policy, coupled with the lack of negative effects on attendance and disciplinary outcomes, suggest grade retention is a promising intervention for students who are struggling academically early in their schooling careers.”
When lawmakers passed the Buckeye State’s Third Grade Reading Guarantee more than a decade ago, they did so because they recognized the importance of early literacy to students’ long-term success. Dropping the retention provision of the guarantee based on anecdotes and flimsy data would be reckless, potentially leaving thousands of Ohio students at risk of not receiving the extra time and support they need to read fluently. Holding back third graders struggling to read has worked in other states. It can work—and may very well be working—in Ohio, as well.
We need many more effective, “expert” teachers in our schools. Thus, a new report aiming to synthesize numerous past studies in order to identify attributes that make a top performing teacher should be of interest to practitioners and policymakers alike.
In undertaking this first-of-its-kind effort—and accepting at the outset that there is likely no simple definition of “expert”—researchers Jason Anderson and Gülden Taner sought to include every available resource. They employ a “metasummary method,” which allows both qualitative and quantitative studies of teacher expertise to be included together in the metadata. This not only allows the largest number of studies to be considered at the outset, but also keeps more studies in the final mix. As long as the original findings can be abstracted to a common level, they are included in the metasummary data.
Using four databases (ERIC, Proquest, Web of Science, and Google Scholar), Anderson and Taner cast a wide net around terms related to teacher expertise/expert teaching. The first pass yielded 5,323 works, including published and unpublished papers, reports, books, chapters and articles, as well as Ph.D. and master’s degree theses. Nearly 90 percent of these were rejected because they were non-empirical (opinion pieces, practical teaching guides, and the like) or were studies that did not involve teachers characterized as experts despite including a related term in the body. In their second pass, the researchers looked to include any work that featured a) empirical findings on aspects of one or more “expert” teachers’ cognition or practice in primary or secondary school settings, and b) some effort to define expertise beyond years on the job and whatever each school, state, or country being investigated marked as basic qualified teacher status—things like student outcome data, advanced certification, or status as a teacher educator.
Their final metasummary includes 106 studies. All were published between 1983 and 2021 and together included 1,124 expert teachers. Most came from the United States, but China, the United Kingdom, Australia, Germany, Estonia, Hong Kong, and India contributed some, as well. Sixty-six focused on teachers at the secondary level, thirty-one at the primary/elementary level, and the rest mixed grade levels. Eighty-three were qualitative in nature, thirteen quantitative, and the rest mixed methods. More than a quarter of the reports looked at general education teachers at the primary level and teachers of various subjects at both primary and secondary levels. The remainder covered teachers of specific subjects, with math teachers comprising the majority of these.
In the end, findings in seventy-three works made the cut. Pedagogic practice, professional practice, and knowledge base were the three most commonly cited domains of expertise. That is, an expert teacher knows what her students need, understands various means by which to deliver instruction to each of them, and continually refines her knowledge and practice. More specifically, Anderson and Taner explain that expert teachers are driven by “moral duty” toward their learners and exhibit unconditional care for them; have a passion for teaching, a positive self-image, and a desire to succeed in their profession; have a habit of reflection and a desire for lifelong learning; and work in collaboration with their peers. Expert teachers motivate their students to work hard, reflect critically, and cope with challenges on their way to success. They assess student progress, spot misunderstandings, and offer ongoing feedback and individualized support to erase any barriers to progress. Sounds like a high-quality practitioner for sure.
Anderson cautions that the attributes as summarized here should not be seen as a “prescriptive checklist” that school leaders should be looking for when hiring, but rather as a set of vital skills to be supported, nurtured, and developed in teachers as they progress from novices to veterans. It would be interesting to determine if any external measures (master’s degree, additional college coursework, etc.) correlated with expertise, but that is beyond the scope of this metasummary. At the least, schools of education should take heed of these attributes. “This study is more holistic than previous research in the field,” he says, and paints “a more complete picture of teacher expertise than individual studies have been able to do to date. It links aspects of the knowledge, cognition, practice, and personalities of expert teachers in ways that are likely to benefit stakeholders across primary and secondary education.”
One hallmark of charter schools—distinct from their traditional district peers—is flexibility in their HR practices. Ideally, this means that charters are free to manage their vital human capital in order to better attract and retain great teachers—particularly from non-traditional backgrounds—quickly and efficiently replace low performers, create incentive pay structures, and deploy talent with an eye toward maximum impact on students. However, it is an open question whether charter schools are able to actually capitalize on these flexibilities. A report recently published in the Journal of Public Economics looks at labor mobility within and across the charter and district school sectors in Massachusetts in an effort to provide an answer.
Researchers Jesse Bruhn, Scott Imberman, and Marcus Winters start by creating a simplified model of what is termed “regulatory arbitrage”—the extent to which an employer is able to maneuver within a given set of regulations—in the education sector. In the model, which is based on Bay State regulations, charter schools may hire licensed or unlicensed teachers at a wage of their choosing. Traditional district schools are assumed to have a continuous demand for teachers, to be required to hire only those who are fully licensed, and to be bound to wage levels determined by collective bargaining. Teachers are assumed to have an equal desire to remain in the field regardless of school type, and both charter and district teachers are assumed to be able to move freely between school types—in pursuit of optimal pay—even though full licensure is often onerous and expensive to obtain if pursued later in one’s career. The simplistic model indicates that, given the cost of licensure and hiring flexibility around it, charter schools will hire unlicensed teachers and have high attrition relative to traditional public schools. Ineffective charter school teachers will exit the profession entirely and effective charter school teachers will obtain licensure (if they don’t have it) and switch to traditional public schools. It is important to note that no differential is assumed between charter and district funding levels. While they may approach parity in the Bay State, it is very different in other states, a situation which would render the idea of incentive pay a distant dream.
Using data on the entire universe of public school students and teachers provided by the Massachusetts Department of Elementary and Secondary Education from the 2008–09 to the 2016–17 school years, the researchers’ findings largely reflect their model. Charter schools are more likely to lose both their highest- and lowest-performing teachers relative to traditional district schools. Where departing charter teachers end up also mimics the model: Experienced teachers that are high performing within their charter tend to move to other public school employment opportunities; inexperienced teachers that are low performing within their charter are more likely to exit the Massachusetts education system entirely. Among unlicensed teachers working in a charter school, the decision to obtain a license predicts a sudden and permanent increase in the probability of them moving to a traditional district school. Incidences of teachers moving from traditional district schools to charter schools are “nearly non-existent” in their data.
There are some obvious limitations to the study: Lack of data on private school hiring, no information on teachers that move out-of-state, and lack of generalizability beyond Massachusetts due to varying teacher regulations and funding schemes in each state.
But, in general, the researchers conclude that regulatory flexibility in terms of hiring rules and pay scales does not result in charter schools being able to recruit and retain the best teachers. If incentive pay structures exist in charters, they are not enough to hold onto talented educators. However, charters might be able to deploy the talent they have more efficiently than traditional district schools. The data show that charters are able to maintain their academic quality in spite of relatively high teacher turnover. The fact that charters tend to stay at the same performance level year after year while regularly losing their best teachers to higher-paying jobs indicates that something inherent in the charter model may be driving student results (for better or for worse) beyond a singular reliance on teacher quality. Perhaps a focus on culture and recruiting teachers who are simpatico could be the key flexibility rather than money.
Additionally, the researchers hypothesize that charter schools are creating a positive externality on traditional district schools by increasing the average quality of teacher labor available to them. That is, the lowest-performing charter teachers exit the profession quickly and permanently—a rare occurrence for low-performing district teachers due to tenure and union rules—while the highest-performing charter teachers are motivated to move on to districts, arriving with more experience than first-year teachers hired directly from college.
More study is required to focus in on each of these many possibilities, but it seems clear that there is more than meets the eye when it comes to regulatory flexibility and the teacher workforce.
The Thomas B. Fordham Institute seeks an energetic, detail-oriented, and highly motivated Research and Data Analyst. The individual will provide analytic and research support for the Institute’s Ohio-based team, which engages in K–12 education research/analysis and policy advocacy at the state level. This is a paid, part-time position based out of Fordham’s Columbus, Ohio, office. The position may be performed remotely, but preference will be given to candidates with ties to Ohio or a strong interest in Ohio’s education system and state government. This is an ideal role for someone with a deep desire to learn about education policy and develop their analytical skills. Policy topics that the Research and Data Analyst may engage in include: school accountability, educational choice, school funding, and career-and-technical education.
Responsibilities:
Conduct original analyses of Ohio’s education data, with guidance from Fordham’s Ohio research director.
Keep track of and review reports published by state agencies and other education groups that address issues in K–12 education.
Review the academic literature on topics that are being debated.
Write policy and data analyses for the Ohio Gadfly Daily as well as shorter reviews of recently-published research.
Keep abreast of legislation that is moving through the Ohio General Assembly.
Pull data from national datasets or other states’ databases, as needed.
Preferred qualifications:
A passion for education reform and deep commitment to Fordham’s mission.
Proficiency in Excel required. Familiarity with other statistical programs a plus but not required.
Excellent verbal and written communication skills.
Committed to excellence, detail-oriented, and highly organized.
Pursuing or holding a Bachelor’s degree in economics, political science, public policy, or related field.
About the Thomas B. Fordham Institute:
The Thomas B. Fordham Institute and its affiliated Foundation promote educational excellence for every child in America via quality research, analysis, and commentary, as well as advocacy and exemplary charter school authorizing in Ohio. For more about the Institute, please visit: www.fordhaminstitute.org. The Fordham Institute is committed to providing equal opportunities and being an inclusive workplace that reflects the diverse backgrounds and experiences of the students and families we aim to serve.
To Apply:
Qualified candidates should submit a cover letter and resume as a single PDF here.
For two decades, Ohio’s district and school report cards have been the linchpin to a transparent and accountable public school system. Report cards provide key information about how students perform on the year’s state assessments and how they are growing academically over time. In more recent years, Ohio has added measures of high schoolers’ postsecondary readiness as well as elementary school students’ progress in reading. These data—along with user-friendly ratings based on them—assist parents who are making school decisions for their children, and they provide communities with annual checkups on the academic quality of their local schools. In some circumstances, state policymakers rely on report cards to identify low-performing schools that need intervention and additional supports.
Given these critical purposes, Ohio needs a report card that provides a clear and accurate picture of school performance across the state. We at Fordham have long been staunch supporters of the goals and aims of Ohio’s report card, but we’ve also shared concerns about how its prior version was functioning. In a 2017 paper, we outlined how the system had gotten bogged down with too many measures and ratings, didn’t adequately differentiate school performance, and placed excessive emphasis on “status” measures that tend to correlate with demographics.[1] Others, largely representing school administration groups, leveled harsher criticisms, and some offered proposals that would have undermined the overarching goals of the report card.
After several years of debate, Ohio lawmakers overhauled the school report card in 2021. The reform legislation—House Bill 82 of the 134th General Assembly—won near-unanimous approval in both chambers, and Governor DeWine signed it into law.[2] We, along with other education groups, supported the legislative action. But it’s no secret that strong reforms on paper can go awry during implementation. This paper takes a closer look at Ohio’s redesigned report card in its first year of implementation (the 2021–22 school year). What are the key changes, and how were they put into practice?
In brief, the main revisions include the following.
First, the legislation shifts from A–F school ratings to a five-star system. Although letter grades remain the most widely understood grading system, school officials criticized them and pushed for their removal. Feeling pressured, lawmakers weighed several alternatives and ultimately decided to go with star ratings, a system that continues to offer a clear sense of school performance, perhaps without being as politically charged. In 2021–22, Ohio assigned star ratings to schools on five components—Achievement, Progress, Graduation, Gap Closing, and Early Literacy—with a sixth component rating (based on postsecondary readiness) tentatively slated for 2024–25.
Second, the reforms wisely preserve and refine schools’ Overall rating, Ohio’s longstanding “bottom line” evaluation that combines schools’ performance on the various components. This mark continues to offer parents and the public a user-friendly summary of school quality, while, in a policy improvement, placing heavier weight on the Achievement and Progress ratings—the two central components of the state report card. Overall ratings were withheld in 2021–22 but will appear next year.
Third, the legislation makes dozens of technical adjustments to streamline the system and strengthen its various components. The notable revisions include (1) removing the duplicative “indicators-met” dimension of the Achievement component, thus yielding a clearer focus on the key performance-index (PI) measure; (2) adding a value-added “effect-size” growth measure in the Progress component that allows us to better pinpoint highly effective or ineffective schools; and (3) overhauling the Gap Closing calculations to ensure that schools are held accountable for both the achievement and academic growth of designated student groups (e.g., economically disadvantaged). Changes such as these are discussed further in the report.
This analysis also uncovers one issue that still needs more work: the insufficient rigor of the Gap Closing component. Last year, more than 60 percent of districts and schools received four- or five-star ratings on this measure, despite significant learning losses and widening achievement gaps coming out of the pandemic.[3] Such rosy results can be explained by a couple of decisions made by the state board of education and the department of education during implementation. First, while setting the grading scale was not easy given the significant changes to the component, the scale—in hindsight—ended up being too soft. Schools could meet less than half—just 45 percent—of the performance indicators and still receive four stars. Additionally, the achievement targets for subgroups were set too low. For instance, 75 percent of schools met the state’s PI goals for economically disadvantaged pupils, even as those very students suffered large learning losses. Moving forward, policymakers should increase the rigor of this component and make certain that it offers an honest picture of how effectively schools are educating all student groups.
To its credit, Ohio is moving to a report card that offers transparent ratings to parents and the public, treats schools more evenhandedly, and has a stronger technical foundation. It’s one that state leaders should be proud of and confidently stand behind. With some smart tweaks as the new framework is implemented, Ohio will finally have a report card that is built to last.
Analysis of Ohio’s revamped school report card
After several years of debate, Ohio lawmakers overhauled the state’s school report card in 2021 via House Bill 82 of the 134th General Assembly. The legislation preserves several core strengths of the previous system, including (1) upholding Ohio’s longstanding commitment to using objective measures of performance on state exams; (2) maintaining the PI and value-added measures as the key indicators of pupil achievement and growth, respectively; and (3) preserving the state’s Overall rating, which combines results across multiple report-card components to generate a user-friendly summary for parents and the public.
Yet it’s also different. The most visible shift is replacing A–F letter grades for districts and individual schools with a five-star rating system. Appearing on the 2012–13 to 2018–19 report cards, letter grades moved Ohio to a more widely understood rating system than the state’s prior approach, which had relied on ambiguous labels such as “continuous improvement.” School officials, however, bitterly criticized letter grades and pressed to scrap ratings altogether or reinstate vague labels. Others (including Fordham) raised concerns that such opaque approaches would hide the ball. In the end, state lawmakers reached a reasonable compromise. The new five-star system continues to offer a transparent picture of school quality, one that parents and the public can easily grasp, while taking some of the sting out of the ratings.
The reform legislation also made numerous technical changes that streamline and improve the system. These refinements were undertaken in response to concerns from Fordham and other education groups that the report card wasn’t functioning properly. Fortunately, legislators were willing to roll up their sleeves and make the fixes needed to strengthen the report card. Table 1 reviews the most important changes, with more detailed analysis of five key components in the pages that follow. Discussion about the Overall rating, including tweaks made to the methodology for calculating it, appears here.
Table 1: Summary of Ohio’s report-card components and recent changes to them
Component 1: Achievement
Student achievement on state assessments has long formed the backbone of Ohio’s report-card system—rightly so, as measures of achievement offer Ohioans a clear sense of where students in their local districts and schools currently stand academically. They shed light on the basic question of whether students are struggling in core academic subjects or exceeding state standards.
For many years, Ohio has deployed two measures—proficiency rates and PI scores—to present a picture of student achievement. As noted in the table above, the recent reforms eliminated the use of proficiency rates via “indicators met” in the rating system (though these rates are still reported). Instead, Ohio now relies entirely on the PI to determine Achievement ratings. The two measures are highly correlated—schools with high PI scores tend to have high proficiency rates (and vice versa)—so removing one of them streamlined the rating system. In contrast to the more simplistic proficiency rate, the PI uses weights that provide more credit to schools when students achieve at higher levels. In an accountability setting, this structure encourages schools to pay attention to all students—including high and low achievers—rather than incentivizing a narrow focus on students around the proficiency bar. The table shows the calculations using 2021–22 data from Columbus City Schools, the largest district in Ohio.
Though the index’s weights have remained consistent over time, the recent legislation slightly alters the way scores translate into ratings. Previously, schools’ PI ratings were based on their score divided by the maximum number of points possible (120), not considering “Advanced Plus.”[4] Now they are determined by dividing scores by the average of the top 2 percent of districts or schools statewide (107.3 for districts and 109.1 for schools in 2021–22).[5] This “curve,” which was championed by school administrator groups, slightly boosts Achievement ratings and explains why—despite the decline in scores statewide[6]—more schools received high marks on the Achievement component in 2021–22 compared to 2018–19.
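To make the arithmetic concrete, here is a minimal sketch of the PI calculation and the new “curve.” The weights follow the scheme Ohio has published for the index (Advanced Plus = 1.3 down to Untested = 0, so that a school of all-Advanced students would score the old 120-point maximum), but the level names and function names below are illustrative, not official code.

```python
# Illustrative sketch of Ohio's performance index (PI). Weights reflect
# the state's published scheme but should be treated as assumptions here.
PI_WEIGHTS = {
    "advanced_plus": 1.3,
    "advanced": 1.2,
    "accelerated": 1.1,
    "proficient": 1.0,
    "basic": 0.6,
    "limited": 0.3,
    "untested": 0.0,
}

def pi_score(counts: dict) -> float:
    """Weighted points per 100 tests: higher achievement earns more credit."""
    total_tests = sum(counts.values())
    points = sum(PI_WEIGHTS[level] * n for level, n in counts.items())
    return 100 * points / total_tests

def pi_percentage(score: float, top_2pct_avg: float) -> float:
    """The new 'curved' measure: a school's PI score expressed relative to
    the average of the top 2 percent of schools statewide."""
    return 100 * score / top_2pct_avg
```

For 2021–22, a district’s score would be divided by 107.3 and a school’s by 109.1, which is why the curve slightly boosts ratings relative to the old fixed 120-point denominator.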
Figure 1: Distribution of Achievement ratings in Ohio schools, 2018–19 and 2021–22
While achievement measures provide insight into where students stand, they disadvantage high-poverty schools. That reality reflects persistent achievement gaps that are partly driven by socioeconomic factors. The PI scores and Achievement ratings for 2021–22 continue to follow this pattern. Figure 2 shows that schools with higher percentages of economically disadvantaged students tend to have lower PI scores. Table 3 reports that most high-poverty schools receive one- or two-star Achievement ratings (86 percent), while just 5 percent of low-poverty schools receive such marks.
Figure 2: PI scores versus economically disadvantaged students in Ohio schools, 2021–22
Table 3: Achievement ratings by poverty tier in Ohio schools, 2021–22
Component 2: Progress
Recognizing the limitations of evaluating school performance solely based on achievement metrics, analysts have developed student growth measures as a way to create a more even playing field for schools serving children of differing backgrounds. These measures rely on statistical techniques that gauge a school’s contribution to changes in student achievement over time. Because these methodologies control for a student’s prior achievement, schools of all poverty levels have more equal opportunities to demonstrate academic growth. To offer a simplified illustration of this type of method, consider a high-poverty school whose average student starts the year at the twentieth percentile. At the end of the year, this student scores at the twenty-fifth percentile. That five-percentile gain is recognized under a growth model, even though this student still hasn’t reached proficiency.
For more than a decade, Ohio has used a “value-added” growth model.[7] Under Ohio’s former report-card system, the state relied on the value-added index score to determine Progress ratings. While that score indicates whether students’ gains or losses are statistically significant, it doesn’t offer a clear sense of their magnitude. This was less than ideal. A large district, for instance, could eke out a minuscule gain of one percentile yet receive a top rating because of the strong statistical evidence (due to its large numbers of students) that the gain was different from zero. To gauge the size of the impact more appropriately, Ohio began implementation of a value-added effect size in 2021–22, which is now paired with the traditional index score to determine Progress ratings. Taken together, the two measures now offer a better depiction of whether a school’s impact on pupil growth is both statistically and practically significant.
The following table displays how Ohio combines the two measures to determine Progress ratings for individual schools. It’s a two-step process in which the state first considers the index score—using a framework largely similar to the previous one[8]—and then applies the effect size to differentiate four- versus five-star schools and one- versus two-star schools.
Table 4: Progress rating framework for Ohio schools
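The two-step logic can be sketched in a few lines. The index cut points below are assumed from the prior system’s scale (note 8) and the ±0.1 effect-size thresholds come from the notes on district ratings (note 9); the actual 2021–22 school-level scale may differ, so treat this as an illustration rather than the official formula.

```python
def progress_stars(index_score: float, effect_size: float) -> int:
    """Sketch of the two-step Progress rating (assumed cut points).

    Step 1 uses the value-added index score (statistical evidence);
    step 2 uses the effect size (practical magnitude) to separate
    five- from four-star schools and one- from two-star schools.
    """
    if index_score >= 2.0:            # strong evidence of above-average growth
        return 5 if effect_size >= 0.1 else 4
    if index_score <= -2.0:           # strong evidence of below-average growth
        return 1 if effect_size <= -0.1 else 2
    if index_score >= 1.0:
        return 4
    if index_score <= -1.0:
        return 2
    return 3                          # growth indistinguishable from average
```

Note how a large district with a statistically significant but tiny gain (high index, small effect size) now lands at four stars instead of five, which is precisely the refinement the new measure was meant to deliver.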
The Progress ratings confirm that the new framework better pinpoints high and low performers. Under the former system, Ohio identified almost four in five schools as either A’s or F’s. With the application of the effect size, fewer schools are now at the “tails” of the distribution: 13 percent were assigned one-star Progress ratings, and 16 percent received five stars. It’s important to note that in 2021–22, schools could post strong Progress ratings even though their students still lagged their pre-pandemic peers in achievement. If a school did comparatively well (relative to other schools) in moving its students ahead from 2020–21 to 2021–22, it would have received a solid rating.
Figure 3: Distribution of Progress component ratings in Ohio schools, 2018–19 and 2021–22
Historically, Ohio’s value-added system has produced poverty-neutral results that help to identify high- and low-performing schools serving students of varying demographics. That pattern surfaced again in the 2021–22 data. Figure 4 reveals almost no correlation—indicated by the flat regression line—between a school’s value-added index scores and its percentage of economically disadvantaged students. The same holds true for the effect size results.
Figure 4: Value-added index scores (top) and effect sizes (bottom) versus economically disadvantaged in Ohio schools, 2021–22[9]
As for the component ratings, schools of all poverty levels fare about equally well. Table 5 shows that 18 percent of high-poverty schools received five-star Progress ratings, a slightly larger proportion than low- and mid-poverty schools. Conversely, 15 percent of high-poverty schools received one-star ratings, while 9 percent of low-poverty schools did.
Table 5: Distribution of Progress ratings by poverty tier in Ohio schools, 2021–22
Component 3: Graduation
Very little changed within the Graduation component in the recent overhaul, as it continues to rely on both the four- and five-year graduation rates to determine the rating. The grading scale, however, was slightly tweaked—likely an effort to comply with a new provision that requires the state board of education to set grading scales in a way that avoids identifying more than 50 percent of schools in any single rating category.[10] As Figure 5 shows, almost half of Ohio high schools received A’s on the Graduation component in 2018–19, so the board raised the bar slightly. This adjustment reduced the percentage of high schools receiving five-star ratings to 30 percent in 2021–22.[11]
Table 6: Grading scale for the Graduation component, 2018–19 and 2021–22[12]
Figure 5: Distribution of Graduation component ratings, Ohio high schools, 2018–19 and 2021–22
Component 4: Gap Closing
The Gap Closing component focuses on the outcomes of specific student subgroups that are identified in federal and state law (i.e., economically disadvantaged, English learners, special education, and race/ethnicity). This “disaggregated” look at results helps to ensure that historically disadvantaged students are not being overlooked and their results are not masked by overall school averages. Unfortunately, the structure of the old Gap Closing component was unnecessarily complex[13] and failed to properly account for both the achievement and growth of the various subgroups.[14] The term Gap Closing is also something of a misnomer, as the component includes traditionally high-achieving subgroups and doesn’t directly gauge whether gaps are closing. There were efforts to change the component name in the reform legislation, but they did not pass.
Though conceptually similar—Gap Closing continues to focus on subgroup outcomes—state lawmakers overhauled the component in the recent legislation. Instead of a complicated scoring system, the new framework deploys a more straightforward methodology that looks at all the subgroups and gives schools credit when they meet an achievement or growth target.[15] Table 7 shows the system in action, using data from Columbus City Schools. To illustrate, the table shows only the English language arts (ELA) achievement and growth results—the same format is used to evaluate subgroup performance in math as well as four-year graduation rates. It also displays five “special” elements within the Gap Closing component, including three that focus on gifted students.
Table 7: Illustration of the new Gap Closing calculations [16][17][18][19]
Given the learning losses and widening achievement gaps, the most surprising result from the 2021–22 report card is the number of schools that received high Gap Closing ratings. As Figure 6 indicates, 61 percent of Ohio schools received four- or five-star ratings on this component—more than earned top marks in 2018–19, a year when achievement was higher.
Figure 6: Distribution of Gap Closing component ratings, 2018–19 and 2021–22
Two factors—both issues that emerged during implementation—help to explain such results:
First, a soft grading scale buoyed these ratings. While the restructuring of the component made it difficult to project scores and set a grading scale, the state board ended up setting the bar too low. Schools could miss 40 percent of the performance indicators and still earn five stars, and they could miss a majority of them (55 percent) and receive four.[20] Under administrative law, the board must review the component grading scales within the next two years. As part of that process, it should implement a more rigorous Gap Closing scale.[21]
Second, the subgroup PI targets set by the Ohio Department of Education (ODE) are also too low. As table 8 shows, more than 70 percent of Ohio schools met the achievement goal for economically disadvantaged students and more than 60 percent met goals for Black and Hispanic students, despite the learning losses and widening achievement gaps for these groups. While these goals will rise over time, they still don’t set rigorous expectations. None of the subgroup goals for 2024–25 match their pre-pandemic achievement levels, sending the message that the state is willing to tolerate learning losses into the second half of this decade (see the Appendix for the state’s amended performance goals).
Table 8: Percentage of schools meeting math and ELA PI goals by subgroup[22]
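The softness of the grading scale is easy to make concrete. Using the scale cited in the notes (five stars at 60 percent of possible points, four stars at 45 percent, and so on), the rating logic reduces to the sketch below; the function and variable names are illustrative.

```python
# Gap Closing rating sketch, using the 2021-22 grading scale from the
# report's notes: 5 stars = 60-100 percent of possible points,
# 4 = 45-60, 3 = 30-45, 2 = 10-30, 1 = 0-10.
GAP_CLOSING_SCALE = [(60, 5), (45, 4), (30, 3), (10, 2), (0, 1)]

def gap_closing_stars(points_earned: float, points_possible: float) -> int:
    """Convert the share of Gap Closing indicators met into a star rating."""
    pct = 100 * points_earned / points_possible
    for cutoff, stars in GAP_CLOSING_SCALE:
        if pct >= cutoff:
            return stars
    return 1
```

Under this scale, a school meeting just 45 of 100 possible points—fewer than half of its indicators—still earns four stars, which illustrates why so many schools landed at the top of the distribution despite pandemic-era learning losses.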
Overall, the implementation of the new Gap Closing measure was good but imperfect. Structurally, the component is clearer and simpler, making it easier for the public and policymakers to see which student groups are achieving performance goals and which ones need more help. The balanced emphasis on subgroup achievement and growth is also commendable. State policymakers, however, must increase the rigor of the component by recalibrating the grading scale and raising the PI targets. Doing so will better identify the schools in which all student groups are faring well academically, as well as those that are falling short and require more assistance.
Component 5: Early literacy
Much like Gap Closing, legislators also overhauled the Early Literacy component in the recent reforms. Seeking to create a more comprehensive picture of early literacy, they added two measures to this rating: third-grade reading proficiency rates and grade-promotion rates. These two measures are now incorporated alongside the existing “off-track” progress measure. Schools’ results on these measures are then combined into a weighted average, with greater emphasis given to third-grade reading proficiency, promotion rates, and off-track progress, in that order. When schools are exempt from the off-track progress dimension,[23] the weights are 60 percent proficiency and 40 percent promotion rates.
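As a sketch, the weighted average might be computed as follows. The 60/40 exempt-case weights come directly from the text; the 40/35/25 split for the full three-measure case is an inference, though it is consistent with the worked example in the notes (65 percent proficiency, 95 percent promotion, and 35 percent off-track progress yielding a 68 percent weighted average).

```python
from typing import Optional

def early_literacy_score(proficiency: float, promotion: float,
                         off_track_progress: Optional[float]) -> float:
    """Weighted Early Literacy percentage (weights inferred, not official).

    Proficiency carries the most weight, then promotion, then off-track
    progress. Schools exempt from the progress measure (pass None) use
    the 60/40 split described in the report.
    """
    if off_track_progress is None:
        return 0.60 * proficiency + 0.40 * promotion
    return 0.40 * proficiency + 0.35 * promotion + 0.25 * off_track_progress
```

With the note’s example inputs, this returns 68 percent, which falls in the three-star band of the component’s grading scale.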
Table 9: An overview of the Early Literacy component[24]
Grade promotion was almost universal in 2021–22, as the state waived conventional third-grade reading standards for the year.[25] Just over three in five Ohio third graders passed their reading exams last year, while the off-track reader progress rates were rather low. The sky-high promotion rates offset the poor rates of progress, and the median district and school’s weighted average settled just above the reading-proficiency rate.
Figure 7: Median district and school-level early-literacy rates, 2021–22
Even with the “easy” promotion rate dimension, a slight majority of Ohio schools received one- or two-star ratings last year. High-poverty schools tended to struggle most on Early Literacy, with 63 percent of them receiving one star. As noted above, the disappointing rates of off-track readers’ progress in many Ohio schools account for these ratings—and the grading scale isn’t to blame. In fact, to achieve a three-star rating, a school would need to move just 35 percent of its off-track readers to on-track status, provided it registers 95 percent promotion and 65 percent proficiency rates.[26] That performance expectation isn’t unreasonable, especially given the critical importance of ensuring that all children read fluently by the end of third grade.
Figure 8: Early Literacy ratings, Ohio schools, 2018–19 and 2021–22
Table 10: Distribution of Early Literacy ratings by poverty tier, Ohio schools, 2021–22
Coming soon: Overall ratings and an overhauled postsecondary readiness component
The component ratings—and data that underlie them—all help the public, educators, and policymakers dig into a district or school’s strengths and weaknesses. But for many Ohioans, especially those who aren’t as familiar with education metrics, the Overall rating offers an invaluable “bottom line” of school quality. Although the 2021–22 report card did not feature an Overall rating, it will return next year, having last appeared in 2018–19.
Lawmakers did tweak the methodology for combining the results into a single mark. The new weighting system rightly places more emphasis on the core Achievement and Progress components. By law, they are now given twice the weight of the other components, whereas they had previously received about 1.2 to 1.3 times the weight of the others.[27] Based on their component results, schools will receive a total number of “overall points,” which will then be used to determine the summative rating.[28] Unlike the component ratings, which are given as whole stars, the Overall mark will include half-star ratings.
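The doubled weighting can be sketched as follows. The statute specifies exact point values, so treat this as an illustration of the relative weights rather than the official formula; the half-star rounding mirrors the Overall mark’s format.

```python
# Illustrative sketch of the Overall rating's weighting: Achievement and
# Progress now count twice as much as each remaining component.
WEIGHTS = {
    "achievement": 2.0,
    "progress": 2.0,
    "graduation": 1.0,
    "gap_closing": 1.0,
    "early_literacy": 1.0,
}

def overall_points(stars: dict) -> float:
    """Weighted average of component star ratings, rounded to the nearest
    half star (the Overall mark, unlike components, allows half stars).
    Note: Python's round() uses banker's rounding at exact .5 boundaries,
    which is fine for a sketch but may not match the statute's point table."""
    total_weight = sum(WEIGHTS[c] for c in stars)
    weighted = sum(WEIGHTS[c] * s for c, s in stars.items())
    return round(2 * weighted / total_weight) / 2
```

For example, a district with three stars on Achievement, five on Progress, and two, four, and two on the remaining components would land at 3.5 overall, since its strong Progress rating counts double.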
Table 11: The Overall rating system for school districts and selected school grade spans
Further out, Ohioans can look forward to an overhauled College, Career, Workforce, and Military Readiness (CCWMR) component. This is a revamped version of the component formerly known as Prepared for Success that will include new indicators of college and career readiness, such as how many students enlist in the military after high school or complete an apprenticeship while there. Although Ohio released relevant data in 2021–22—e.g., remediation-free scores on the ACT or SAT, industry credentials, and AP or IB pass rates—it did not use them to generate a component rating. ODE is tasked with reviewing these data and proposing a grading scale for the component; a legislative committee is responsible for approving it as a rated component starting in 2024–25.
Conclusion and recommendations
With legislative reforms now on the books, the 2021–22 school year marked a new day for Ohio’s school report card. This iteration promises fairer and more accurate school ratings and—with any luck—less political contention about the system. Through the five-star rating system, the updated framework continues to offer Ohio parents and the public a clear picture of schools’ academic quality. For policymakers and community leaders, the report card offers a trove of data and multiple ratings that allow them to track trends and gauge the strengths and weaknesses of local schools.
Overall, the new report card is a big step in the right direction for Ohio. Hence, our first recommendation to state lawmakers is this: Please stay the course. On too many occasions, Ohio has adopted promising education policies only to backpedal in the early years of implementation. We hope this doesn’t happen here. To their credit, Ohio lawmakers, state board members, and Department of Education leadership have worked together to design and implement a high-quality report card, perhaps one of the finest in the nation. It would be a grave disservice to Ohioans if these efforts were weakened or undone by future policymakers.
Second, Ohio should follow through and complete the full implementation of the College, Career, Workforce, and Military Readiness component by making it a rated element starting in 2024–25. With significant changes to this dimension, the legislative reforms smartly call for a transitional period in which postsecondary readiness data are reported but not used to produce ratings. In order for a rating to appear in fall 2025, statute requires the Joint Committee on Agency Rule Review to approve ODE’s system for implementing this rating. When up for review, we urge approval. The component rating will offer the public a clear sense of how districts and high schools fulfill one of their core academic missions—to ready young people for their next step in life, whether college or career. Approval will also ensure that postsecondary readiness contributes to the Overall rating of a district or high school.
Third, the state board of education and ODE need to increase the rigor of the Gap Closing component. This element plays an integral role in the report card, as it ensures that schools pay attention to the academic needs of all student groups. Although the reforms establish a stronger Gap Closing framework, its implementation was imperfect. To strengthen the component, the state board of education should raise the performance standards (i.e., “cut scores”) that schools must achieve to earn top Gap Closing ratings. In addition, ODE should reevaluate its low subgroup PI targets. With these tweaks, a more honest picture of how effectively schools are serving all students should begin to emerge.
For two decades, the report card has shed light on how Ohio’s 1.6 million students fare academically and how effectively the state’s roughly 600 districts and 3,300 public schools move the achievement needle. That sunlight is still needed today, as achievement gaps persist (and have even widened in the post-pandemic era) and thousands of students still struggle to exit high school with the academic skills necessary for college and career. Fortunately, Ohio’s policymakers have recognized the critical role of the report card and worked hard to strengthen it. Their efforts have created a better-functioning report card that now offers parents and the public a clearer look at academic performance across Ohio. That’s something we can all cheer about.
Acknowledgments
I wish to thank my Fordham Institute colleagues Michael J. Petrilli, Chester E. Finn, Jr., Chad L. Aldis, and Jessica Poiner for their thoughtful feedback during the drafting process. Jeff Murray provided expert assistance in report production and dissemination. Special thanks to Pamela Tatz, who copyedited the manuscript, and Andy Kittles, who created the design. All errors, however, are my own.
[4] To be in the Advanced Plus category, students must be on a formal acceleration plan, take an above-grade-level assessment, and achieve a score of advanced. Just 0.9 percent of Ohio students achieved this level in 2021–22.
[5] The district top 2 percent average is used for district ratings; the school-level average is used for individual school ratings.
[6] In 2018–19, the statewide average index score was 84.7; in 2021–22, it was 79.3. The “grading scale” for the PI did not change during this period and doesn’t explain the bump in ratings.
[8] Under the former system, the cut points for value-added ratings were index scores of A = +2.0 or above; B = +1.0 to +2.0; C = -1.0 to +1.0; D = -1.0 to -2.0; and F = -2.0 or below.
[9] Districts must achieve an effect size of +0.1 or above to receive five stars and -0.1 or below to receive one star.
[11] Lower graduation rates are also a possible explanation, though the statewide four-year graduation rate was higher in 2021–22 than in 2018–19 (87 versus 85 percent), as were five-year rates (89 versus 86 percent).
[12] The previous report-card system relied on a “points” system to combine the four- and five-year graduation rates into a composite score to determine the component rating, so I calculate the weighted four- and five-year average graduation rates that are equivalent to the new system. The “cut points” for the old system are available at ODE, Appendix B: ESSA Sections A.1-A.4.
[13] The old system employed a “points” type system that awarded full credit if a particular subgroup achieved PI and graduation rate targets and then, through a complex calculation, provided “partial credit” if a subgroup missed the target but demonstrated year-to-year improvement on these measures. A full description of the former Gap Closing component is available at Ohio Department of Education, 2019–2020 AMO Gap Closing Measure (Columbus, OH: Ohio Department of Education, 2020).
[14] In the old system, a school that failed to achieve a PI goal could receive full credit if that subgroup had a value-added index score of +1.0 or above. The problem with this type of “either-or” system is that it allows poor student achievement or growth to be ignored.
[15] ODE significantly adjusted downward the PI goals in response to pandemic learning loss, and those goals are found in its amended ESSA plan. The goals, however, will gradually increase each year. The value-added goals are tied to the Progress rating system and do not change from year to year. Detailed information about the current Gap Closing component is available at Ohio Department of Education, 2021–2022 School Year: Gap Closing Component—Technical Documentation (Columbus, OH: Ohio Department of Education, 2022).
[16] No partial points are awarded, including for the special elements that are worth five points.
[17] The value-added effect size is not applied in the subgroup growth system.
[18] The chronic absenteeism element was not included in the 2021–22 Gap Closing calculations but will be added starting in 2022–23. It will be worth five points. Schools can meet the chronic-absenteeism indicator by either achieving a rate below the state’s target for the year or posting lower rates compared to the year prior.
[19] The English learner: Alt. assessment element is a federally required measure that looks at the percentage of English learners who make progress on an alternative literacy assessment.
[20] The Gap Closing grading scale was overhauled in the new report card. The scale is as follows: 5 stars = 60–100 percent; 4 stars = 45–60 percent; 3 stars = 30–45 percent; 2 stars = 10–30 percent; and 1 star = 0–10 percent.
[22] Only one school had enough Native American or Native Alaskan students to have a PI score reported for this subgroup.
[23] Under state law, if fewer than 10 percent of a school’s kindergarten students are deemed off-track, this element does not apply.
[24] Ohio law exempts certain English learners and special-education students from the Third Grade Reading Guarantee’s grade promotion standards. In 2018–19, the most recent year the promotional requirements were in effect, 6.2 percent of third graders were exempt. For more about the Guarantee and its promotional standards, see Ohio Department of Education, Guidance Manual on the Third Grade Reading Guarantee: School Year 2019–2020 (Columbus, OH: Ohio Department of Education, 2020).
[25] Third-grade reading promotional standards go back into effect in 2022–23.
[26] The weighted average in this example is 68 percent. The Early Literacy grading scale is as follows: 5 stars = 88–100 percent; 4 stars = 78–88 percent; 3 stars = 68–78 percent; 2 stars = 58–68 percent; and 1 star = 0–58 percent.
[27] For the weights used in the old Overall rating system, see Aaron Churchill, “A new day for Ohio’s school report cards,” Ohio Gadfly Daily, Thomas B. Fordham Institute, July 1, 2021.