Seven ways state leaders can rigorously implement the science of reading in Ohio
With the ink dry on a historic state budget, attention now turns to implementing various components of the legislation. Not every provision requires a substantial implementation effort, but carrying out Ohio’s ambitious literacy reforms will be a heavy lift.
To recap, the legislation requires Ohio districts and charter schools to adopt curricula that align with the science of reading starting in fall 2024. Backed by a large body of research, this approach emphasizes explicit and systematic phonics instruction, as well as knowledge-rich curricula that build vocabulary and comprehension. To support the transition, lawmakers set aside some $170 million to replace outdated curricula and provide professional development for teachers. The overarching goal is to ensure that all children are taught to read via proven methods by well-trained teachers, ultimately leading to stronger reading proficiency statewide.
Under the new policy, schools that have previously embraced popular but debunked approaches such as “three-cueing” or “balanced literacy” will need to change course. Doing so is crucial, but it could also invite pushback from those wedded to the status quo. Some schools may openly defy state requirements. More likely, however, resisters will seek to undermine state policy in subtler ways, such as claiming to follow scientifically based instruction but continuing to use disproven methods behind closed doors.
If state leaders aren’t attentive and hard-nosed about implementation, Ohio’s promising literacy efforts could turn into mush. How can they ensure rigorous implementation? Let’s take a look at seven ways.
1. Ensure a complete and thorough system-wide survey of curriculum materials. While weak curricula are assuredly in use across Ohio, there are no systematic data on how many schools use them. This leaves us uncertain about the scale of the implementation effort. It’s going to be a much heavier lift if three-quarters of Ohio schools are using disproven methods than if only half of schools are doing so. Fortunately, the budget bill requires the Ohio Department of Education (ODE) to field a reading curricula survey, with districts and charters required to respond. Recognizing the urgency of collecting these data, ODE, to its credit, sent out the survey this week. The agency should now make certain that schools respond promptly and completely. On the latter count, ODE should ensure that school-level information is obtained, as curricula may differ across schools within a larger district. And, as recommended in this piece, it should also insist on specificity, making sure to collect not only the title of the material, but also the publisher and year, as older editions may be of differing quality.
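To illustrate the level of specificity worth collecting, here is a minimal sketch of what a school-level survey record might capture. The field names are hypothetical (they are not ODE’s actual survey schema), but they show why publisher and edition year belong alongside the title:

```python
from dataclasses import dataclass

@dataclass
class CurriculumSurveyRecord:
    """One row per school, per material reported (hypothetical schema)."""
    district_id: str       # state-assigned district identifier
    school_id: str         # school-level ID, since curricula vary within districts
    material_title: str    # e.g., "Core Knowledge Language Arts"
    publisher: str         # quality can vary across publishers' offerings
    edition_year: int      # older editions may predate science-of-reading revisions
    grades_served: str     # e.g., "K-3"
    material_type: str     # "core" or "intervention"
```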
2. Keep the bad stuff off the high-quality instructional materials lists. The budget bill tasks ODE with creating two lists of high-quality instructional materials: one for core curricula and the other for intervention programs. All districts and charter schools must select curricula and programs from these state-approved lists (with one exception, discussed in #5 below). Curating carefully vetted lists of materials is a crucial implementation step, as the whole effort could be undermined if state officials—perhaps under lobbying pressure from publishers—include low-quality programs such as Fountas & Pinnell’s Classroom, Lucy Calkins’ Units of Study, or Reading Recovery. Timeliness is also key, as schools need to know this year which materials have the green light. The good news is that EdReports, a well-regarded national organization, has conducted detailed evaluations of reading curricula that Ohio policymakers can rely on. States such as Colorado, Louisiana, and Massachusetts have also developed solid lists of high-quality materials.
3. Smartly allocate instructional materials funding. Lawmakers set aside $64 million to subsidize schools’ purchase of high-quality materials. These funds are critical, but the bill doesn’t provide any direction to ODE about how to allot them. Moreover, while the overall set-aside is significant, it may not cover curricula upgrades in all Ohio schools. What this means is that ODE will likely need to develop an allocation method that prioritizes funds. At the front of the line should be the districts and charters that absolutely need to change curricula because their current ones do not make the state-approved lists. If there is a sufficiently large number of schools that must change curricula, ODE may need to further prioritize dollars, perhaps by providing subsidies to higher-poverty schools first. (In this event, it should also request additional funding from the legislature.)[1] One final issue that ODE may need to iron out is whether to provide a per-pupil subsidy up front or reimburse districts after purchase. A per-pupil amount might be more sensible, as reimbursement could end up incentivizing unduly expensive purchases.
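To make the prioritization concrete, here is a minimal sketch of a “fill until the money runs out” allocation, assuming a fixed per-pupil subsidy and poverty-rate ordering. The function, field names, and dollar figures are all illustrative, not anything specified in the bill:

```python
def allocate_subsidies(schools, total_funds, per_pupil_subsidy=60.0):
    """Allocate a fixed per-pupil subsidy, prioritizing schools that must
    replace a non-approved curriculum, then higher-poverty schools first.
    All parameters here are illustrative, not figures from the bill.

    schools: list of dicts with keys
      'name', 'enrollment', 'poverty_rate', 'must_replace' (bool)
    """
    # Schools forced to change curricula go first; within each group,
    # higher-poverty schools are served before lower-poverty ones.
    ordered = sorted(schools,
                     key=lambda s: (not s['must_replace'], -s['poverty_rate']))
    remaining = total_funds
    awards = {}
    for s in ordered:
        award = min(s['enrollment'] * per_pupil_subsidy, remaining)
        if award <= 0:
            break
        awards[s['name']] = award
        remaining -= award
    return awards
```

A fixed per-pupil amount like this also sidesteps the reimbursement problem noted above, since the subsidy doesn’t grow with the price of the purchase.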
4. Bolster teacher professional development (PD) requirements. Retraining teachers who are accustomed to using debunked teaching methods is essential to the science of reading effort, as they’ll be the ones shifting to a different approach and using the new materials. To build the knowledge and skills of Ohio’s teaching force, the budget bill sets aside $86 million over the biennium to pay teacher stipends for completing PD.[2] Implementation details are left to ODE, so the agency will need to sort out a number of operational issues.
5. Scrutinize waiver requests. While the legislation requires schools to use state-approved curricula and includes an explicit proscription on “three-cueing,” it also includes a loophole that could allow a school to use three-cueing in two circumstances: (a) if it receives a waiver from ODE to use it for a particular student[3] or (b) if a student’s IEP calls for the use of this method. To guard against abuse, ODE should carefully review waiver requests and likely reject most, if not all, of them. If it doesn’t, it runs the risk of becoming a rubber stamp that allows schools to circumvent the state’s science of reading requirements. ODE should also publicly report the number of waiver requests from each district and school, as well as how many were approved. Such sunlight would provide another safeguard against abuse.
6. Publicly report schools’ reading curricula on an ongoing basis. Beyond the survey mentioned above, Ohio’s new literacy laws require districts and charter schools to report core reading curricula and intervention programs on an ongoing, annual basis to ODE. While the legislation doesn’t explicitly require public reporting after data are sent to ODE, the agency can and should make them public à la Colorado’s curriculum transparency dashboard. This tool would provide communities a check on whether their local schools are following state law, and it would clearly flag any obvious cases of non-compliance. It may also allow communities and parents to advocate for changes if their schools are using programs that, while state-approved, are not up to their exacting standards. Lastly, public reporting could allow for analyses that link schools’ reading performance to their curricula selections, potentially shedding light on which are associated with the strongest learning gains.
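As a rough sketch of the kind of analysis that public reporting would enable, the snippet below joins a hypothetical curriculum report file to school proficiency rates and compares average one-year reading gains by curriculum. The file names and columns are invented for illustration:

```python
import pandas as pd

# Hypothetical files: one row per school in each.
curricula = pd.read_csv("curricula_reports.csv")   # school_id, core_curriculum
scores = pd.read_csv("reading_scores.csv")         # school_id, pct_proficient_2024, pct_proficient_2025

merged = curricula.merge(scores, on="school_id")
merged["gain"] = merged["pct_proficient_2025"] - merged["pct_proficient_2024"]

# Average one-year proficiency gain by curriculum -- a descriptive starting
# point only; a real analysis would control for demographics and prior scores.
print(merged.groupby("core_curriculum")["gain"].agg(["mean", "count"]))
```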
7. Strictly enforce state literacy requirements. State officials shouldn’t turn a blind eye if schools are ignoring state law. As agencies in other states have done, ODE should step in and take corrective action if a district or school is using disapproved programs. The agency may also need to periodically conduct curricula reviews of schools (perhaps randomly selecting a small percentage each year) to verify implementation of high-quality curricula. All students deserve to learn via effective reading methods, and ODE should honor the intent of the literacy law through strong enforcement.
* * *
Led by Governor DeWine, the state budget bill greatly improves Ohio’s early literacy laws by requiring schools to follow the science of reading. For the effort to succeed and benefit Ohio students, implementation will be key. As this piece indicates, there’s a lot on the plate of state officials. But if Ohio can get the details right and rigorously implement its new policies, the payoff will be great: a better educated, more literate next generation of Ohioans.
[1] It’s also possible that fewer schools than expected will need to change curricula, allowing the $64 million set-aside to go further. If that’s the case, the state might consider creating another tier of instructional materials—“exceptionally high-quality”—and subsidizing schools that seek to upgrade from a state-approved curriculum to a top-tier one.
[2] All teachers in grades K–5; English teachers in grades 6–12; and intervention specialists, English learner teachers, reading specialists, and instructional coaches in grades pre-K–12 are eligible for $1,200 stipends upon course completion. Grades 6–12 teachers in other subjects (e.g., math or science) are eligible for $400 stipends. Though required to take PD, administrators are not eligible for a stipend.
[3] Provided he or she is not on a state-required reading improvement and monitoring plan.
The recently completed state budget includes historic education provisions that could have a tremendous impact on students and families. But throughout the budget process, plenty of other big-impact proposals ended up on the cutting-room floor. Some of these policy ideas deserve a second look from lawmakers, including a Direct Admissions (DA) pilot program proposed by Governor DeWine.
First, some background. Direct admissions is a process whereby colleges and universities reach out to students to make admissions offers (and sometimes financial aid offers, too) before students have formally applied for enrollment. The exact process varies by state and program, but for the most part, higher education institutions make these offers based on academic information provided by a state agency, a high school, or even students themselves via a third-party organization (student-provided data are vetted by a school counselor). Common App, for example, has piloted a direct admissions program since 2019, offering non-binding guaranteed admission to qualified students.
Proponents of DA programs argue that the traditional college application process is confusing for students and their families, and that as a result, it exacerbates inequality. DA programs cut down on this confusion by reaching out directly to students with a bona fide admissions offer. This can be especially powerful for low-income and first-generation students, who might otherwise be unaware of how many options they have, or incorrectly assume that they’re unprepared for college.
Despite what some critics contend, DA programs aren’t meant to pressure kids into higher education. Students are under no obligation to attend any of the colleges from which they receive direct admissions offers, because the aim isn’t college-for-all. It’s about putting clear and concise information directly into the hands of students and parents and empowering them to make their own choices. The goal of any well-designed DA program is to make sure that all students—not just those who are affluent or savvy enough to navigate the college admissions process—are aware of all of their options. That means going beyond traditional four-year universities and making sure students are also aware of community colleges and career-technical centers that offer courses and credentials in specific career fields.
Governor DeWine’s DA proposal checked most of those boxes. It would have charged the Chancellor of Higher Education with establishing a process to use student academic records—grade point averages, high school and college transcript information, standardized assessment scores, scores on end of course exams, and any other measure of post-secondary readiness deemed appropriate by the chancellor—to determine and then notify high school seniors if they met the admissions requirements at participating higher education institutions. The chancellor, “to the extent practicable,” would use existing information systems to automate the process and minimize the need for students to provide additional data. And while the legislative language didn’t outright say that participating post-secondary institutions would be extending admissions offers to students who met their criteria, it seems likely that was the goal, given that the program was explicitly identified as a direct admissions pilot.
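To make the mechanics concrete, here is a minimal sketch of the kind of threshold matching the chancellor’s office might automate. The criteria, data fields, and institutions are invented for illustration, not drawn from the proposal:

```python
def direct_admissions_matches(student, institutions):
    """Return institutions whose (hypothetical) criteria a senior meets.

    student: dict with 'gpa' and 'act' keys
    institutions: list of dicts with 'name', 'min_gpa', 'min_act'
    """
    return [inst["name"] for inst in institutions
            if student["gpa"] >= inst["min_gpa"]
            and student["act"] >= inst["min_act"]]

# Example: a 3.2 GPA / 24 ACT senior matched against two illustrative schools.
matches = direct_admissions_matches(
    {"gpa": 3.2, "act": 24},
    [{"name": "State U", "min_gpa": 3.0, "min_act": 21},
     {"name": "Tech Center", "min_gpa": 2.0, "min_act": 0}],
)
print(matches)  # ['State U', 'Tech Center']
```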
The pilot would have been voluntary and the chancellor permitted to establish eligibility requirements, but participation would have been open to the vast majority of Ohio’s education institutions. At the secondary level, traditional K–12 districts; joint vocational school districts; and private, charter, and STEM schools were all included. And at the post-secondary level, the pilot was intended to be open to state institutions of higher education, authorized private nonprofit institutions, and Ohio Technical Centers (independently operated career-technical centers that offer adult learners training and credentials for in-demand jobs). Such broad eligibility determinations meant that not only would the vast majority of Ohio students have the opportunity to participate, but students could have been notified about a wide variety of post-secondary institutions, not just the familiar four-year ones.
Critically, the Chancellor of Higher Education would have been required to issue an annual report outlining student participation and the impact of the program on post-secondary outcomes for traditionally underserved student populations. This would have given state leaders and education researchers a chance to track progress over time, evaluate student outcomes, and improve the program’s effectiveness. Analyzing such information is crucial because DA programs are only effective if they improve student enrollment and outcomes. If more students—especially those from traditionally underserved backgrounds—aren’t persisting through college; earning certificates, credentials, and degrees; and landing well-paying jobs that offset the cost of higher education, then DA programs aren’t fulfilling their promise.
It’s unclear why DeWine’s DA pilot didn’t survive the budget process. One possibility is that, like the majority of Ohioans, lawmakers were unaware of what DA programs are designed to do and why. That’s understandable, given that Ohio doesn’t have an existing similar program. But there’s plenty of precedent elsewhere. In fact, DeWine’s proposal seems to be modeled after a very similar pilot program in another midwestern state.
Direct Admissions Minnesota is a state-funded pilot program coordinated by the Office of Higher Education. It was first launched during the 2022–23 school year, and participants included forty high schools, 7,000 high school seniors, and over fifty higher education institutions. As part of the program, seniors at participating high schools who are on track to graduate receive personalized communication that lists every higher education institution in the state to which the student has received conditional or guaranteed admittance based on their academic record. To officially confirm their admittance to any of these schools, students are required to submit a (free) admissions application.
Minnesota isn’t the only proof point. In 2015, Idaho adopted the nation’s first DA system. The program admits all of the state’s public high school seniors to a minimum of six in-state colleges and universities each year. As in Minnesota, students are notified by the state in the fall and must submit an application to confirm their spot. But unlike Minnesota’s, Idaho’s program has been around long enough to give researchers a chance to examine some initial outcomes.
In a paper published last year, researchers found evidence that direct admissions increased first-time undergraduate enrollment by 4 to 8 percent, and in-state student enrollment by approximately 8 to 15 percent. Despite these increases, the researchers indicate that direct admissions should be part of the answer for states rather than the entire answer. DA programs have the most potential when they’re accompanied by additional supports for students, like increased financial aid, application fee waivers, and nudges. To harness their potential, policymakers and institutional leaders should “consider pairing a direct admissions system with complementary supports to help students overcome other barriers to college access.”
That seems like good advice for Ohio. Governor DeWine’s direct admissions program wouldn’t be a silver bullet. But it could expose Ohio high schoolers to post-secondary options they weren’t previously aware of, including career pathways programs at community colleges and Ohio Technical Centers. It could empower students to make fully informed choices about their future, rather than half-informed ones colored by a lack of knowledge about higher education and the admissions process. Most importantly, if it’s paired with efforts to improve the quality of K–12 education and workforce development initiatives, it could help diminish the access and attainment gaps that exist between low-income and first-generation students and their more affluent and well-connected peers. That’s definitely something worth picking up from the cutting-room floor.
Between expanded voucher eligibility, funding increases for charter schools, and improved transportation guidelines, the recently finalized state budget was a boon for school choice. But tucked deep into the legislation is another, less heralded change that will help ensure that students who exercise school choice receive the supports they need to be successful.
The provision focuses on school records, which the legislation defines as “any academic records, student assessment data, or other information for which there is a legitimate educational interest.” In lay terms, that means class transcripts, state test results, and important documentation, such as individualized education programs (IEPs) and 504 plans. This information is vital for schools to effectively serve students. A high school guidance counselor can’t enroll students in the classes they need to meet graduation requirements or college admissions standards if they don’t know which courses students have already taken. Elementary and middle schools need access to state test results to ensure that struggling students receive intervention and to help teachers plan accordingly. And teachers in all grade levels and subjects need IEPs and 504 plans to provide students with the accommodations they’re entitled to under federal law.
For most students in Ohio, these data are readily available and easily accessible. When students progress to a new grade level or move on to middle and high school within the same district or charter network, their records follow. But for families that switch between school types, it’s more complicated. Students who transfer to a charter or private school, for example, don’t show up on registration day with records in hand. Administrators have to request them from the school that the student previously attended. Districts face a similar issue when students transfer from a charter school, private school, or a traditional public school in another district.
In theory, it should be a pretty simple exchange. The new school requests the records, and the previous school sends them as soon as possible. In practice, though, that’s not always what happens. Schools sometimes find themselves waiting weeks—even months—just to get copies of the records they request. Even worse, there’s nothing they can do to speed up the process. Administrators just have to keep calling and emailing and making requests, and hope that, at some point, the previous school will finally send over the records.
The recent budget tackles this issue by putting a deadline on student record requests. Going forward, all schools will be required to send a transfer student’s records within five school days of receiving a request. On the one hand, this is an example of policy working as it should. The failure of some schools to transfer records in a timely manner prevented others from serving their students well. Policymakers smartly recognized this problem and created a legislative solution to fix it. On the other hand, it’s also an example of the limits of policy. Districts and schools might now be required to transfer records within five days, but that doesn’t mean they actually will. Because the provision has no teeth—that is, no enforcement mechanism that holds schools accountable for following the law—it’s likely that inertia and general busyness will lead them to drag their heels.
In an ideal world, the state would have a secure and centralized database of student records that could be easily accessed by schools of all types. Administrators wouldn’t have to worry about making or responding to requests because they’d have automatic access to the records of their enrolled students. Given understandable data privacy concerns, that future seems a long way off. In the meantime, the deadline provision included in the budget could help. It hardly seems like too much to ask for schools to transfer records in a timely fashion. But to ensure that they actually do, lawmakers may need to add some accountability. One possibility is to mimic what they’ve already done with student transportation: Strictly enforce the law using warnings and consequences so that districts, charters, and private schools have to follow the rules and students aren’t shortchanged. Each violation, for example, could result in a $1,000 fine—not because the state is looking to cash in, but because there needs to be a sufficient incentive for the provision to work as intended. Providing schools with recourse to report when they aren’t getting records in time would also be a smart move.
Ohio’s school choice policies were designed to open educational doors for students and families. But to ensure these policies are effective, legislators have to get the details right. The transfer of school records is one of those details. Lawmakers took a good first step by adding a deadline provision to the budget. Now, it’s time to closely monitor implementation and, if necessary, add some consequences.
When classes moved abruptly online at Iowa State University in March 2020 as part of Covid-mitigation actions statewide, psychology professor Jason Chan expected big changes in student behavior. Specifically, he worried about his students being easily able to cheat on unproctored online exams. But he saw little evidence of that, with his students producing a fairly typical distribution of scores, comparable to the proctored, in-person exams he gave earlier the same semester. Intrigued, he and associate Dahwi Ahna undertook a deeper analysis of university-wide test scores to see whether his experience was typical.
Chan and Ahna obtained data from eighteen courses offered during the spring 2020 semester—everything from large introductory lecture classes to smaller, specialized major courses for upperclassmen. They calculated an average score on the in-person exams (i.e., the first half of the semester) and an average score on the online exams (i.e., the second half of the semester) for more than 2,000 students. They then computed the correlation between the two halves using a meta-analytic approach that treated each course as an individual study.
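For readers curious about the mechanics, the snippet below shows one standard way to pool per-course correlations: Fisher z-transform each course’s correlation, then take an inverse-variance weighted average. This is a generic illustration of the meta-analytic idea, not necessarily the authors’ exact model:

```python
import numpy as np

def fisher_z(r):
    """Fisher z-transform of a correlation coefficient."""
    return np.arctanh(r)

def pooled_correlation(course_results):
    """Combine per-course correlations meta-analytically.

    course_results: list of (r, n) tuples, one per course, where r is the
    correlation between students' in-person and online exam averages and
    n is the number of students in the course.

    Uses inverse-variance weights (n - 3) on Fisher-z values -- a standard
    fixed-effect combination, sketched here for illustration only.
    """
    zs = np.array([fisher_z(r) for r, n in course_results])
    weights = np.array([n - 3 for _, n in course_results])
    pooled_z = np.average(zs, weights=weights)
    return np.tanh(pooled_z)  # back-transform to a correlation

# Illustrative numbers, not the study's data:
print(pooled_correlation([(0.62, 250), (0.55, 120), (0.70, 85)]))
```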
Across the board, scores from the unproctored online exams closely correlated with those from the traditional in-person exams. A positive correlation was observed for every course, and correlations did not vary significantly by the types of questions asked on the exams, field of study, course level, exam duration, or enrollment. They ran a standardized effect-size analysis to confirm that score inflation via relaxed grading for Covid-impacted online exams was not driving the correlation.
The researchers also ran the same first-half/second-half comparison for a set of courses that were taught fully in person in the spring semesters of 2018, 2019, and 2021. Overall, these courses showed a stronger correlation between first-half and second-half exam scores than the split in-person/online courses of 2020. When the comparison was restricted to courses taught by the same instructors in both sets of semesters, the difference became smaller and was no longer statistically significant. However, the sample comprised just nine courses, which could have affected the outcome of the analysis.
The data indicate that cheating was likely uncommon when students had to pivot and take exams online in that first Covid-disrupted semester, though it’s possible that cheating was actually widespread but simply not effective at boosting student outcomes beyond typical norms. The researchers equate the situation to the difference between a closed-book and an open-book proctored test: Students who have missed lectures, paid little attention in class, or have a weak understanding of the material despite attending will likely not do well on a typical exam even if they have a textbook, notes, or even the entire internet at their fingertips. The converse is typically true as well: Students with good attendance and a decent understanding of the material will generally fare well on an exam, with or without notes and other materials before them.
This feels like good news as far as it goes. Whatever effort these college students were putting forth prior to the pandemic disruption appeared to continue immediately after their entire educational experience turned upside down. However, this does not mean that extended use of online exams—and indeed virtual teaching and learning writ large—will produce the same behavior once it is the everyday norm. Authors Chan and Ahna conclude from their research that online exams can provide an assessment of learning just as valid and reliable as in-person exams, but that seems like too large a leap given the unique circumstances of spring 2020. Professors and institutions sticking with online exams in the absence of force majeure will need to leverage the technology to hinder cheating for the long haul. After all, exams have traditionally been proctored for a reason.
SOURCE: Jason C. K. Chan and Dahwi Ahna, “Unproctored online exams provide meaningful assessment of student learning,” Proceedings of the National Academy of Sciences (July 2023).
Accurate property assessments are a basic requirement for many school funding systems to function properly. Unfortunately, data suggest that a variety of problems are introducing inaccuracies, exacerbating inequalities, and causing negative chain reactions in taxing jurisdictions all around the land. A trio of economics researchers conducted a deep dive into one state’s efforts to reform its property assessment infrastructure, and their report was recently published by the Board of Governors of the Federal Reserve System.
In 1989, a series of newspaper articles in the Lexington Herald-Leader exposed rampant problems in the property tax system that funds Kentucky’s K–12 education, including numerous instances of tax evasion, inequitable treatment of property owners, and ineffectual safeguards around tax assessment protocols. Journalists and state investigators identified the most pressing problem as the underassessment of properties—specifically, high-end homes—brought on by a mix of mistakes and corruption. In response, lawmakers passed the Kentucky Education Reform Act (KERA) in 1990. Under that law, the state’s Department of Revenue was given expanded power over locally elected property assessors, and a technical assistance intervention was imposed on the worst offenders in an effort to improve their practice. Additionally, KERA introduced a means by which to double-check the accuracy of assessments.
To evaluate Kentucky’s efforts, researchers from the Federal Reserve examined administrative data from all 120 counties in the state. The state had deemed ninety-three counties in need of intervention: twenty-five were emergency reassessment (ER) counties, and sixty-eight were technical assistance (TA) counties. The remaining twenty-seven counties weren’t given any direct intervention under KERA and serve as a control for certain analyses. Pre-treatment data cover the years 1982 through 1989, and post-treatment data cover 1990 through 1998. The researchers use a difference-in-differences approach to conduct their analyses.
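For readers unfamiliar with the method, the canonical two-group difference-in-differences specification looks like this (a generic illustration; the paper’s model may include additional controls):

```latex
% Y_{ct}: per-pupil assessed property value in county c, year t (illustrative outcome)
% Treated_c = 1 for ER or TA counties; Post_t = 1 for 1990 and later
Y_{ct} = \alpha + \gamma\,\mathrm{Treated}_c + \delta\,\mathrm{Post}_t
       + \beta\,(\mathrm{Treated}_c \times \mathrm{Post}_t) + \varepsilon_{ct}
```

The coefficient of interest is beta: the post-1990 change in treated counties over and above the contemporaneous change in untreated counties.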
Overall, they found that the treatment (training, oversight, double-checking) substantially increased aggregate assessed real property values. By the end of the active intervention in 1994, per-pupil property values in ER counties had increased by 32 percent relative to untreated counties, compared to 1989 levels. For TA counties, the increase was a more modest 11 percent, as expected given the lower level of intervention needed. By 1996, districts in ER counties were receiving approximately $100 more per pupil from local property tax revenue. (The researchers looked at commercial and farm property, as well as residential, but the report focuses mainly on residential, as does this review.)
They also found that the interventions substantially reduced assessment inequity. Using a metric called the coefficient of dispersion (COD), state overseers measured the average deviation of assessment-to-sales ratios from the median ratio. When properties are equitably assessed, assessment-to-sales ratios within a jurisdiction should be similar across properties, regardless of size or sales price. A low COD indicates equitable assessment across properties (i.e., an accurate and properly functioning system), while a high COD indicates flawed assessment (though the number alone can’t isolate a cause). The intervention reduced the COD in ER counties by an average of 15 points, indicating that the system had become more accurate. A similar, though less pronounced, pattern of COD reduction occurred in TA counties.
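The COD itself is simple to compute from sales data. The sketch below follows the standard ratio-study formula (average absolute deviation of assessment-to-sales ratios from the median ratio, expressed as a percentage of that median); the input numbers are invented:

```python
import statistics

def coefficient_of_dispersion(assessed_values, sale_prices):
    """Compute the COD for a set of properties that sold.

    Standard ratio-study formula: the average absolute deviation of
    assessment-to-sale ratios from the median ratio, as a percent of
    the median. Lower values indicate more uniform, equitable assessment.
    """
    ratios = [a / p for a, p in zip(assessed_values, sale_prices)]
    median = statistics.median(ratios)
    avg_abs_dev = sum(abs(r - median) for r in ratios) / len(ratios)
    return 100 * avg_abs_dev / median

# Illustrative example: a 15-point drop like the ER counties' average
# would signal markedly more uniform assessments.
print(coefficient_of_dispersion([90_000, 150_000, 60_000],
                                [100_000, 200_000, 80_000]))
```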
The report describes in detail how these results indicate an increase in assessment accuracy and equity, rather than, say, an increase in home prices within treatment counties over the same period. Data indicate that internal factors (such as understaffed assessors’ offices and corruption) likely drove the underassessment problem, rather than external factors such as changes in population density and local economic conditions. Hence, the researchers deem the cure highly effective in treating the disease: The more inequity that existed before the intervention, the greater the corrective effect the intervention produced.
Finally, the researchers use their estimates of the treatment effects along with the state aid formula to simulate the impacts of the intervention on state aid to the treated districts. Predictably, increasing local revenue to schools decreases the need for state dollars to level up district funding, although changes to the overall funding formula, which were also part of KERA, limit that impact slightly. Still, millions in state funding that should rightly have gone to districts playing by the property assessment rules ended up in ER and TA districts in each pre-treatment year, a situation significantly remediated post-treatment.
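The mechanics behind that substitution are easy to see in a stylized foundation-aid formula, where the state tops up whatever a uniform local levy fails to raise. This is a generic textbook formula with invented numbers, not Kentucky’s actual post-KERA formula:

```python
def state_aid(per_pupil_foundation, pupils, assessed_value, local_share_rate):
    """Stylized foundation-formula aid: the state tops up whatever a
    uniform local levy on assessed property fails to raise.

    Generic illustration only, not Kentucky's actual aid formula.
    """
    local_contribution = local_share_rate * assessed_value
    return max(0.0, per_pupil_foundation * pupils - local_contribution)

# Underassessment inflates state aid: the same district with property
# assessed at 60 percent of true value draws more state dollars.
true_value, pupils = 500_000_000, 5_000
print(state_aid(8_000, pupils, true_value, 0.003))        # accurate: 38,500,000
print(state_aid(8_000, pupils, 0.6 * true_value, 0.003))  # underassessed: 39,100,000
```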
What does all this mean? The Fed’s research team suggests that COD double-checking could be applied to any property tax assessment system to ensure its accuracy. This recommendation applies not only to schools, but also to public transit, police and fire service, and libraries. Any system funded by property taxes, and run by public servants who may toil in understaffed offices or who could succumb to corruptive elements, should take advantage of this simple—and proven—double-check protocol.
SOURCE: Alex Combs, John Foster, and Erin Troland, “The Role of Property Assessment Oversight in School Finance Inequality,” Board of Governors of the Federal Reserve System (July 2023).