The Cupp-Patterson funding plan gives school choice the cold shoulder
Note: This is the fifth in a series of blog posts on school funding in Ohio; for the previous posts, see here, here, here, and here.
This week, the Ohio House Finance Committee begins hearings on the high-profile school funding plan crafted by Representatives Robert Cupp, John Patterson, and an array of district superintendents and treasurers. As a refresher, the fully implemented proposal would:
If these issues weren’t enough to give reform-minded legislators pause, the Cupp-Patterson plan’s icy approach to school choice might tip them over the edge. It should dishearten advocates of school choice, whether for public charter, independent STEM, or private schools, who have long sought fairer, more equitable funding. And not only advocates. Ohioans more broadly should be worried about a plan that shows too little concern for the needs of 250,000 students—five times the size of the Columbus school district—who exercise these options. Consider the following three problems.
First, the plan fails to eliminate funding disparities between district and charter schools
Ohio has shortchanged charters since the first of these independently run public schools opened two decades ago. My recent analysis, released earlier this year, indicates that urban charters receive approximately $4,000 less per pupil in overall funding (state, local, and federal) than nearby districts serving children of similar demographics. Although both Cupp and Patterson have acknowledged this severe funding gap, their plan does not rectify it. According to the latest estimates, released in July by the Cupp-Patterson workgroup, charter schools in the Ohio Big Eight—the cities where most charters are located[1]—are expected to receive an average bump of $758 per pupil by FY 2021 (relative to FY 2019).[2] At first blush, this doesn’t sound like a bad start. But we must remember that Big Eight districts would also see extra state aid. Under the Cupp-Patterson plan, these eight districts are projected to receive an average of $383 per pupil in additional funding by FY 2021. Thus, the charter-district funding gap is estimated to close by a less impressive $375 per pupil by FY 2021, barely denting the $4,000 per-student gap that currently exists.
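The arithmetic behind that $375 figure is worth making explicit: the gap narrows only by the difference between the two projected bumps. A minimal sketch, using only the per-pupil figures cited above:

```python
# Projected per-pupil funding increases by FY 2021 (relative to FY 2019),
# averaged across the Ohio Big Eight, per the Cupp-Patterson workgroup estimates.
charter_increase = 758   # average bump for Big Eight charter schools, $/pupil
district_increase = 383  # average bump for Big Eight districts, $/pupil

# Because districts also gain, the charter-district gap closes only by the
# difference between the two increases.
gap_narrowing = charter_increase - district_increase
print(gap_narrowing)  # 375

# Measured against the roughly $4,000 per-pupil gap, that closes under
# 10 percent of the existing disparity.
current_gap = 4000
print(round(gap_narrowing / current_gap * 100, 1))  # 9.4
```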
Second, the plan will widen charter funding gaps over the long run
Due to the staggering costs of a new funding model for districts, their increases are phased in over a six-year period, with FYs 2020–21 being the first two years of implementation. Districts, therefore, should continue to see rising funding levels in FYs 2022–25. Charter funding amounts, however, are frozen after FY 2021. While no projections exist beyond FY 2021, this likely means that charter funding gaps will widen as the implementation of district funding continues.
Not only will the phase-in cause further rifts in charter-district funding, but so will the plan’s proposal to periodically adjust district funding for inflation. On the charter school side, the plan maintains “fixed” base amounts that determine the bulk of state aid. In FY 2019, charters received a fixed $6,020 per pupil base amount,[3] which under the Cupp-Patterson plan climbs to $6,179 and $6,338 per pupil in FYs 2020 and 2021, respectively. But then, the plan maintains charters’ FY 2021 base amount at the same level in perpetuity. Unless legislators make regular changes to charters’ base amounts, their base per-pupil funding will remain stuck at FY 2021 amounts, even as their costs rise with inflation.
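To see what a frozen base amount means in real terms, here is an illustrative sketch. The $6,338 figure comes from the plan; the 2 percent annual inflation rate is purely an assumption for illustration, not a number from the plan or any official projection:

```python
# Charter base amount rises to $6,338 per pupil in FY 2021, then is frozen.
# Sketch of the frozen amount's inflation-adjusted value in later years,
# assuming a hypothetical 2 percent annual inflation rate (illustrative only).
frozen_base = 6338  # charter per-pupil base amount, frozen after FY 2021
inflation = 0.02    # assumed annual inflation; not from the plan

for fiscal_year in range(2022, 2026):  # FYs 2022-25, the district phase-in window
    years_elapsed = fiscal_year - 2021
    real_value = frozen_base / (1 + inflation) ** years_elapsed
    print(fiscal_year, round(real_value))
```

Under that assumption, the frozen base loses several hundred dollars of purchasing power per pupil by FY 2025, even before accounting for the district-side increases phased in over the same years.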
In stark contrast, the Cupp-Patterson plan provides school districts with regular funding adjustments. One of the major shifts proposed in this plan is to move districts from a fixed base amount[4] to an incredibly elaborate inputs-driven calculation that is supposed to gauge the cost of educating an average student (no effort was undertaken to estimate charter costs). While this new approach would take pages to describe, it’s important to know that these “inputs” are based on variables such as statewide average salaries and benefits. These input costs would remain constant throughout FYs 2020–23. But in FY 2024, districts’ input costs would be adjusted for inflation, specifically using the CPI-U measure. Hence, the average teacher salary, for instance, used for the purposes of calculating district teacher costs would rise. In the years thereafter, the state would again make cost adjustments. Every fourth year the input costs would be reassessed based on actual data about salaries, etc., reported two years prior. Because these costs are sure to rise over time, districts would in turn enjoy increased state funding.
In sum, because charters’ per-pupil base amounts remain static after FY 2021 while districts’ are adjusted, the Cupp-Patterson plan will over time widen the gap between charter and district funding.
Third, it does nothing to bridge the enormous gap between voucher and public school funding
Ohio has also significantly underfunded scholarship, or “voucher,” programs that enable needy students to attend private schools. The current amounts for the EdChoice (both the failing-schools and income-based model) and the Cleveland scholarship are just $4,650 for K–8 students and $6,000 for high school, funding levels that fall short of those for even charters, and miles behind district funding. While some private schools can provide a quality education to students on such meager amounts, many others can’t fundraise enough to cover those costs. For families seeking private school alternatives, this results in a more limited range of choices.
Unfortunately, the plan does nothing to improve this woeful funding picture, and in fact worsens it. By freezing current scholarship amounts—while escalating public school spending considerably—the voucher funding gap will only widen.
***
The architects of the Cupp-Patterson plan have dubbed it the “Fair School Funding Plan.” Seen through the eyes of school districts, that may be true. But it doesn’t create a fair model for school choice. It perpetuates the chronic funding gaps between districts and choice programs, both charters and vouchers, and even more worryingly, is sure to widen them over the long run.
A truly “fair” funding plan would show concern for all schools, no matter what banner they fly under. It would also work toward fairness to Ohio parents who, for a variety of reasons, long for something different for their children. On school choice, then, the Cupp-Patterson plan can only be viewed as a failure. The plan preserves—even further advantages—the status quo, rather than putting Ohio families, and their preferences and choices, at the heart of education funding.
[1] The “Big Eight” refers to Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo, and Youngstown. In 2018–19, 78 percent of Ohio charters were located within these eight high-poverty urban districts.
[2] These estimates were released prior to the passage of the state budget bill (House Bill 166) and do not incorporate changes to the funding system for FYs 2020–21, including the additional funds for the state’s new student wellness and success or quality charter school programs. The bill form of the Cupp-Patterson plan (HB 305), which this blog post analyzes, was also introduced prior to the enactment of HB 166 and proposes FY 2020 as the first year of implementation.
[3] This amount is multiplied by charters’ enrollment to calculate their “opportunity grant,” the main component of the state’s funding formula.
[4] Under current policy, Ohio sets district base amounts at $6,020 per pupil and multiplies this by each district’s “state share index,” which varies based on local wealth. This is then multiplied by district enrollment to yield their opportunity grant (which may be capped, however). Charters’ base amounts are not multiplied by the SSI because they have no local taxing authority.
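The current-policy formulas described in these footnotes reduce to simple arithmetic. A sketch of both calculations, using hypothetical enrollment and state share index values (the $6,020 base amount is from the footnotes; everything else is illustrative, and any funding cap is ignored):

```python
# Current-policy "opportunity grant" arithmetic, per the footnotes above.
BASE_AMOUNT = 6020  # per-pupil base amount, FY 2019

def district_opportunity_grant(enrollment, state_share_index):
    # Districts: base amount x state share index x enrollment.
    # (The actual grant may also be capped; that is omitted here.)
    return BASE_AMOUNT * state_share_index * enrollment

def charter_opportunity_grant(enrollment):
    # Charters: base amount x enrollment. No SSI multiplier applies,
    # because charters have no local taxing authority.
    return BASE_AMOUNT * enrollment

# Hypothetical example: a 1,000-student district with an SSI of 0.6.
print(district_opportunity_grant(1000, 0.6))  # 3612000.0
print(charter_opportunity_grant(1000))        # 6020000
```

Note that the SSI reduces only the state-funded portion for districts, which also raise local tax revenue; charters receive the full base amount from the state but, as discussed above, typically no local funding at all.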
Lorain City Schools is no stranger to negative headlines. As one of three districts currently under the control of an academic distress commission (ADC), it has been a hotbed for controversy.
Fortunately for the students in the district, this autumn seems to be bringing a nice change of pace. First, what you’ve probably heard about: September’s state report card release revealed that Lorain earned an overall D (up from an F last year), raised its Gap Closing grade all the way up to a B, and improved its performance index scores.
But here’s what you probably don’t know: Earlier this month, the U.S. Department of Education announced that the district, along with several other industry and education partners, won a federal grant to modernize workforce training.
To understand the importance of this grant, some background is necessary. Last summer, President Trump signed the Strengthening Career and Technical Education for the 21st Century Act into law. The legislation, referred to as Perkins V, is the long-awaited reauthorization of the federal law that governs how states fund and oversee career and technical education (CTE) programs. Tucked into the pages of the new law are provisions creating the Innovation and Modernization Grant, a competitive program aimed at creating, implementing, replicating, and taking to scale “evidence-based, field-initiated innovations to modernize and improve” CTE, with the goal of improving student outcomes.
The U.S. Department of Education announced in mid-April that $2 million would be available for the program. To be considered for the grant, applicants had to propose a plan that would modernize CTE, develop the effectiveness and alignment of CTE with labor market needs, and improve student outcomes. The law also identified several competitive preference priorities: promoting STEM and computer science education, serving students from low-income families, and serving students in qualified Opportunity Zones.
Earlier this month, the department announced nine winners out of sixty-four nationwide applicants. Among the victors was an application from Lorain County Community College, which partnered with—you guessed it—Lorain City Schools to win a three-year grant worth just under half a million dollars. Lorain City Schools is one of nearly a dozen partners, including:
There isn’t a ton of information available about the specific project itself, but what is available seems promising. The project aims to “engage and support” students in computer science CTE pathways that lead to applied associate’s degrees in computer science and result in employment in in-demand fields. Together, the partners will work to achieve five goals:
If accomplished, each of these goals could be a boon for the students in Lorain City Schools and other partnering districts. Supporting students to meet postsecondary admittance standards, for example, ensures that students will avoid expensive remedial courses when they get to college. Earn-and-learn opportunities have become increasingly popular both nationally and in Ohio, and for good reason—they offer students meaningful opportunities to develop academic and career skills. It’s also important that the project will focus on computer science. According to Code.org, a nonprofit dedicated to expanding access to computer science, Ohio has thousands of open computing jobs, and average salaries for computing occupations in Ohio are significantly higher than the average salary in the state.
As with all new projects, the devil will be in the details. Rigorous implementation and a focus on meaningful outcomes for students should be a priority. But kudos to the folks in Lorain County for taking the initiative to create a project that could have such a significant impact on the students in several districts—including Lorain City Schools.
Author’s correction and update: The original version of this post incorrectly stated that Columbus had increased the minimum test score needed to be classified as “on track” in third-grade reading and to not be placed on a reading improvement and monitoring plan (RIMP). In fact, according to information presented to the school board on Oct. 15, the district has increased only the “cut score” for RIMP eligibility but not the threshold for being identified as “on track.” This means the district is not engaging in the most egregious form of state report card manipulation, as the piece originally suggested, but several other Ohio districts do appear to have done so.
EDITOR’S NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
As I often like to remind my Ohio State University students, the road to hell is paved with good intentions. One illustration of this aphorism is how well-meaning efforts to increase accountability pressures on public schools in hopes of improving learning can backfire by perversely incentivizing school administrators to take actions that only end up hurting students. Although we’d hope that school leaders would respond to greater accountability by taking steps to improve educational quality—as they should be doing all along—it seems that local officials often react by finding ways to game the system instead.
Ohio’s current school report cards—which assign letter grades to each of the state’s public and charter schools and districts—are vulnerable to precisely this kind of gaming, as evidenced by recent efforts by Columbus leaders to raise the district’s report card grade without actually improving student performance.
At issue is a section of the report card titled “Improving At-Risk K-3 Readers,” which grew out of a 2012 law requiring all Ohio third graders to demonstrate proficiency in reading before advancing to fourth grade. The intentions behind this law were quite well-meaning and sincere; there is clear evidence that mastering reading by the end of third grade is a strong predictor of future academic success. This report card section is designed to hold schools and districts accountable for ensuring that all students stay on track to become proficient readers by the time they take their third-grade state exams.
Specifically, districts are required to administer reading assessments starting in kindergarten to identify students who are not on track to reach proficiency and to put in place reading improvement and monitoring plans (RIMPs) designed to get these students back on track. The “Improving At-Risk K-3 Readers” component of the state report card essentially awards points for each initially at-risk student whom schools move back on course to achieve proficiency.
Since state testing does not begin until third grade, school districts have considerable discretion in how they implement these requirements. Fortunately, there are some guardrails built in by the state. For example, school districts may use only the diagnostic assessments that have been approved by the Ohio Department of Education, and for each available assessment, the state also sets minimum scores that districts must follow in flagging “off-track” students. In addition, districts and schools are penalized for each student who fails to achieve proficiency on the third-grade reading test but who was not previously placed on a RIMP.
While these safeguards prevent districts from setting their bar too low (which would help them avoid losing points for at-risk students who remain off-track in earlier grades) and incentivize identifying students likely to struggle on the third grade reading assessment, Columbus has figured out that it can inflate its grade by setting the bar too high. Under a directive issued by the district’s new superintendent, third graders will be placed on RIMPs even if they are reading at a sixth-grade level at the beginning of the year! Indeed, under the district’s new minimum reading “cut score,” third graders scoring below the 93rd percentile nationally would be placed on such improvement and monitoring plans.
How does setting the bar so ridiculously high help inflate the district’s grade? Putting nearly every student on a RIMP at the beginning of the year will avoid the penalties for the subset of these students who were not already identified as “off track” but who fail to attain proficiency on the state exam, a deduction that cost the district more than 400 points on last year’s report card calculation.
When confronted by teachers angry that they must now write RIMPs for many more students, district officials denied trying to game the system and claimed they were merely “raising the bar” and trying to help all students improve their reading, even high-performing students. However, the district’s other actions (or lack thereof) suggest that this explanation should be taken with skepticism. The superintendent increased only the district’s third-grade reading requirements, leaving the minimum diagnostic “cut scores” that identify students in need of reading intervention woefully low for kindergarten, first, and second grade. At the most recent school board meeting, one board member asked (at 3:24:00 of the video) what to tell community members who accuse the district of making the change solely to improve the district’s report card. “A concise answer for me,” responded the district’s chief accountability officer, “honestly, would be, ‘Let’s give the students the interventions they need to succeed and achieve.’”
Unfortunately, what looks like an effort to “juke the stats” is likely to come at the expense of student learning. Third-grade teachers estimate that writing each student’s RIMP takes between thirty and forty-five minutes, not counting the time necessary to administer ongoing assessments also required for each student placed on a reading plan. Requiring teachers to complete the required paperwork and follow this process for their highest-achieving students, who are already reading well above grade level, is likely to come at the expense of time and resources that could otherwise be spent helping the district’s lowest-performing third graders, who are most at risk of being retained.
In addition, the policy change has also created mass confusion among both teachers and parents, many of whom assume that every student placed on a RIMP has also been identified as “off track.” (Both the coverage in the Columbus Dispatch and the original version of this blog post repeated this mistake, likely further exacerbating the confusion.)
The attention the district’s decision attracted has also revealed an important weakness in the calculation used to assign grades under this section of the report card. Had the district also raised the cut score for identifying “off track” third graders to the same high level (as early reports indicated), it would’ve earned even more points on the state report card. Labeling more high-achieving, already-proficient third graders as “off track” in the beginning of the year would have let the district earn credit for moving the same students back “on track” when they passed the state’s reading exam. This would result in a better report card grade without actually improving student reading.
Columbus did not take the additional step of increasing its third-grade on-track cut score, but it appears that a handful of other districts may have. Examining last year’s state data, at least three other districts saw a big jump in the share of students identified as being off track between second and third grade. At the same time, these districts also brought larger fractions of third graders back on track than had occurred for the same cohort in second grade. This pattern—more kids classified as behind in the fall, but also more brought back on track during the year—strongly suggests that the districts changed their cut scores between second and third grade and had set a higher bar in third grade than was necessary to pass the state test later in the year.
In an ideal world, the state accountability system would not incentivize such numbers games. And all school districts would have leaders who spend their time and ingenuity dreaming up ways to actually improve student learning, instead of coming up with new strategies for manipulating the district’s report card grades.
Vladimir Kogan is an associate professor at the Ohio State University Department of Political Science and (by courtesy) the John Glenn College of Public Affairs. The opinions and recommendations presented in this editorial are those of the author and do not necessarily represent policy positions or views of the John Glenn College of Public Affairs, the Department of Political Science, or the Ohio State University.
For years now, Ohio has been caught in the throes of a fierce debate over how best to improve low-performing school districts. The state currently intervenes in chronically underperforming districts through academic distress commissions (ADCs), but this model has been met with intense opposition and is currently being debated by the General Assembly.
In the midst of all the district-level controversy, it’s easy to forget that Ohio has also been implementing school-level improvements for years. Under No Child Left Behind (NCLB), schools had to make adequate yearly progress or face a set of escalating consequences. The federal law required states to identify a group of priority schools—elementary and secondary schools that ranked in the bottom 5 percent for proficiency in math and reading, and high schools with graduation rates below 60 percent—that had to undergo rigorous improvement efforts.
When President Obama signed the Every Student Succeeds Act (ESSA)—the successor to NCLB—into law in 2015, states were required to create new plans that outlined how they would identify and intervene in chronically low-performing schools. (Ohio’s plan is here.) A thorough analysis of ESSA’s school improvement provisions would take pages, but here’s the gist of what states must do:
In addition to these requirements, ESSA explicitly permits states to “take action to initiate additional improvement” in any district with a significant number of schools that are consistently identified as low performing by the state’s accountability system.
There are several reasons—other than just the federal mandate—why Ohio is committed to keeping a close eye on school performance. First, as Governor DeWine has noted previously in reference to the state’s ADC policy, the state has a “moral obligation to help intervene on behalf of students stuck in failing schools.” Schools with consistently weak academic growth and few students reaching proficiency in basic subjects produce graduates who aren’t prepared for college or the workplace. State leaders have a responsibility to step in on behalf of the families, communities, and local businesses that are impacted by this lack of readiness. Second, the threat of intervention can spur schools into more urgent improvement efforts. Ohio is seeing this firsthand at the district level, where the threat of an ADC designation has led to progress. It stands to reason that similar accountability measures could also spur improvement at the school level. Third, there is rigorous research that indicates that it is possible to improve outcomes at the school level.
It’s too early to tell for certain if Ohio’s school-level interventions under ESSA are leading to improvements. Priority schools are identified once every three years. The state identified the first group at the end of the 2017–18 school year (you can find the list here), so we won’t know for sure how many schools improved enough to move off the list until 2020–21. But we can use the most recent state report cards to determine if performance index (PI) scores—which measure the achievement of students regardless of whether they are proficient—increased in priority schools between 2017–18 and 2018–19.
Of the 107 schools that ended up on the priority list because they were ranked in the bottom 5 percent, eighty-five had PI scores available for both years. Fifty of those eighty-five schools (59 percent) earned higher PI scores during the most recent year. While it’s not conclusive evidence, this seems to indicate that either the identification as a priority school or the intervention efforts that come along with identification are having a positive impact in the majority of priority schools.
When the new priority list is finally released, it will be important for lawmakers and advocates to take a close look. It would also be beneficial for the media to cover building level improvement efforts with at least half of the same breathless energy they use to track ADC controversies. If a significant number of schools improve enough to get off the list in 2020–21, that’s a good sign. But if the same schools remain mired in poor performance, it may be time to up the ante.