The most problematic public expenditure in K-12 education
Ohio faces a significant budget crunch. This is forcing state lawmakers to scrutinize expenditures—even more closely than usual—to create a balanced budget by the end of June. A good place to start would be examining the projected $378 million in “guarantee funding” over the next biennium (fiscal years 2018 and 2019). Guarantees award some districts millions more than they would otherwise receive under the state’s own funding formula. Guarantees are a poor use of limited resources and undermine the formula. Here’s why.
Funding guarantees aren’t a trivial expense
Under the Administration’s budget proposal, the state would spend $181 million on the guarantee in FY 18 and $197 million in FY 19. This represents about 2.1 percent of overall state spending on K-12 education over the biennium. While that may not sound like a large slice of the funding pie, it is roughly equivalent to the state funding specifically allocated to gifted, career and technical education, and English language learning combined.
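For readers who like to check the math, the biennium total and the overall K-12 spend implied by the 2.1 percent figure work out as follows. This is a back-of-the-envelope sketch: the implied statewide total is derived from the post’s own numbers, not taken from an official budget document.

```python
# Back-of-the-envelope check of the guarantee figures (amounts in millions).
guarantee_fy18 = 181  # FY 18 guarantee spending under the budget proposal
guarantee_fy19 = 197  # FY 19 guarantee spending under the budget proposal
biennium_guarantee = guarantee_fy18 + guarantee_fy19  # $378 million

# "About 2.1 percent of overall state spending on K-12 education"
share_of_k12 = 0.021
implied_k12_total = biennium_guarantee / share_of_k12

print(biennium_guarantee)        # 378
print(round(implied_k12_total))  # about 18,000, i.e., roughly $18 billion over the biennium
```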
For non-budget-obsessed readers, let us recall what the guarantee is. Guarantee funds are allocated to districts that are either losing students or experiencing gains in local wealth relative to the rest of the state (sometimes both). The idea is that districts need budgetary stability and should be shielded from funding reductions under the state formula.
In the former case, the state funds districts for “phantom students,” or kids who aren’t even being educated by the district. Economists Jon Fullerton and Marguerite Roza have argued that propping up declining districts through a guarantee could actually be counterproductive: “Funding phantom students delivers the message that school districts should continue delivering education the way they have for the last century.” They explain that guarantees allow districts to avoid making decisions that could improve the performance of their schools, such as re-prioritizing educational programs, seeking ways to innovate or form partnerships, or shifting to more flexible cost structures.
In other cases, the state provides additional dollars to districts with improving economic conditions. Several districts are projected to be on the guarantee in FY 18-19 due in part to reductions in their state share index (SSI). A lower SSI means that as local wealth increases, the state picks up less of the funding tab all else equal. For example, among the districts on the guarantee, the largest negative change in SSI is Noble Local (-28 percent). A closer look at its recent financial statements indicates that the district is generating more property tax revenue due to oil and gas development. A similar phenomenon is likely happening in other districts that are also reaping the benefits of the fracking boom. Lawmakers should ask themselves whether it’s an efficient use of limited state education dollars to drive additional aid to districts that have more local wealth they can (literally) tap into.
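To see how a shrinking state share index can land a district on the guarantee, consider a deliberately simplified sketch. The numbers and the formula here are hypothetical; Ohio’s actual funding formula involves many more components than a single base-cost multiplication.

```python
# Illustrative only: a highly simplified model of how a falling state share
# index (SSI) can push a district onto the guarantee. All figures are hypothetical.
def state_aid(per_pupil_base: float, enrollment: int, ssi: float) -> float:
    """State aid = base cost per pupil x enrollment x state share index."""
    return per_pupil_base * enrollment * ssi

# Prior year: the state picks up half of the formula amount.
prior_year = state_aid(per_pupil_base=6000, enrollment=1000, ssi=0.50)    # 3,000,000

# Local wealth rises (e.g., new oil and gas valuation), so the SSI falls.
current_year = state_aid(per_pupil_base=6000, enrollment=1000, ssi=0.36)  # 2,160,000

# The guarantee makes up the difference, holding the district harmless
# even though its local tax base has grown.
guarantee_payment = max(0.0, prior_year - current_year)
print(guarantee_payment)  # 840000.0
```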
The guarantee is becoming a permanent fixture for many districts
One common argument for the guarantee—also known as “temporary transitional aid”—is that districts need time to adjust to changing circumstances. Yet guarantee funding is sometimes anything but temporary. The table shows the breakdown of districts expected to be on the guarantee in FY 18-19 and whether they’ve been on the guarantee in prior years. Interestingly, eighty-five of the 328 districts projected to be on the guarantee have been on it in each of the four prior years. Together, these districts are slated to receive more than $218 million, or almost 58 percent of the guarantee funding, over the next two years. In per-pupil terms, they also receive more than the other “guarantee” districts: on average, almost $1,500 per pupil in guarantee funding. Though some districts are “new” to the guarantee, the bulk of the guarantee aid would be directed to districts that have drawn the extra funding year after year.
Ohio districts by the number of years on the guarantee (FY 2014-17)
* Includes districts projected to be on the guarantee in either FY 18 or 19 (almost all are on the guarantee both years). Source: Author’s calculations based on OBM, School Foundation Funding Estimates and ODE, District Payment Reports (for FY 14-17 guarantee amounts).
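The table’s headline numbers can be verified quickly (amounts in millions, as cited above):

```python
# Checking the shares cited above for long-term guarantee districts.
long_term_districts = 85        # on the guarantee in each of the four prior years
total_guarantee_districts = 328 # projected on the guarantee in FY 18-19
long_term_funding = 218         # $218 million to long-term guarantee districts
total_guarantee = 378           # total biennium guarantee funding, in millions

district_share = long_term_districts / total_guarantee_districts
funding_share = long_term_funding / total_guarantee

print(round(district_share * 100, 1))  # 25.9 -> about a quarter of guarantee districts...
print(round(funding_share * 100, 1))   # 57.7 -> ...draw almost 58 percent of the funding
```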
The guarantee pulls money away from other uses
Rather than providing $378 million to help fund phantom students or districts with diminishing need, lawmakers should consider whether there are better uses for those dollars. One option might be to help reduce the impact of Ohio’s school funding cap, which under the Administration’s proposal would deny $825.5 million in funding to hundreds of districts in FY 18-19. Such districts have increasing enrollments and/or weakening tax bases, leading to the growth in state funding (which is then limited by the cap). Some districts on the cap serve significant numbers of low-income children—like Canton, Columbus, and Dayton—and the repurposed funds could be used to ensure that they receive the full formula amounts rather than shortchanging their already-needy pupils.
The guarantee doesn’t effectively create financial certainty for districts
One idea behind the guarantee is that districts require stability and certainty as they create their budgets. But the biennial debates around the guarantee and its lack of fairness likely make it less than a “certainty” for districts. Plus, there are better ways to manage fluctuations than the guarantee.
First, as suggested in A Formula That Works, the state could base its funding formula on prior-year enrollments instead of current-year counts. In essence, this would allow districts to better predict their state funding for budgetary purposes, as they would know in advance how many kids they would be funded for. This would work almost like a guarantee, but it would apply to all districts rather than a select few. It would indeed “hold harmless” districts that lose students from one year to the next, though it would underfund districts with enrollment gains.
Second, in extraordinary situations, the state could step in to provide temporary support. For example, state lawmakers are considering emergency relief for Adams County districts where property taxes are set to plunge due to two power plant closures. The state already has a solvency assistance fund and policymakers could ensure that enough rainy-day dollars are available for rare cases in which districts truly need temporary emergency aid.
* * *
Perhaps one way of thinking about the guarantee is like unemployment compensation. It’s a reasonable public expenditure to help individuals who need temporary assistance, but few would argue it should be indefinite. Likewise, the state should provide some extra help to districts in emergency situations not of their own making. Yet it need not offer open-ended access to hundreds of millions of dollars in additional funding—all outside of the formula—through the guarantee. Ohio lawmakers face tough budget decisions in the coming months, and they should consider unwinding the guarantee. Dollars spent on children not actually being educated by a district are not money well spent.
NOTE: An addendum to this blog post, incorporating important new information, was published on Ohio Gadfly Daily on 4/17/17.
College Credit Plus (CCP) provides qualified Ohio students with the opportunity to undertake college coursework while still in high school. Students in grades 7-12 can earn college credit in three ways: by taking a course on a university campus; at the student’s high school where it’s taught by a credentialed teacher; or online.
As the program’s popularity has surged, there have been growing pains and calls to scale it back. Others claim that it exemplifies and perhaps even fosters inequitable access—it’s just too hard for some students to qualify. It’s true that not everyone interested in CCP is permitted to enroll. But the program wasn’t designed for totally open access; it was built for middle and high schoolers who could demonstrate that they were ready for college-level content. Other pathways—such as AP, IB, and honors classes—are available for students who fall just below CCP standards but are still interested in challenging coursework.
Unfortunately, proposals in Ohio’s pending budget bill would make it easier for unqualified students to enroll in CCP. Here’s what is being proposed:
Lowering CCP’s admission bar
Under current law, students interested in CCP must meet a participating college’s “established standards for admission and course placement” in order to enroll. Generally this means applicants must prove they’re ready for the rigors of a college course and thus not in need of remediation. In addition to entrance exams, colleges and universities are permitted to use course placement tests (such as ACCUPLACER) to determine whether students meet these standards—and it’s through that latter route that the pending budget would lower the bar.
The budget deems students who score “within one standard error of measurement below the remediation-free threshold” on these assessments to have met the remediation-free standard—even though they didn’t. In an effort to appear as though high expectations have been maintained, the budget adds requirements: students are only considered remediation-free if they also have either a cumulative high school grade point average of at least 3.0 or a recommendation from a school counselor, principal, or career-technical education (CTE) program advisor.
Lowering CCP’s admission bar is problematic for several reasons. Postsecondary institutions selected remediation-free thresholds for good reason. Easing this standard undermines their decisions and disregards their experience and expertise in judging what students must know in order to succeed at the college level in their institutions. Students who are unable to score above these thresholds are likely to struggle and perhaps even fail the college course. This matters because CCP courses are factored into a student’s high school GPA: if a student earns a D in a CCP course, that grade appears on both her high school and college transcripts and is calculated into her GPA at both institutions.
Note, too, that the additional “requirements” that have been added (a 3.0 GPA or a recommendation) aren’t necessarily indicative of college readiness or future success. Grade inflation is rampant in today’s schools—and on many college campuses, too—and GPAs can hide a lot of issues. For instance, a 3.0 could represent passing grades in easy classes that don’t come close to the rigor of college-level coursework. The administrator recommendation requirement is even worse. All a student (or her parents) has to do is plead with—and convince—a school staffer that she is “ready” for college absent any hard evidence of such readiness, particularly since teachers didn’t make the list of those who can provide a recommendation.
The college-ready program
As worrisome as lowering the admissions bar is, it only allows some additional students to access CCP. Lest anyone else feel left out, Ohio’s budget has also addressed those students for whom even a lower admissions bar won’t help.
The new college-ready program is meant to “provide high school students with college-ready transitional courses.” It aims to serve students who need “additional coursework to qualify to take courses to earn college credit while enrolled in high school and/or be prepared for college upon graduation.” Basically, it will offer newly created remedial courses to students whose current high school classes aren’t preparing them to complete college-level work, either during high school or after graduation. It’s tantamount to the state openly admitting that its high schools are failing to do their foremost job.
Yes, a program such as this could be an intriguing solution to the college remediation problem. Rather than forcing students to take expensive, non-credit bearing courses after reaching the college campus, why not get remediation out of the way in high school?
But the college-ready program laid out in legislation has significant problems. For starters, the timeline for developing it is far too short. The budget calls for all program requirements, instructional models, and application guidelines for interested schools to be published by February 1, 2018 and for courses to launch in the fall of 2018. Ohio’s finalized budget is typically signed in June. This means that the workgroup responsible for developing this program has only eight months to create an entire system from scratch. Considering that it took the state over a year to review academic standards that had already been written, it seems like a tall order to ask a workgroup to create an entire system of remedial courses so swiftly.
More worrisome still is the lack of evidence that this program will actually get more kids college ready. If this provision becomes law, thousands of students could spend time and energy taking courses that haven’t been tested for rigor or proven to be effective with early remediation. Ohio would be far better off stretching the timeline and starting the program as a small-scale pilot that measures student achievement and impacts. During the pilot, the state could also examine schools that already incorporate effective remediation.
***
These proposals, while generally well intentioned, aren’t in the best interest of Buckeye students. While we hail the impulse to offer free college classes to as many students as possible, the state already has multiple options. For students to succeed, CCP participation should be limited to those who can actually demonstrate their college readiness. Furthermore, while policymakers should absolutely continue considering ways to increase the number of high schoolers ready for credit-bearing college work prior to exiting 12th grade, rushing to create a statewide system of courses that could end up as watered down as those currently offered isn’t the way to go about it.
Are you a school choice supporter or just interested in learning more about this issue that is gaining national prominence? Ohio parents, students, schools, and advocates will be holding a rally on Tuesday, May 2, 2017, at 11 a.m. on the steps of the Ohio Statehouse. And you’re invited to attend.
The event, supporting school choice in all of its many forms, is happening during National Charter Schools Week.
Image courtesy of School Choice Ohio
You can find more details about the event here. And you can register by clicking here.
In March, Ohio’s Educator Standards Board (ESB) released six recommendations for revising the Ohio Teacher Evaluation System. In a previous piece, I explained why its two most significant recommendations are a solid solution to a myriad of problems within the system. These suggestions were 1) to update the observational rubric in collaboration with a national expert and 2) to embed student growth measures into the revised rubric. In this piece, I’ll investigate the remaining proposals.
Of the four remaining recommendations, two are intertwined with the ESB’s call to embed student growth measures into a revised rubric. The first is to eliminate shared attribution, the flawed practice of evaluating non-core teachers based on test scores from subjects they don’t actually teach, such as reading and math. Policymakers should heed this recommendation and ditch shared attribution as soon as possible.
The other recommendation seeks to incorporate aspects of Ohio’s current alternative framework into the newly revised observational rubric. This includes student portfolios, student surveys, peer review, self-evaluation, and other district-determined measures. Several of these methods—like student surveys and peer observations—have research to support their use. The revised evaluation rubric should definitely include these options.
The final two ESB recommendations require some more in-depth discussion. Let’s take a look.
Streamlining the observation process
Under the current system, the timeline for the classroom observation cycle doesn’t explicitly differentiate between first and second semester requirements. And although the current system does refer to different types of observations—formal observations and informal classroom walkthroughs—there isn’t a clear explanation of the differences between the two or the purposes for each. In addition, current law does not require pre- or post-conferences between the evaluator and the teacher. This is unfortunate because conferences—particularly those that take place after an observation—are the ideal opportunity for teachers to determine how to move forward with their development.
The ESB fixes these issues by offering a clear, chronological description of required observations and conferences and explaining the purpose of each. It calls for two formal classroom observations; periodic, informal “walkthrough” observations;[1] and a final conference to discuss each teacher’s performance against his or her goals. Here’s a summary:
This setup is a significant improvement on the current system. It creates a more holistic feedback system that connects each observation to the next and to an overarching improvement goal. In the current system, observations are isolated—they’re similar to the one-off professional development sessions that many teachers find unhelpful. By allowing teachers and evaluators to jointly identify areas for growth at the start of the year, and then focusing on those specific areas throughout the year during a variety of observations, the system becomes far more coherent and meaningful. The ESB proposal also requires post-observation conferences. This is another move in the right direction, since as the Ohio Department of Education has said, “growth comes from the conversations about practice between observer and teacher.”
Exempting highly rated teachers
During the summer of 2014, a change in law allowed less frequent observations for teachers who received the two highest ratings. This change only applied to teachers whose student academic growth measure, which is determined by student test scores, was average or higher during the most recent school year. Teachers with an “accomplished” rating (the highest rating) could be evaluated once every three years, and teachers with a “skilled” rating could be evaluated once every two years. In both cases, teachers are still required to receive at least one observation and conference each year.
In their recommendations, the ESB suggests that this practice be maintained with a few key exceptions. They are:
The rationale behind these recommendations—and behind the original law change—is understandable. Highly effective teachers should be trusted with more freedom and responsibility. And shrinking the number of full-cycle evaluations that principals must conduct each year does lessen administrative burden.
But the ESB recommendations are about transitioning from an ineffective evaluation system to an effective teacher feedback system. Revising the observation rubric, embedding student growth measures that are more closely aligned with classroom practice, creating a more holistic observation cycle—these are changes aimed at giving teachers more and better feedback. Exempting highly rated teachers from a system designed to give them the frequent and meaningful feedback they deserve doesn’t just contradict these other recommendations, it’s unfair to the teachers who want every opportunity to get better. Sure, principals and other evaluators can still stop in and give feedback. But will they?
There’s also the not-so-small matter of how many teachers would be exempt. Back in 2013-14—the only year that Ohio’s evaluation system was actually used as intended—approximately 90 percent of Ohio teachers were rated either “accomplished” or “skilled.” If the ESB recommendations become law, and this percentage holds, that means that the overwhelming majority of Ohio’s teachers will have very limited interactions with the new system. And if that’s the case, what’s the point of having one at all?
***
Overall, the ESB’s recommendations are solid suggestions. Eliminating shared attribution and streamlining the observation cycle into a more holistic and coherent system are great ideas that could lead to a lot of professional growth for teachers. But there are devils in the details. Exempting highly rated teachers from the system jeopardizes their growth and robs them of what could be their only opportunity to receive meaningful, observation-based feedback each year. As the General Assembly deliberates how to revise the teacher evaluation system moving forward, it would be wise to follow all of the ESB’s recommendations except this one.
[1] Under the current system, walkthroughs may be unannounced and completed frequently. The ESB makes no mention of changing this policy.
A new meta-analysis of studies examining the relationship between homework and student achievement looks at 30 years of data involving over 312,000 students worldwide. It was published in the journal Educational Research Review in March.
The researchers noted a split in the findings of the studies reviewed: about as many studies found a positive relationship between homework and student achievement as found a negative relationship. But the researchers wondered if there were confounding factors among the various studies that might explain the disparate results. To look into this hypothesis, they narrowed the more than 8,000 studies under review down to twenty-eight that contained sufficient data to compare along eight variables identified as having the potential to lead to the inconsistent results.
These eight are interesting and worth noting: grade level (could the findings be more consistent in high school vs. elementary school?), subject matter (science vs. math?), homework indicators (homework measured by time on task vs. effort or grade received), publication type (perhaps the bar for publication in a peer-reviewed journal vs. a dissertation leads to inconsistent reporting of findings?), publication year (might the see-sawing reputation of homework as help or hindrance over time lead to inconsistent reporting of findings?), sampling method (random assignment studies vs. others?), geographical region (do regional or national differences in approach to homework explain the inconsistent findings?), and measure of achievement (could the findings be more consistent when achievement is measured by standardized vs. non-standardized assessments?). Whew!
After considering the eight variables, the analysts concluded that a small, positive relationship between homework and academic achievement in math and science exists. That relationship is stronger in elementary and high school than in middle school (bucking a finding noted in previous meta-analyses) and is stronger for U.S. students than for their Asian counterparts, which the largely China-based researchers attribute to the rise of private tutoring in a number of Asian countries.
With each report of schools and parents opting out of homework so as not to stress out their kids, the arguments begin anew: What is the purpose of homework, how much should be given, what form should it take, how do we know it’s valuable? This research creates some coherence between fragmented findings that could be useful in answering these questions. For now, the age-old notion that homework can help continues to ring true.
SOURCE: Huiyong Fan, Jianzhong Xu, Zhihui Cai, Jinbo He, and Xitao Fan, “Homework and students’ achievement in math and science: A 30-year meta-analysis, 1986–2015,” Educational Research Review (March 2017).