Setting the record straight on charters and local tax revenue
Clearing up common misconceptions
Financing public education has historically been the joint responsibility of state and local governments. But while traditional districts have long had access to both state and local sources of revenue, nearly all Ohio charter schools tap state funds alone. The reason: Unlike districts, charters do not have the independent authority to levy taxes on local property. Meanwhile, districts have been loath to share local funding with charters. The only exceptions in Ohio are eleven Cleveland charters, which together received $2.2 million in local revenue for 2012–13 as part of a revenue-sharing plan with the district. As a result, Ohio charters operate on less overall taxpayer support than districts.
Despite the stark fact that charters rarely receive local funds, a few groups are attempting to claim that charters somehow receive proceeds from local taxes. Their claims are false. First, state data contradict any proposition that local funding flows directly to charters. Second, while some charters may receive more state aid per student than districts, that difference is simply a product of the state funding formula. It is not a result of local funds indirectly going to charters, as some have suggested.
The facts are the facts
Let’s lay out the facts. Practically all public charter schools in Ohio receive zero funding from local taxes. State law does not give charters taxing authority, so they are wholly reliant on state and federal funds and on charitable donations. Meanwhile, Ohio vests school districts—their boards, specifically—with taxing authority. All districts in Ohio exercise that power, raising funds primarily through property taxes, a portion of which must be approved by voters.
The state’s financial data confirm that charters receive no local revenue. First, Ohio Department of Education data show that charters receive no local tax revenue.[1] Zilch. Goose eggs. Bupkis. In contrast, the state’s 613 traditional districts raised a total of $8 billion in local tax revenue in 2012–13 (click on this link for a spreadsheet containing the local revenue data). Second, consider the Ohio Auditor of State’s database, which posts the financial reports of charters and districts. Charter financial reports show no local tax revenue; for instance, Columbus Collegiate Academy (charter) reported zero local tax revenue for fiscal year 2013. Meanwhile, Columbus City Schools (district) received $363 million in local property tax revenue (the equivalent of $7,300 per student, on top of roughly $5,000 in state aid for a typical student). Tables 1 and 2 below display the revenue side of these organizations’ financial statements.
Table 1: Columbus Collegiate Academy (CCA) – Statement of Revenues, FY 2013
[[{"fid":"113959","view_mode":"default","fields":{"format":"default"},"type":"media","attributes":{"height":"363","width":"1223","style":"width: 600px; height: 178px;","class":"media-element file-default"},"link_text":null}]]
Source: Ohio Auditor of State, Columbus Collegiate Academy, Statement of Revenues, Expenses and Changes in Net Positions for the Fiscal Year Ended June 30, 2013, pg. 8.
Table 2: Columbus City Schools (CCS) – Statement of Revenues, FY 2013
[[{"fid":"113960","view_mode":"default","fields":{"format":"default"},"type":"media","attributes":{"height":"467","width":"1224","style":"width: 600px; height: 229px;","class":"media-element file-default"},"link_text":null}]]
Source: Ohio Auditor of State, Columbus City Schools, Change in Net Position (Fiscal Year 2013), pg. 8.
State taxes fund charter schools
Despite these straightforward facts, claims have been made that charters somehow receive local tax revenue indirectly. The argument goes something like this: Charters receive more state funding per student than districts; therefore, local taxes indirectly “subsidize” those charter schools.
The first part of this claim is typically true because the state adjusts the amount of aid that a district receives based on its local wealth. The adjustment ensures that greater aid goes to districts with fewer local resources (property and income wealth).
Consequently, relatively affluent districts may receive less state aid than the charters that enroll children from those districts. Consider the following example: Bexley School District (a high-wealth district near Columbus) receives roughly $3,000 per student from the state. But a charter that enrolls an average student from Bexley would receive approximately $6,500 in state aid. Bexley, however, raises around $12,000 per student through local taxes, whereas charters drawing students from Bexley receive no local tax support.
(In contrast, the amount of state aid appropriated to districts and charters becomes more comparable in higher-poverty areas. For example, Columbus City Schools receives approximately $5,000 per pupil from the state for a typical student, while charters receive about $6,500 per average student. But again, local revenue does not flow to Columbus charters.)
Because charters are shut out of local tax revenue, they have zero local wealth to fund their students’ education. Thus, the state makes no wealth adjustment when determining charters’ amount of state aid.[2] That is why some charters may receive more state funding than their neighboring district. The state—note, the state—funding program partially compensates for charters’ absence of local tax revenue.
The state does not fully offset the absence of local revenue; in fact, the most recent research shows that Ohio charters receive, on average, roughly 22 percent less overall taxpayer support than districts, and the disparity is even greater in Cleveland and Dayton (over 40 percent).
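For readers who want the arithmetic in one place, below is a minimal back-of-the-envelope sketch (in Python) that simply adds up the approximate per-pupil figures cited above. The dollar amounts are the ballpark numbers from this piece, not official calculations, and the totals omit federal aid and private donations.

```python
# Illustrative only: rough per-pupil totals built from the approximate figures
# cited in this piece (the Bexley and Columbus examples). Not an official
# funding calculation; federal aid and private donations are omitted.

examples = {
    # name: (state aid per pupil, local tax revenue per pupil)
    "Bexley School District": (3_000, 12_000),
    "Charter enrolling a Bexley student": (6_500, 0),
    "Columbus City Schools": (5_000, 7_300),
    "Charter enrolling a Columbus student": (6_500, 0),
}

for name, (state_aid, local_revenue) in examples.items():
    total = state_aid + local_revenue
    print(f"{name}: ${state_aid:,} state + ${local_revenue:,} local = ${total:,} per pupil")
```

Run it and the pattern is plain: the charter in each pair receives more state aid per pupil, yet its overall public funding still falls well short of the district’s.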
Yes, districts can feel a fiscal pinch—through state revenue loss
Ohio charters are entirely funded by state and federal funds and non-tax contributions—not local taxes. As such, a district’s local revenue does not vary based on the number of students attending a charter. Think of it this way: If every student were to leave Columbus City Schools for charter schools tomorrow, the district would still be entitled to its full allotment of local tax revenue. However, districts’ state revenue does depend on student enrollment, and that is how charters could have a fiscal impact on districts. But note: the impact is similar to that of any other student out-migration, such as when families move to the suburbs, enroll in private schools, or leave the state.
A district can respond to shrinking state revenues caused by enrollment decline in several ways, including altering its cost structure or winning back the students who left. These options are easier said than done; but to survive, school districts—like any organization—must adapt to changing external conditions. The presence of competition—including, but not limited to, charters—should motivate districts to improve organizationally, ideally working toward greater efficiency (in resource allocation) and effectiveness (in service delivery).
Conclusion
Charter schools in Ohio do not have access to local tax revenues by any mechanism. Instead, charters are almost wholly reliant on state funding. Anyone who claims that charters receive local tax revenue (save for the exceptions in Cleveland) is simply making a false assertion.
Current policy results in less overall public financing for the education of thousands of needy children who attend charters. Inequitable funding also places charter schools at a disadvantage in teacher pay and in obtaining facilities. Finally, current school funding policy forces taxpayers to direct their tax payments to the district—even if that district has failed as a public institution for decades or if taxpayers have strong preferences for charter schools.
Ohio policymakers and the public need to be well informed when evaluating spurious claims that public charter schools receive more funding than traditional public schools. Moreover, serious thought should be given to whether the education of one category of students is worth less than that of another; unfortunately, that is exactly the message Ohio’s current funding policy sends to charter school students.
[1] The only exceptions are the eleven Cleveland charters mentioned earlier, plus six charters that reported a trivial amount of local tax revenue (less than $6,000 total).
[2] Unlike districts, charters do not have the main component of their state aid adjusted (the “opportunity grant”) or their categorical funding adjusted (e.g., additional LEP and special-education funding). However, charters’ “targeted assistance” (a smaller component of the funding formula) is adjusted in the same way as their serving district(s).
Over the course of 2014, a series of reports from the National Council on Teacher Quality (NCTQ) spotlighted some serious issues with education schools in Ohio. The Buckeye State boasts performance reports that analyze teacher preparation programs, but these reports merely show how little is expected of candidates prior to acceptance into such programs. Furthermore, Ohio doesn’t have minimum standards or clear consequences for poor programs. NCTQ is right that our teacher preparation programs need to get better in two key ways: improving candidate selection and strengthening teacher training. Here’s how.
Candidate selection
Right now, Ohio sets a low bar for admission into ed schools. Countries with the highest scores on PISA[1]—like Singapore and Finland—restrict admissions into teacher preparation programs to only their best students. In fact, in Finland, becoming a teacher is such a competitive process that only about one in every ten applicants will be accepted to study to become a primary school teacher. This is similar to what Teach For America does: In 2014, more than 50,000 people applied to join TFA, and only 5,300 were admitted—an 11 percent acceptance rate.
The intense screening process is designed to select only the candidates who are most likely to succeed. In Finland, for example, teacher candidates are chosen based on exam results, their high school diploma, and their out-of-school activities. They then complete a written exam on assigned pedagogy books, participate in an observed clinical activity that mirrors school situations, and undergo an interview. And this is simply to enter the teacher preparation program. Having been through the TFA process myself, I can attest that it is much the same.
According to Ohio law, no such screening process is required for the state’s teacher prep programs. Certain ed schools may have GPA requirements for their teacher candidates (Miami of Ohio, whose programs are ranked as some of the best in the country, requires a minimum 2.75 GPA for admission into its cohorts), but GPA isn’t everything. It certainly shouldn’t be the only thing. Rigorous standardized test score requirements, interviews, sample lessons that help trained observers gauge whether candidates already have an aptitude for the less tangible aspects of effective teaching—these are important factors in determining whether an applicant should be accepted into a teacher prep program and trusted with the responsibility of educating children. Unfortunately, neither the state of Ohio nor its teacher training programs require any of them.
Candidate training
Simply raising the standards for acceptance into teacher prep programs isn’t enough. Ohio colleges of education also need to strengthen their program design. NCTQ’s 2014 Teacher Prep Review (see here for Ohio-specific findings), a national ranking of teacher preparation programs, reveals that far too many Ohio teacher preparation programs miss the mark. Eleven programs did receive top rankings (see here and here), including Miami University of Ohio’s undergraduate and graduate programs and Ohio State’s graduate programs. That being said, twenty fully evaluated programs earned scores too low to even qualify for a ranking. These include undergraduate and graduate elementary programs at Cleveland State, the undergraduate secondary program at Ohio University, and both graduate programs at Kent State. In addition, thirty-one programs could not be fully evaluated due to insufficient data.
Look beyond these headlines and examine the specific criteria NCTQ used to judge programs, and the findings are even more troublesome. When it comes to training teacher candidates on lesson planning, for example, not a single program meets the standard. In classroom management techniques, only 11 percent of programs meet the standard (to a maddening degree, teacher education programs still resist incorporating anything that feels like “training”). In terms of training candidates to assess learning and use data to drive instruction, a whopping 77 percent of programs only partly meet the standard—not a single program meets it completely. Only 10 percent of programs meet the standard for ensuring that teaching candidates have a strong student-teaching experience. In other words, low expectations don’t stop at selection—they continue into training.
***
Teaching is hard, and too many first-year teachers struggle with the demands of the classroom. Given NCTQ’s findings about Ohio teacher preparation programs, it isn’t a stretch to think that lackluster selection and preparation are a big part of the reason why. If Ohio wants to attract the best and brightest to its schools, teacher prep programs need to become as prestigious as medical and law schools—or at least as prestigious as Teach For America. There are other important incentives to keep in mind (TFA does provide additional benefits), but raising the bar for selection and making the programs as rigorous as possible are the key first steps.
James R. Delisle took aim at differentiated instruction (DI) in his commentary in the latest issue of Education Week, noting the challenge of squaring this nice-sounding idea with the reality of many of today’s classrooms.
As our own Mike Petrilli wrote in 2011: “[T]he enormous variation in the academic level of students coming into any given classroom” is the greatest challenge facing America’s schools. The implication is that those teachers seeing success with differentiated instruction—however few they may be—simply have less variation in learning levels among their students and, therefore, have less differentiation to do. (Oh, and that they have the right training, full understanding, endless diligence, and loads of time.)
So what’s the answer? Delisle wants to bring back ability grouping to fully replace DI. It is hard to deny that America’s classrooms have changed greatly over the last few decades, so perhaps it’s time to toss out “one or the other” thinking and go for something new—a hybrid of sorts.
How about curriculum-based mastery instead? A content sequence with multiple checkpoints along the way (yes, that’s testing). Master it, move on. Don’t master it, remediate until you do. In such a case, you get the advantages of both DI and ability grouping. Students at both the high and low ability levels start at the same point in a new content area. Groups of students with similar achievement move forward together; those needing similar remediation work do so together. All with the same end goal of content mastery. Onward and upward. Such an approach works well for my own kids and their diverse classmates.
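For the procedurally minded, here is a minimal sketch of that progression logic, assuming a hypothetical content sequence and mastery check; nothing here comes from an actual curriculum or assessment system.

```python
# Illustrative sketch of curriculum-based mastery progression. The callables
# passed in (passed_mastery_check, remediate) are hypothetical stand-ins.

def advance_through_curriculum(units, passed_mastery_check, remediate):
    """Move a group through a content sequence one unit at a time."""
    for unit in units:
        # Checkpoint (yes, that's testing): remediate until the unit is mastered.
        while not passed_mastery_check(unit):
            remediate(unit)
        # Mastered it: move on to the next unit.
```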
The nineteenth edition of Education Week’s Quality Counts report is out, and while Ohio outperforms over thirty states, the results show that there is still much work to be done. The 2015 report, which uses a new evaluation system that focuses on outcomes rather than policies and processes, indicates that the nation as a whole declined from a C+ in 2013 (when grades were last given) to a C in 2015. Ohio also declined, moving from a B- in 2013 to a C in 2015. The report rates states’ quality along three key dimensions: Chances for Success, which takes into account indicators like family characteristics, high school graduation rates, and workforce opportunities; K–12 Achievement, which rates academic performance, performance changes over time, and poverty-based gaps (as measured by the NAEP assessments); and School Finance, which includes measures of funding equity across schools. Ohio’s overall score, the average of the three categories, was 75.8 out of 100 possible points, which earned a ranking of eighteenth in the nation.

In the Chances for Success category, Ohio earned a B-. Most indicators in this category show that Ohio is close to the national average, including preschool enrollment (46.5 percent of Ohio three- and four-year-olds compared to 47.3 percent nationally) and the percentage of adults with a two- or four-year postsecondary degree (37 percent of Ohio adults compared to 39.9 percent nationally). In the K–12 Achievement category, Ohio earned a C-. Although this places the Buckeye State at sixteenth in the nation in achievement, the relatively high ranking hides low percentages: just over 37 percent of fourth-grade public school students were proficient on the 2013 NAEP reading test, and only 40 percent of eighth-grade public school students were proficient on the NAEP math test. Ohio fared slightly better in the School Finance category, with an overall grade of C+; by the report’s measure, Ohio spends $12,010 per pupil (adjusted for regional cost variation), compared to a national average of $11,735.

Elsewhere in the nation, consistently high-performing Massachusetts nabbed the top overall spot with a solid 86.2 out of 100 (B), while Mississippi earned a dismal 64.2 out of 100 (D), making it last in the nation. Overall, although Ohio performed reasonably well in terms of rankings, a C overall means there’s ample room for improvement in the Buckeye State. (For another take on this report, see here.)
Source: “Education Week's Quality Counts.” Education Week Resource Center (January 2015).
In the past year, Ohio policymakers have turned their attention to strengthening vocational education. Rightly so; too many non-college-bound students exit high school without the skills to enter the workforce. Blue-collar businesses in Ohio, for example, continue to express concerns about the “skills gap”—the mismatch between the technical abilities they need and the actual skills of their workers. But retrofitting vocational education to meet the demands of today’s employers remains a work in progress.

As Ohio schools retool vocational education, they should look to those who have already accomplished this task, and a new paper from the Pioneer Institute provides five case studies of technical high schools in Massachusetts that are well worth reading. A common thread emerges: All of the schools are thriving with the support of their local businesses. These companies have advised the schools on program design (e.g., which skills and jobs merit emphasis), and they have driven fundraising efforts. A couple of examples are worth highlighting. One technical school worked closely with advanced manufacturing companies in the area to raise half a million dollars to outfit the school with cutting-edge metalworking machines. (Previously, the school had taught technical computer skills but offered no hands-on machinery experience, leaving manufacturers frustrated.) Another school partnered with a local community bank to open an actual retail branch within the school building, giving high school students the opportunity to work alongside full-time employees to learn banking, retail, and marketing skills.

Vocational programs that both match local business needs and receive business support are already developing in Ohio. (See here for a charter example and here for a district example.) Here’s hoping vocational programs like these continue to expand, putting more Buckeye students on track for success in career and in life.
Source: Alison L. Fraser and William Donovan, Filling the Skills Gap: Massachusetts Vocational-Technical Schools and Business Partnerships (Boston: Pioneer Institute, November 2014).