School closures and student achievement
Closing bad schools is politically unpopular. But now there’s good evidence that it works.

Aaron Churchill and Michael J. Petrilli
Bad schools rarely die. This was the conclusion of Fordham’s 2010 report Are Bad Schools Immortal?, which discovered that out of two thousand low-performing schools across ten states, only 10 percent actually closed over a five-year period. On reflection, the finding was not too surprising: Shuttering schools nearly always sets off a torrent of political backlash, as authorities in Chicago, Philadelphia, and other urban districts have learned in recent years. And the reasons are understandable: Schools are integral parts of communities. They’re built into families’ routines and expectations, and closing them inevitably causes pain, disruption, and sadness, even when it’s best for students.
However, we also recognize that closing schools is sometimes necessary. In the charter sector, in particular, closure is an essential part of the model: Schools are supposed to perform or lose their contracts. That’s the bargain. And in the district sector, experience has taught us that some schools have been so dysfunctional, for so long, that efforts to “turn them around” are virtually destined to fail.
That doesn’t mean it’s easy to put bad schools out of their misery. Part of the difficulty is political, but it’s also a genuine moral dilemma: Are we sure that kids will be better off after their schools close? What is the quality of the remaining schools in their neighborhoods? Most importantly, do students gain or lose ground academically when their schools close and they are obliged to enroll somewhere else?
We know from personal experience how important, even agonizing, these questions are. In our role as a charter school authorizer in Ohio, we have blinked on a few occasions—choosing to keep marginal schools open because we worried that the children attending them might be even worse off if they had to move elsewhere. Were we right to do so?
To date, policymakers and practitioners have had precious little research to anchor their thinking and inform their decision making. We could only locate three relevant studies, and their conclusions differed on whether closures positively or negatively affected students.
The high stakes associated with school closures, and the paucity of prior research, led us to explore this terrain ourselves. The result is Fordham’s new study School Closures and Student Achievement: An Analysis of Ohio’s Urban Districts and Charter Schools, which brings to bear fresh empirical evidence on this critical issue. As it turns out, our home state of Ohio is fertile ground. Its large urban districts, referred to as the “Big Eight,” have faced sharply declining enrollment due to both shrinking populations and an influx of charter schools. Confronting the loss of more than fifty thousand pupils in just eight years, these districts have been forced to close scores of schools.
During the same period, dozens of charter schools have also closed for a variety of reasons, including financial difficulties and academic underperformance. In fact, Ohio’s automatic closure law, which is based on academic results, required twenty-three charters to close during the period of study.
Our study examined the achievement trends of 22,722 students in grades 3–8 who attended one of the 198 urban schools in Ohio that shut their doors between 2006 and 2012. These closures disproportionately affected low-income, low-achieving, and black students. To our knowledge, this is the first study to investigate separately the academic impact of closing charter schools. The study was conducted by Dr. Stéphane Lavertu of the Ohio State University and Dr. Deven Carlson of the University of Oklahoma, who used state records to examine the impact of closure.
Their most important finding is that school closure has significant positive impacts on the achievement of displaced students. The following figure displays the cumulative learning-gain estimates of displaced students by the third year after their schools closed. Displaced students from district schools that closed in urban areas gained, on average, forty-nine extra days of learning in reading relative to the comparison group; in math, it was thirty-four days. In the charter sector, students displaced from a closed school also made substantial gains in math—forty-six additional days—but did not make statistically significant gains in reading.
Figure 1: Impact of closure on displaced students, measured as cumulative student learning gains by the third year after closure
The analysts then focused on charter and district students who landed in higher-quality schools after closure, and there they found even larger cumulative learning gains. (We defined quality as a school’s contributions to student growth—its “value added,” in education parlance.) District students who landed in higher-quality schools gained an equivalent of sixty-nine extra days of learning in reading and sixty-three extra days of learning in math. When charter students moved to higher-quality schools, they gained an additional fifty-eight days of learning in reading and eighty-eight days of learning in math by the third year after their school closed.
We must register one caveat that tempers the positive findings on closures: When students displaced by closures enter their new schools, it is possible that they negatively impact the learning of students who had previously attended the school. Think of this as a possible “side effect” of the closure “treatment.” The study provides suggestive (but not conclusive) evidence that there might be minor side effects—the value-added scores of schools absorbing displaced students fall slightly. The net effect of closure remains an open empirical question.
These findings have two implications for policymakers. First, they should not shy away from closures as one way to improve urban education; they are a viable alternative to “turnarounds.” As Andy Smarick and others have argued, fixing a chronically low-performing school is often more wishful thinking than promising strategy. Although successful school turnarounds are not impossible, Smarick is correct when he writes, “Today’s fixation with fix-it efforts is misguided.” This study adds hard evidence that shutting down low-quality schools could better serve students’ interests than endless (and fruitless) efforts to improve them.
Second, policymakers have to grapple with the mechanism of closing schools—whether they ought to shutter schools via top-down decisions or the marketplace. Interestingly, save for Ohio’s automatic closure law that was applied to a handful of charters, state policy did not directly shutter the schools in this study. Rather, population loss and the proliferation of school choice forced districts to close unneeded schools, while most charters closed due to stagnant enrollment, financial difficulties, or a combination of both.
In other words, Ohio’s experience with urban school closures was primarily market-driven. Families voted with their feet, and weaker schools withered and eventually died. And it worked. Most students—though not all—landed in higher-quality schools and made gains after closure. Could Ohio have done even better for its students, had school authorities closed schools more aggressively and strategically? Perhaps.
Though fraught with controversy and political peril, shuttering bad schools might just be a saving grace for students who need the best education they can get.
Intra-district choice has long been a type of school choice supported by many people who don’t really like school choice. Since neither students nor funding leave district boundaries, district officials have fewer qualms about allowing families to choose their schools. But intra-district choice is also complicated. A lack of quality information about available schools, the absence of a simple system-wide method of applying to those schools, and the added burden of transportation challenges can bring the potential of intra-district choice to a screeching halt. However, some school districts have tackled these issues head-on and offered valuable, innovative solutions. Cincinnati Public Schools (CPS) is a shining example.
During the 2013–14 school year, CPS made the transition to high schools that serve students between the seventh and twelfth grades. CPS offers some compelling academic reasons for the switch, but it also used the transition to create high schools of choice. Instead of assigning sixth graders to a high school based on their home addresses, CPS permits students to choose their high school. Each high school offers a variety of programs, classes, extracurriculars, and services that represent unique learning environments and opportunities. All schools offer a college preparatory curriculum aligned to Ohio’s new graduation requirements, many offer specialized programs, and every school is open to students with disabilities. Let’s take a look at some of the best aspects of the CPS high schools of choice structure.
Solving the usual intra-district problems
For starters, CPS does a good job of sharing information about its high schools of choice. The High School Guide on the CPS website doesn’t just outline the nuts and bolts of the system; it also offers a summary of each of the fifteen high schools (it does need some additional work, such as listing school report card grades, but it’s a good start). This kind of information—in one easily accessible document—is revolutionary enough on its own. But CPS doesn’t stop there. The district also offers one application system for all fifteen high schools, which can be completed online. Students select one high school as their top choice and four others in order of preference. Seats are awarded via lottery. Students who are enrolled in charter schools or private schools within CPS boundaries are also free to apply, as are out-of-district students (although district residents receive priority). And as for transportation troubles, the district provides Metro bus passes to all students in grades seven through twelve who live 1.25 miles or more from the school they attend. Bus passes might not solve all transportation issues, but they’re a giant step in the right direction.
Unique experiences
The most exciting part of the CPS high schools of choice model is that each of their high schools is truly unique. This is not a group of high schools that offer core classes and sports and claim to be special and different when they’re really not. Of the fifteen schools outlined in the High School Guide, there are two Montessori schools, a Paideia school, a STEM school, a digital/online school, a New Tech school, a blended learning school, a career technical school, and a school for creative and performing arts (which requires a live audition for entry). There are also schools that offer the Special College Preparatory Program (which requires passing an entrance exam), dual enrollment, paid internships, and immersion courses (some of which include traveling abroad), as well as programs in engineering, culinary arts, health sciences, nursing, software development, and plant and animal sciences. Each school offers its own set of athletics, clubs, and even community service.
Of course, lots of cities (like Columbus) have lots of high schools with similar focuses and offerings. What’s special about Cincinnati, though, is that every student in the district from seventh to twelfth grade has the opportunity to apply to any of these schools. In CPS, high school choice isn’t just a pretty phrase that only applies to certain families—it’s a reality for all families.
Career and academic planning
CPS actively seeks to aid students in planning for their futures by providing interactive career- and academic-planning tools. These tools help students identify their interests and skills, determine career matches to those interests and skills, and outline the level of education and training that each career requires. CPS utilizes several different systems: the Ohio Career and Information System (OCIS), the Kuder Career Planning System (KCPS), FunWorks, Naviance, and OhioMeansSuccess. The systems offer a combination of information and exploratory tools that allow students to investigate occupations, post-secondary options, and financial aid. A few even allow students to create their own portfolios, career plans, and resumés, or search for specific jobs. Perhaps the best part of the CPS model, though, is that these tools are used by students in an advisory setting: a small group of students who, under the leadership of a teacher, investigate their skills and talents, explore college and career possibilities, set and track goals, and develop close relationships with their advisor and peers.
***
A recent community report card from the Strive Partnership shows that CPS has seen an eight-point jump in its high school graduation rate over the previous year. While it’s too early to say that this is all because of expanded intra-district choice, continued increases will be a good indicator of success. The high schools of choice model isn’t perfect (the next step is getting school report card grades into the school guide), but it’s a phenomenal start that offers Cincinnati students and their families real choices. It’s about time all kids in Ohio were offered the same opportunities.
In a previous review, my colleagues examined a National Charter School Resource Center (NCSRC) report that analyzed states’ charter policies regarding access to district-owned facilities. In a new report, NCSRC narrows its focus to charter school facilities in California. Golden State charters were asked to complete a survey about their facilities and to allow an on-site measurement; these results were then supplemented by data on school enrollment, student demographics, and funding. The results offer a sobering picture of charter facilities in the state. Charter school facilities are generally smaller than the size recommended by the California Department of Education; classrooms for elementary, middle, and high schools are, on average, between 82 and 89 percent of the state standard size (it is worth noting that state size standards might not be appropriate for all schools in all situations). Charter facilities as a whole are 60 percent smaller than state site size recommendations, even after adjustments are made for enrollment differences. California charters also spend varying amounts of their per-pupil funding on facilities: charters that own their buildings pay an average of $895 per pupil; charters located in a school district facility pay an average of $285 per pupil; and charters renting from a private organization pay an average of $570 per pupil. School district facilities are clearly the most cost effective, but since they are not consistently and readily available, many charters are forced to seek more expensive options. In addition, co-located charters have concerns about implementing their curricula and ensuring student safety in space controlled by others. Specialized spaces—kitchens, science labs, gymnasiums, and library/media rooms—are in even shorter supply than standard classroom space. Despite all of these facility shortcomings, there are approximately ninety-one thousand California students on charter school waitlists.
To meet this demand, 85 percent of charters plan to grow their enrollment over the next five years, but 64 percent of those schools do not have the space to meet their desired enrollment growth. California may be on the other side of the country, but the facility struggles of its charters are not that different from those we see in Ohio. Ohio policymakers should take notice of the importance and scarcity of high-quality, affordable facilities—especially for high-performing charters that offer valuable choices to families.
SOURCE: “An Analysis of the Charter School Facility Landscape in California.” National Charter School Resource Center (April 2015).
School closures should never be undertaken lightly, be they district or charter schools. Academic troubles, falling enrollment, financial problems, and a myriad of other issues can push closure to the forefront. Under such times of duress, policymakers and education officials are forced to ask a difficult question: Does closing a school cause more harm than good, especially for students?
Report Co-Author, Stéphane Lavertu
Today, Fordham released a new study called School Closures and Student Achievement that seeks to answer this very question. At a breakfast event on April 28th that attracted around fifty Ohio education leaders, the report’s co-author, Dr. Stéphane Lavertu, presented a summary of the study’s findings. These findings showed that three years after closure, displaced students typically make significant academic gains.
After Dr. Lavertu’s presentation, Chad moderated a panel of policymakers and practitioners who discussed the findings and policy implications. The panel consisted of: the Honorable Nan Whaley, Mayor of Dayton; Tracie Craft, Deputy Director of Advocacy, Black Alliance for Educational Options (BAEO); Stephanie Groce, former member of the Columbus City Schools Board of Education; Piet van Lier, Director of School Quality, Policy, and Communications, Cleveland Transformation Alliance; and Dr. Deven Carlson, co-author of Fordham’s School Closures and Student Achievement.
From Left to Right: Chad Aldis, Stephanie Groce, Piet van Lier, Dr. Deven Carlson, The Honorable Nan Whaley, and Tracie Craft
From their varied vantage points, panelists agreed that while closures are difficult and often unpopular, there is a clear educational benefit, especially if closures are done thoughtfully and with open communication to parents. This means that community input and involvement prior to and during the process of closure is critical. Finally, panelists emphasized that closing schools isn’t a silver bullet, and that closures must be accompanied by efforts to raise the number of high-quality seats in both district and charter schools.
The Honorable Nan Whaley and Tracie Craft
The education components of Governor Kasich’s proposed budget—and the House's subsequent revisions—made a big splash in Ohio's news outlets. Much of the attention has been devoted to the House’s (unwise) moves to eliminate PARCC funding and their rewrite of Kasich’s funding formula changes. Amidst all this noise, however, are a few other education issues in the House’s revisions that have slipped by largely unnoticed. Let’s examine a few.
Nationally normed vs. criterion-referenced tests
As part of its attempt to get rid of PARCC, the House added text dictating that state assessments “shall be nationally normed, standardized assessments.” This is worrisome, as there is a big difference between norm-referenced and criterion-referenced tests.
A norm-referenced test determines scores by comparing a student’s performance to the entire pool of test takers. Each student’s test score is compared to other students’ in order to determine their percentile ranking in the distribution of test takers. Examples of norm-referenced tests include the Iowa Test of Basic Skills and the Stanford 10 exams. A criterion-referenced test, on the other hand, is scored on an absolute scale. Instead of being compared to other students, students are compared against a standard of achievement (i.e., a “proficiency cut score”). Ohio’s former standardized exams—the Ohio Achievement Assessments and Ohio Graduation Tests—were criterion-referenced. The PARCC and SBAC assessments are criterion-referenced exams, as are the nationally administered NAEP exams. (The latter three, importantly, produce scores that can be pegged to college readiness. They can answer the question, “Is this child on track to succeed in college without remediation?”)
In education, there’s room (and a need) for both of these types of tests. In fact, an argument could even be made for the inclusion of student comparisons on score reports (think of percentile rankings that indicate how a student is performing compared to other students across the state or nation). But we’re talking about state tests here: tests that determine report card grades and signal whether students are where they need to be academically. Do we really want the scores of these tests to be based on what other students know, instead of what our students should know? Comparing students is useful, but it doesn’t offer a complete picture. Think of it this way: you could be the richest person in your city, but that doesn’t mean you have enough money to pay your bills. It’s a cold comfort to know that you’re better off than your neighbors if you still can’t buy groceries and pay your electric bill. Students and families deserve the absolute truth about academic achievement. That makes the issue here one of courage: Are Ohio policymakers courageous enough to evaluate student achievement against academic standards and a cut score? Or do they prefer the half-truth of student comparisons and the comforting illusions they offer?
EdChoice funding and eligibility
Kasich’s budget made a couple of changes to Ohio’s Educational Choice Scholarship Program. First, it raised the maximum amount allotted for students in grades nine through twelve from $5,000 to $5,700. This desperately needed increase brings EdChoice funding up to the same amount that’s awarded through the Cleveland Scholarship Program. It’s also common sense, as high schools are more expensive than elementary and middle schools. Second, it changed the basis of EdChoice eligibility. Current law dictates that one of the ways a student can establish eligibility for EdChoice is by being assigned to a school that has, for at least two of the previous three years, been ranked in the lowest 10 percent of all public school buildings—including charter schools and STEM schools—according to a performance index score. Kasich’s budget changes the law so that the list of schools used to calculate the lowest 10 percent of school buildings includes only those operated by school districts. After all, EdChoice eligibility depends on the district school a student is assigned to, so this change limits the calculation to eligible schools. The change could result in a small increase in the number of eligible students, since poor-performing charters and STEM schools were bumping some of the worst district schools off the eligibility list. Kasich’s proposal is an effort to increase options for families. To its credit, the House opted not to change it.
The biggest potential change to EdChoice, though, isn’t really about vouchers or school choice—it’s about the effects of safe harbor provisions. The House’s version of the budget extended safe harbor protections from PARCC (or whatever our future assessments will be if PARCC is jettisoned) for an additional two years for school districts, teachers, and students. This is problematic because EdChoice eligibility is largely tied to school building performance. If school grades aren’t calculated or don’t have ramifications because of safe harbor provisions, then Ohio will essentially be freezing eligibility for its flagship voucher program. This must be addressed by the governor or the General Assembly. Students attending Ohio’s lowest-rated schools simply can’t afford—and shouldn’t be asked—to wait for the state to transition to a new assessment. This might even be a good time for Ohio to move to a means-tested scholarship, as most states with vouchers already use, and away from what is typically referred to as a “failing schools model” voucher, which comes with a complicated eligibility framework.
Shared attribution
Shared attribution is the practice of evaluating teachers based on test scores from subjects other than those they teach. It’s an unfair method of evaluation (what teacher—or any professional—would want to be evaluated based on something they have no control over?), particularly when there are far better ways to evaluate teachers. Kasich’s budget explicitly states that if a class is outside the core subjects and value-added or alternative measures are not available for teacher evaluation purposes (as in the case of art or music teachers), boards must use shared attribution. Luckily, the House removed this provision.
***
While the media pays close attention to hot topics like the “winners and losers”—or the “winners and bigger winners,” as is the case this year—of school funding, it’s important to pay attention to these other seemingly small issues that could have a massive impact on education in the Buckeye State.
How should city-level leaders manage a portfolio of schools? The first thing they should do is take stock of the city’s supply of public schools. A new report from IFF, a nonprofit community development financial institution, provides a helpful look at Cleveland’s public schools, both district and charter. In an effort to uncover those with the highest need for quality seats, the analysis slices the city into thirty neighborhoods based on several variables: schools’ academic performance, facility utilization and physical condition, and commuting patterns. The facility analyses are the major contribution of this work, principally the schools’ utilization rates—the ratio of student enrollment to the physical capacity of the building. The utilization rates are needed to determine the actual number of available high-quality seats. The analysts obtained building-capacity statistics through the district; they estimated charter capacity by using the schools’ highest enrollment point (perhaps underreporting charters’ capacity—especially for new schools). Happily, the study reports that Cleveland’s highly rated K–8 schools are at 90 percent capacity. Yet it is less satisfying to learn that its highest-rated high schools are at only 68 percent capacity (the report does not suggest any reasons why). Meanwhile, most of the city’s poorly rated schools are under capacity, averaging 71 percent utilization. The study could be significantly improved in one regard. Its academic measure consists solely of an achievement-based metric—the state’s performance index. But in urban areas in particular, where achievement tends to be low, considering learning gains (a.k.a. “value-added”) is also crucial when evaluating school performance. Cleveland’s policymaking community would be wise to reconsider the school-quality analysis by looking through both the student achievement and value-added lenses. An analysis of that type would still reveal that Cleveland has far too many low-quality seats. 
But it would also shine a brighter light on which schools help their students make extraordinary gains—and are worth investing in.
Source: IFF, A Shared Responsibility: Ensuring Quality Education in Every Cleveland Neighborhood (Chicago: Author, 2015).