Innovation Ohio and OEA fail to help anyone 'know' anything
Attempt at "transparency" looks more like data spin.
With any luck, the “Know Your Charter” website from Innovation Ohio (IO) and the Ohio Education Association (OEA) will go the way of Pets.com and Geocities.com. The new website’s stated aim is to increase transparency around charter-school spending and academic results by comparing them to those of traditional public schools. While greater transparency is a worthwhile goal, it appears that IO—a liberal advocacy group founded by former Strickland administration officials—and the OEA—the state-level affiliate of the nation’s largest labor union—let political spin get in the way of presenting information meaningfully.
The website misinforms the public by failing to report essential information about public schools, calling into question how much it actually helps anyone “know” anything. In particular, IO and the OEA make the following crucial omissions in reporting basic school information:
1.) They ignore district funding from local property taxes. You’ll notice that the IO-OEA website reports only state per-pupil revenue for districts and public-charter schools. But remember, school districts are funded jointly through state and local revenue sources.[1] By reporting only state revenue, they flagrantly disregard the fact that school districts raise, on average, roughly half their revenue through local taxes (mainly property). Meanwhile, charters, with only a few exceptions in Cleveland, do not receive a single penny of local revenue, which leads to funding inequity between district and charter schools. When local, state, and federal revenue sources are combined, recent research from the University of Arkansas demonstrates that Cleveland charters received a staggering 46 percent less than the city’s district, while Dayton charters received 40 percent less on a per-pupil basis. (These were the only two Ohio cities where a deep-dive funding analysis was conducted for FY 2011.) With this in view, is it any wonder that IO-OEA would want to conceal districts’ local revenue to create the illusion of district cost-effectiveness relative to charters? The key measure when it comes to the cost of education is how many taxpayer dollars—from all sources—support the schools’ efforts.
2.) They ignore student-growth measures: Website users will also notice that IO and the OEA use the state’s “performance index” (PI) as their primary measure for academic comparison. To be sure, the performance index is a critical component of school report cards—it indicates how students in a particular school achieve on state assessments. And when we look across Ohio’s large urban school districts, PI letter grades are depressed; at the same time, many charter schools also receive low PI ratings. For charter and urban district schools alike, achievement results (i.e., performance-index scores) are influenced by the characteristics of their students. The Toledo Blade went so far as to say that raw achievement measures “mislead” on school performance. The Blade is half right: By reporting student-achievement results alone, all we learn is that students from Ohio’s urban communities struggle with achievement and that too many are performing below grade level. We don’t learn much about the actual performance of the school.
As a result, outside observers also need to know whether public schools are making an impact on achievement, even when they enroll students who are grade levels behind. That is why the state’s “value-added” measure is also essential. By estimating a school’s impact on student growth over time, we gain a much clearer view of how the school itself performs. But IO and the OEA ignored value-added. Instead, they should have reported, in a high-profile way, both indicators of school quality: the performance index and value-added ratings. When people clearly see both measures, they gain a more holistic view of school quality, be the schools charter or district. Nor is reporting both measures a matter of political bent: Urban schools of both types, district and charter, can and do receive strong value-added ratings. Reporting value-added ratings alongside a student-achievement measure is simply sound reporting practice.
3.) They ignore school-level data: School-level data matter for two reasons, one technical and the other practical. Yet so far as I can tell, the IO-OEA website contains no school-level information for any district school in Ohio. First, the technical reason: Charter schools are schools. As such, they should first be compared to other schools, not to entire districts. For some charters, especially those in urban areas with large school districts, the comparison is especially odd: It is something like comparing the food quality of an individual Burger King restaurant to the food quality of the entire McDonald’s corporation. The scale is totally different. To a certain degree, a school-to-district or school-to-state comparison is appropriate when the idea is to benchmark a school’s performance. But by placing charter-school performance side by side with that of the local district—without any acknowledgement that school-level data exist—the website misses an opportunity to compare similarly situated schools.
Second, the practical matter: For school-shopping parents—particularly those in urban areas, where school choice is most prevalent—the website is virtually useless. Parents select schools for their children, not districts. Moreover, students attend traditional public schools that, while part of a district, have their own strengths and weaknesses and are anything but uniform. The absence of school-level data for district schools thus restricts the information available to parents who must make the important choice of which school—charter or district-run—to send their child to. It is unconscionable for IO and the OEA to claim that their website allows parents and citizens to compare charters to “traditional public schools.” To entire districts, yes; to schools, no.
In addition to glaring data omissions (and misleading comparisons), the website uses average teacher experience as a high-level school-quality measure, reporting—and implicitly comparing—the average experience of teachers in charters and in districts. It does no good to spread the myth to parents and taxpayers that teachers’ experience drives their impact on achievement. (Research has shown that, on average, a teacher’s impact plateaus after roughly three to five years in the classroom.) Perhaps some parents will feel reassured that their children have experienced teachers; but as for us, there is more reassurance in knowing that a school’s students are achieving against a rigorous standard.
It is perfectly acceptable—and essential—for organizations to report school finance and academic results. But the reporting has to be done responsibly. We at Fordham recently published a report, Poised for Progress, which details the academic results of charter and district schools in Ohio’s “Big Eight” urban areas. (It does not touch on school-finance issues, in part because the state will not report 2013–14 fiscal data until later this year.) In the end, IO and the OEA missed an opportunity to constructively inform parents, taxpayers, and policymakers about public education in Ohio. They badly miss the mark when it comes to sound reporting practices on public schools; for that reason, parents, taxpayers, and policymakers should spend their bandwidth bookmarking more informative sources of school information.
[1] A relatively small share of public-school revenue—approximately 10 percent statewide—comes by way of federal grant programs.
A 2014 report from the Consortium for Policy Research in Education (CPRE) shows that the number of first-year teachers in the United States rose from 84,000 in 1987–88 to 147,000 in 2011–12. While this change is largely demographic (fueled by baby boomer retirements), it also means that over 1.7 million teachers—roughly half the workforce—have ten or fewer years of experience. The infusion of talent, energy, and ideas that new teachers bring is positive, but many aren’t sticking around for very long. In fact, the CPRE report notes that more than 41 percent of beginning teachers left the profession within five years. Not all teacher turnover is bad—no one wants to force weak teachers to stay merely to improve retention rates—but talented teachers are leaving too, and students are the ones paying the heaviest price.
Much ado has been made over why beginning teachers leave, and you’ll hear different accounts of how to fix it on different “sides” of the education-reform debate. One such argument, offered by Richard Ingersoll, a professor at the University of Pennsylvania’s education school (and a former high school teacher), faults the isolating “sink or swim” experience that most beginning teachers face. Ingersoll notes that beginning teachers are “frequently left to succeed or fail on their own within the confines of their classrooms” and explains that some commentators even refer to teaching as a profession that “cannibalizes its young.” Perhaps Ellen Moir said it best: “I don’t know any successful businesses that would hire entry-level grads for the most difficult positions, isolate them from coworkers, and then expect them to perform as well as more experienced colleagues. But that’s exactly what we do with many new teachers.” Professional development for teachers as a whole is lackluster (over 90 percent of teachers participate in workshop-style training sessions and not much else, despite research showing such sessions are ineffective); but when you take into account the inexperience and challenges of the first few years of teaching, our failure to effectively support new teachers is a travesty.
While improving teacher training prior to entering the classroom could lessen the struggle for beginning teachers, anyone who has taught knows that the first year of teaching—once you are officially the teacher of record—is a whole different ballgame than student teaching and pre-training. We can’t just “fix” teacher training and expect novice teacher problems to disappear (particularly because fixing teacher training is very complicated). We have to provide better support during the first few years, too. Veteran teachers know this. Many still remember their first years with a shudder. That’s why mantras like “Don’t worry, it gets better” and “Wait until your second year, it’s much easier” are mainstays. Districts know it too; that’s why they send new teachers through orientations and inductions, sign them up for professional development (though it’s often useless), and pair them up with mentors. Hopefully, the general public (through media coverage like encouraging letters and survival guides for new teachers) is starting to catch on too. But the problem isn’t what we’re doing—it’s how we’re doing it.
Like many policies, supports for new teachers fall apart in implementation. In a recent Ed Week article, Texas principal David Kauffman admits that even though he “checked all the boxes” for his new teachers, many of them still left his school or the profession. The supports Mr. Kauffman lists (induction, observation and feedback, peer support) are all backed by research—so why the lack of success? Typically, districts go through the motions of supporting new teachers without actually investing and engaging in the process. It’s easy to send a new teacher through induction, but induction does little for a teacher’s growth if it’s merely new-hire paperwork, mandated professional development (such as instruction on how to handle suspected abuse or medical emergencies—standard and necessary, but no help to a teacher who struggles with management or planning), and time to decorate the classroom. Observation and feedback—among the most effective ways to make teachers better—tend to be infrequent, uncoordinated, vague, or not actionable. It means little to a new teacher if a busy mentor or principal pops into the classroom for five minutes and then leaves feedback along the lines of “Good job, don’t forget to write the lesson objective on the board.” Similarly, peer support and collaboration, which most teachers long for, are far more difficult to manage if veteran teachers don’t have time during the day to observe and collaborate with new teachers.
In short, the struggles of new teachers do often go unanswered. First-year teachers have to sink or swim—and that has to stop. Given the cost that students pay when new (or any) teachers don’t get the training and professional development they want and deserve, it’s about time that districts double down on faithfully implementing new teacher supports instead of just checking off the boxes. It’s being done successfully in some places. Schools in Austin, Texas, recently gave their new teacher supports an overhaul courtesy of a program developed by the Carnegie Foundation for the Advancement of Teaching called the Learning Teaching (LT) Program. If districts in Ohio are hesitant to put the entire program (it’s a ninety-day cycle) into use, they could easily pick a few components that are simple to implement but could revolutionize the first-year teacher experience. For example, consider schools that already have mentoring programs in place. These schools could provide new teachers and their mentors with an extra planning period designed for co-planning, observation, and feedback. This adjustment ensures that new teachers receive consistent feedback outside of the evaluation cycle, allows both veterans and beginners to reap the benefits of collaboration, and prevents mentors from feeling like they have to choose between helping the new teacher and devoting their planning period to work for their own students. Or schools could simply survey their new teachers every six weeks and allow school leaders to pinpoint specific need areas and provide tailored development (maybe even through the mentor). The changes don’t have to be huge—but they do have to be effective.
Just as ninth grade is the make or break year for the rest of a student’s high school career, so too is the first year of teaching for a new teacher. While crucible moments are certainly important for effective leaders (and the leadership it takes to effectively teach is substantial), there’s a fine line between allowing a teacher to struggle and inadvertently setting them up for failure. We have to do better by our first years. They deserve it—and so do our students.
These days, the words “Massachusetts standards” cause hearts to flutter among some in Ohio. And not without reason. The Bay State had solid pre-Common Core academic-content standards. Less well known is how demanding Massachusetts made its performance standards—the cut scores for achieving “proficiency” on its state tests. This bold action bucked the No Child Left Behind trend, whereby many states, including Ohio, set dismal performance standards. (Under NCLB, states were allowed to set their own bar for “proficiency.”)

A new study shows just how high Massachusetts set its performance standards relative to other states. To rate the “stringency” of state performance standards, Gary Phillips of AIR created a common scale by linking state NAEP results from 2011 to international tests. In fourth-grade math and reading, Massachusetts had the most stringent performance standards in the land; in eighth grade, it tied with a few other states for the most stringent. Meanwhile, Ohio’s performance standards were woefully mediocre compared to other states’.

Importantly, the study also points out that higher performance standards correspond to lower state-reported proficiency rates. Massachusetts, for example, reported roughly 40–55 percent proficient in these grades and subjects; Ohio, by contrast, reported 70–85 percent. This doesn’t mean, of course, that Ohio students actually know more than Massachusetts students: The NAEP results—a standardized test given to a sample of students in all states—show the reverse. High “standards” are not just content standards (i.e., the expectations for what students should know at each grade level); it also matters how high policymakers set the bar for passing a test. When policymakers pair high performance standards with solid content standards, students are more likely to achieve. That’s one lesson Buckeye State policymakers can draw from the Bay State experience.
SOURCE: Gary W. Phillips, International Benchmarking: State and National Education Performance Standards, American Institutes for Research (September 2014).
Always a hot topic for debate, charter school issues—especially those involving funding—are hotter than usual.
Let’s start at the national level with the National Alliance for Public Charter Schools’ new report, The Health of the Public Charter School Movement: A State-By-State Analysis. NAPCS reports on twenty-five states and the District of Columbia, assessing the overall “health of the movement” through eleven indicators, including student-enrollment growth, innovation, and academic quality. Washington, D.C.’s and Louisiana’s charter schools come out on top, in part because of equitable funding for charters compared to district schools. Oregon and Nevada finish last for a number of reasons, not least of which are poor learning gains. Ohio finished in the middle of the pack, getting high marks for charter growth but struggling with student achievement.
The state of New York ranked fifth in the NAPCS analysis, just ten points behind Louisiana, but it has experienced some well-publicized tussles over charter school issues in recent months. These include a lawsuit, filed by a group supportive of charter schools, alleging that New York’s method of funding charter schools violates the state constitution and disproportionately hurts minority students. Buckeye State officials should keep an eye on this case, as Ohio charter school students receive similarly disparate funding.
As reported elsewhere in this issue, charter school funding is being debated in Fordham’s home state of Ohio as well. The Ohio Alliance for Public Charter Schools has issued its analysis of recently released Ohio report card data in two parts—one focusing on schools’ absolute performance and the other on student achievement gains—both comparing charters to district schools. The bottom line: There are not enough high-quality schools of either type in Ohio’s urban areas.
John A. Dues is the Chief Learning Officer for United Schools Network in Columbus.
"There can be no keener revelation of a society’s soul than the way in which it treats its children."
-Nelson Mandela
As a society, we are in need of some serious soul searching. There is an urgent need to support and create as many outstanding schools as possible as part of a larger plan for improving life outcomes in Columbus’s most challenged neighborhoods. In Central Ohio, outcomes for kids who grow up just a few miles from each other can vary immensely. Drive east on Main Street from Miller Avenue in the Near East Side to Capital University in Bexley, and in the span of two miles you will get a snapshot of the different worlds that exist within our city. Take that same drive on Central Avenue from Dana Avenue in Franklinton to Grandview, and you will have a similar experience.
Challenges facing our students
Over the last year, a series of articles in the Columbus Dispatch has provided a lens into some of these troubled neighborhoods and the crises they face. Taken together, they paint a picture of the environment in which many children from neighborhoods like the Near East Side and Franklinton—where United Schools Network’s three schools are located—are living. These students come to school with needs far greater than those of students in less troubled areas.
In a 2013 special report entitled Kids Shooting Kids (see here), four compelling articles outline the firearms-related violence among teenagers in Columbus. Half of all the gun violence affecting this age group occurs in four zip codes: Two include parts of the Near East Side, and the other two include adjacent neighborhoods on the Near South Side.
More recently, a three-part series entitled Alarming Losses (see here) provided a heartbreaking look into infant mortality rates in Franklin County. The county’s rate is far above the national average, and in eight neighborhood hotspots, the numbers are staggering. The eight neighborhoods include the Near East Side, the Near South Side, and Franklinton.
And finally, a recent article entitled How Safe is Your Neighborhood? (see here) explored crime rates around Columbus after the shooting death of 14-year-old Amanda Kirwin in Franklinton. The highest rate of serious crime occurs in Cruiser District 83, which is a section within Franklinton. Cruiser District 122 in the Near East Side has the dubious distinction of the highest homicide rate in Columbus.
These are the stories that make the papers; many similar stories—the everyday realities of children who live in these neighborhoods—don’t. Walk into any school in these neighborhoods and ask, “How many of you heard gunshots outside your home in the last week?” I am willing to bet there will be more hands up than not. Living in areas where violence occurs frequently is similar to the experience of soldiers in war zones. In fact, one revelation researchers have made, outlined in heartbreaking fashion in the Kids Shooting Kids series, is that many children in these neighborhoods suffer from post-traumatic stress disorder.
Funding Does Not Follow Need in Ohio
And yet, despite the fact that students in the Near East Side and Franklinton come to school with much greater needs and that these communities desperately need strong educational options, students in places like Bexley and Grandview are afforded better-funded schools. Purposeful or not, these funding disparities say a lot about where priorities are placed in our region.
Inequitable funding is even more dramatic if schools serving high-need student populations happen to be charters. Every day, thousands of students attend brick-and-mortar charter schools in Columbus where they receive a separate and unequally funded education. Charter schools in Ohio lack access to local levy dollars and state facilities funding, which amounts to thousands of additional dollars per pupil for traditional districts.
Nearly 80 percent of these students live in poverty, and a school’s poverty level is one of the best predictors of its students’ achievement. In other words, if you know a school’s poverty rate, you can predict its Performance Index score with reasonable accuracy. Charter opponents in Ohio often point out that test scores in traditional districts are better than scores in charters. This is generally true. However, when you compare schools with similar student demographics, charters outperform district schools. Because of the nature of Ohio’s laws, the vast majority of charters exist within the eight largest urban areas in Ohio, serving areas with high poverty rates and related neighborhood dysfunction.
Relationship between Poverty and School District Report Card Performance Index Score[1]
Poverty Rate | Performance Index
13.97% | > 105
28.70% | 102.5–104.9
34.09% | 100.1–102.4
43.08% | 97.6–100.0
50.98% | 95.1–97.5
55.04% | 92.7–95.0
62.34% | 90.1–92.6
82.11% | 90 and below
United Schools Network students’ needs are great, but their potential is even greater. It is a shame that Ohio’s lawmakers have chosen to add inequitable school funding to the list of obstacles facing our students. When the funding issue is raised, charter school opponents offer the same litany of tired excuses, half-truths, and outright misinformation. However, they cannot hide from the numbers. It is quite clear in the table below that the students with the highest needs are receiving the lowest amounts of funding. The largest achievement gaps in Ohio are typically associated with economically disadvantaged students and students of color (because there is a strong association between these two measures), so it would stand to reason that schools serving large populations of these students would receive higher funding. But that is simply not the case.
School Funding vs. Rates of Disadvantage
District | Expenditure per Equivalent Pupil[2] | Economically Disadvantaged | Students of Color | 8th Gr Reading Prof.
Upper Arlington | $13,176 | 1.4% | 2.3% | 97.8%
Grandview Heights | $13,167 | 11.3% | 3.2% | 97.7% |
Bexley | $12,750 | 11.1% | 9.3% | 95% |
Columbus City | $11,563 | 78.5% | 65.4% | 73% |
Dublin | $10,606 | 14.9% | 9.8% | 95.4% |
Westerville | $8,798 | 34.6% | 28.5% | 92% |
CCA – Dana Ave. | $8,329 | 100% | 46.9% | 89.3%[3] |
CCA – Main St. | $7,737 | 97% | 93.7% | 95.8% |
Why Equalize Funding?
Despite being grossly underfunded, both CCA middle school campuses far outperformed expected achievement levels given their high economically disadvantaged numbers. (The elementary school is in its first year of operation, so it does not yet have test results.) On eighth-grade reading tests—among the best predictors of future academic prospects—CCA students performed at levels similar to those in wealthier suburban districts. These results are nothing less than extraordinary.
We will never make excuses, and we will never settle for anything less than the best for our students. However, I know that these results are the product of superhuman efforts by our teachers and leaders, who are typically paid less than their traditional school counterparts. Keeping proven, high-quality educators should be a priority in any school that serves high-need students; current funding gaps make this challenging. I know that our students would be better served if we could provide nursing, counseling, and other health services. I know that our students deserve the same opportunities as their suburban peers and arguably need them more. We strive to make this happen by being fiscally efficient and partnering strategically. But only so much ground can be made up through these measures when the funding gap is so large.
Let’s do a bit of collective soul searching and fix this separate and unequal funding system—and let’s get it done now. Until this happens, many more souls in the Near East Side, Franklinton, and other similar neighborhoods will be lost. I, for one, cannot stand to see even one more lost soul.
[1] Fleeter, Howard B., “Apples to Apples”: Ohio School District Expenditure Per Equivalent Pupil, Education Tax Policy Institute (November 2013).
[2] Expenditure per Equivalent Pupil provides a measure that offsets differences in spending across districts caused by characteristics of the student as opposed to the operations of the district. EPEP allows for more accurate “apples to apples” comparisons across districts.
[3] CCA – Dana Ave’s results are for 7th grade reading because they did not serve 8th graders last year.
Next-generation learning models—“technology-enabled” education, if you will—are designed to personalize education in whatever way is necessary to help students at all performance levels meet and exceed goals. As with any innovation introduced into American education, next-generation models have met resistance and in many cases have been either halted altogether or subsumed into the by-the-book system. In their new issue brief, Public Impact’s Shonaka Ellison and Gillian Locke argue that charter schools are the ideal place for next-generation learning models. Charter-school autonomies, inherent in their DNA, provide the best platform for tech-driven innovations like ability grouping, mastery-based promotion, student-paced learning, separation of administrative and instructional duties for teachers, and online learning. The researchers show these practices are carried out in various combinations at a number of charter schools around the country.

No mention is made in the brief of solely online schooling, whose model would seem synonymous with much of the innovation described here but whose results have too often fallen short of expectations. In fact, having a building in which to attend school seems to be an unstated requirement for creating the type of next-generation models the authors examine. And while Khan Academy and ASSISTments can extend the school day into the home, building a brick box just so students can come inside and use these tools seems somehow less than innovative. The use of technology also requires the hard work of quality implementation: “Positive student results heavily depend on quality implementation,” the authors note. They make some effort to highlight the benefits of new technology-enabled educational models for educators (among them, more direct instructional time, more planning and team-teaching time, and assessments built into the lessons).
They also show how students can benefit from technology implementation done well: online work adaptable to student ability level, schedule flexibility, and instant feedback. Absent from the discussion, however, is what parents think about technology-enabled education. Some parents will prefer more “traditional” instruction for their children; teachers and principals who want to use more technology in the classroom will need to make clear that it isn’t just the latest fad.
SOURCE: Shonaka Ellison and Gillian Locke (Public Impact), Breakthroughs in Time, Talent, and Technology: Next Generation Learning Models in Public Charter Schools, National Alliance for Public Charter Schools (September 2014).