Progress in Dayton Public Schools’ teacher-policy reforms
A look at the past year in teacher-policy reforms in Dayton.
Almost a year has passed since the National Council on Teacher Quality (NCTQ) published Teacher Quality Roadmap: Improving Policies and Practices in the Dayton Public Schools. The report, funded jointly by the Thomas B. Fordham Institute and Learn to Earn Dayton, analyzed teacher policies and related practices within the district, with the goal of identifying short- and long-term improvements to policy and practice that could in turn increase the quality of the teaching force.
As Learn to Earn Executive Director Tom Lasley noted in June of 2013, when the report was released, teachers’ impact matters immensely, especially in a region and district that has seen significant population declines and has confronted (and continues to confront) economic challenges.
NCTQ framed its analysis and findings around five key areas: staffing, evaluations, tenure, compensation, and work schedules. Analysts met with teachers, principals, community leaders, and other stakeholders, and they reviewed district policies and state law. The result was a slate of recommendations—some easier to tackle (e.g., maintaining the current schedule of teacher observations under the new evaluation framework) and some harder (e.g., giving principals the authority to decide who works in their buildings).
District superintendent Lori Ward and her colleagues got to work and, by December of 2013, accomplished several significant improvements. Among them, principals are no longer forced to accept transferred teachers to fill vacancies; rather, principals have the ability to select the most qualified candidate (including new hires).
Additionally, reductions in force, which in the past were based on seniority and sometimes resulted in the removal of excellent teachers from the classroom, are now based on teacher evaluations and district needs. Seniority is still considered, but only as a tiebreaker in situations where evaluations are similar.
The district also acted on NCTQ recommendations related to tenure and compensation. Two committees—a tenure committee and a compensation committee—were established, each comprising equal numbers of teachers and administrators. The former committee focuses on awards of tenure, mandatory professional development, and building-level academic teams. The latter committee works on teacher incentives for hard-to-staff buildings and content areas, salary increases based on evaluations, and expanded career ladders that enable teachers to increase their compensation while remaining in the classroom (i.e., teacher leaders).
At the time the NCTQ report was released, there was scant news coverage—and what coverage existed focused on the need to improve. To the best of our knowledge, nobody covered the improvements that have been made, none of which made the easy-to-tackle list. These improvements are a significant accomplishment, and the work of Superintendent Ward, her district colleagues, and members of the teacher union who worked on the changes merits recognition.
A couple of years ago, Fordham held a contest to determine the most reformed state in the land. To almost no one’s surprise, Indiana—under the leadership of Governor Mitch Daniels and State Superintendent Tony Bennett—raced to victory. Indiana was held up as a model of education reform, and we encouraged other states to follow its path. Today, we again ask you to look to Indiana—but for precisely the opposite reason.
Hoosier State legislators, like those in Ohio, have come under increasing pressure from a small, vocal set of Common Core State Standards (CCSS) critics urging the state to repeal its adoption of the standards. Indiana acceded to their demands as Governor Mike Pence signed legislation on March 24 making Indiana the only state in the nation to formally withdraw its participation in the CCSS. And in what happened next, there are lessons for Ohio legislators who think there are political or educational benefits to exiting the CCSS.
First, states need to have standards in place, but good standards take time to develop. Indiana’s crash course in standards-writing over the past couple of months, aimed at having new standards in place this fall, has left almost everyone disappointed and frustrated. Critics of Indiana’s go-it-alone approach have suggested that the changes were nothing more than a rebranding of the CCSS. Educators, meanwhile, are also feeling the pressure: The Republic quoted Indiana State Teachers Association Vice President Keith Gambill as saying, “Any delay past that time (April 28 meeting to approve new standards) really then puts the professionals in a compromised position. At some point in time, there has to be: This is it.”
Second, when a state changes its standards, no assessments aligned with the newly adopted standards exist. This is the case right now in Indiana. Stop for a moment and think about the myriad ways we use assessment scores today. The primary reason remains to evaluate whether students are learning what they are expected to at each grade level. This is especially critical for key points, such as the third grade (due to the third-grade reading guarantee) and upon high-school graduation, but it also allows teachers to identify students who are struggling or excelling and to provide extra supports as necessary. Ohio also uses assessment results to calculate school and district grades and as a factor in teacher evaluations. Backing away from results-based accountability would do nothing for Ohio’s children in the long run and would be frustrating to educators, parents, and taxpayers alike.
Third, simply rewriting standards doesn’t ensure the result will be higher-quality standards. In fact, Indiana’s substitute standards are not only worse in terms of content and rigor than the Common Core, but they are also worse than Indiana’s old, pre-CCSS standards (which were quite good, though never well implemented). And if you think CCSS critics think differently, think again: Stanford professor James Milgram, a high-profile CCSS critic, called Indiana’s new standards “pretty much a complete mess” and noted that “they are repetitive and horribly disorganized.”
It is worth noting that because Indiana had high-quality standards before its adoption of the CCSS, reverting to its previous standards is a viable option. Ohio, on the other hand, had mediocre state standards previously and would pretty much be starting anew, were it to abandon the CCSS in math and English language arts.
Fourth, switching standards without really knowing what to do next is a waste of money. A report last year from Indiana’s nonpartisan legislative staff estimated tens of millions of dollars in costs to adopt new tests, plus ongoing costs to administer them. That’s not even counting money already spent—by districts as well as the state—on teacher training, textbook purchases, and so on, in addition to the funds that must be found to do it all over again to match the new standards.
Ohio legislators should learn from their brethren to the west. Indiana, at the behest of critics, decided to stop utilizing the CCSS, and the results—while predictable—have left many shaking their heads. Nearly everyone is unhappy with the new draft standards, be they friend or foe to the CCSS. Educators are concerned about the lack of standards and aligned assessments, the new standards themselves are generally seen as inferior, and the state looks likely to spend a lot of money on implementation without knowing exactly where its next step will fall. There don’t appear to have been any gains educationally or politically in Indiana. Ohio would do well to avoid such chaos.
After a controversial change to a state law, what happens on the ground? This piece, from last month’s meeting of the Association for Education Finance and Policy, delves into one such case. In 2012, Ohio lawmakers approved the Ohio Teacher Evaluation System (OTES), which requires that evaluations be based on student-academic-growth measures, formal observations, and classroom walkthroughs. This study examines whether local teacher-collective-bargaining agreements negotiated after OTES was adopted allow the evaluation results to be used in personnel decisions (the authors call this “bridging”)—or whether they protect experienced or tenured teachers’ jobs regardless of their evaluation scores (“buffering”). The authors found that all fifteen contracts they studied are essentially bridging when it comes to evaluation policies, meaning that the contracts match well with state law and allow principals to use growth measures, observations, and walkthroughs when evaluating teachers.

However, results were quite different when it came to actually using OTES to make decisions: the researchers discovered that four in five contracts are buffering when they examined a variety of specific provisions. For example, most contracts contain buffering provisions that keep seniority as a consideration when making reductions in force. Some even keep seniority as the primary or sole means of deciding who is laid off first, despite state law to the contrary. Regarding transfers, only three districts have bridging contracts that give administrators discretion to fill vacancies; most keep seniority as a consideration, and none explicitly requires the use of OTES scores in transfer decisions. For tenure decisions, only three districts include OTES in their procedures for removing poorly performing teachers. Finally, only three districts have bridging provisions for compensation, meaning they consider teachers’ OTES scores when deciding how much to pay them.
While this doesn’t seem promising for OTES implementation, these fifteen districts are the first to renegotiate their contracts after the adoption of the new system. Many more districts have yet to negotiate their new agreements. The findings do indicate that these districts are clearly hesitant to codify the use of OTES when making personnel decisions. By excluding OTES from their contracts, they might be undermining something they don't like (perhaps with the prodding of the teacher union), or—less nefariously—avoiding something they don't yet trust.
W. Kyle Ingle, P. Christian Willis, and James Fritz, “Collective Bargaining Agreement Provisions in the Wake of Ohio Teacher Evaluation System Legislation,” presented at the Annual Meeting of the Association for Education Finance and Policy, San Antonio, TX, March 2014.
In 2013, there were a shocking number of charter-school failures across Ohio, including seventeen in Columbus—most of them first-year startups. In response, the Ohio Department of Education required additional paperwork from six authorizers (often referred to as sponsors) looking to start new schools in the 2014–15 school year, hoping to zero in on weak structures and poor advance planning before startup funds were released and students began attending the schools. Last Friday, the department took an unprecedented step and issued a stern warning to three authorizers that they will be “shut down” if they proceed with plans to open six new community schools. The identified deficiencies shared a common thread: connections or similarities to other charters that had ceased operation voluntarily or had been shut down. It’s a shame that this step was necessary, but the recent track record of Ohio’s authorizers suggests there was a need for additional scrutiny. We applaud this bold step and commend State Superintendent Richard Ross and his team for swift and decisive action.
Former Ohio governor Jim Rhodes wrote in 1969, “Many of today’s social and economic ills result from a lack of employment among the able-bodied. The lack of employment stems directly from inadequate education and training.” Governor Rhodes continued, asserting that vocational-training programs for young women and men could help to meet the demands of a changing modern-day economy.
Fast-forward forty-five years: Ohio has changed substantially, but, like Governor Rhodes, the state’s policymakers are again hitching their wagons to vocational education. Retro is in, and that’s a good thing: vocational education—a.k.a. “career and technical education”—has the potential to open new pathways of success for many teenagers.
Little, however, is known about how Ohio organizes its vocational-education programs or how students in them fare. Cue the state’s new report cards, which include helpful information about the state’s vocational programs. The following looks at the report cards, yielding five takeaways regarding Ohio’s vocational options.
Point 1: CTPDs and JVSDs are not the same
Ohio has two key entities in the realm of vocational education: (1) Career and Technical Planning Districts (CTPDs) and (2) Joint Vocational School Districts (JVSDs, also called “career-tech centers”). CTPDs are an administrative entity, while JVSDs are direct vocational-education providers. Both CTPDs and JVSDs are composed of member school districts; however, while all districts are part of a CTPD, not all districts are part of a JVSD.
Throughout Ohio, ninety-one CTPDs oversee vocational programs. CTPDs have at least one member school district (often more), and each CTPD has a “lead” district that approves the vocational programs of its member districts, charters, and STEM schools. The Ohio Department of Education (ODE) assigns districts and schools to CTPDs, and it issues report cards to CTPDs.
JVSDs are regional vocational centers—“vo-techs”—that draw students, typically in grades 11 and 12, from their member districts. One can determine a JVSD’s member districts by searching here, and nearly all JVSDs have an open-enrollment policy. Some JVSDs have statewide open enrollment; others are open only to students residing in districts adjacent to a member district. State regulations allow charter- and private-school students to enroll in a JVSD, too.
Ohio has forty-nine JVSDs, and each one is a member of a larger CTPD. For those CTPDs without a JVSD—there are forty-two such CTPDs—their member school districts provide vocational education within the district.
Point 2: Ohio released its first CTPD report cards in 2013
The state’s new CTPD report cards contain data on “concentrators”—students who have left high school, either as graduates or as withdrawals, and who have taken a majority of their high-school courses in a particular area of vocational study. (There are sixteen areas of vocational study, including agriculture, health services, manufacturing, hospitality, etc.) Unlike a district or charter school’s report card, which generally accounts for students still in the school, CTPD report cards (with the exception of the “nontraditional-participation” metric) account only for students in the year that they left high school.
Across the state’s ninety-one CTPDs, 32,403 students were included in the 2012–13 report cards. The report cards include several metrics, among which the graduation and post-placement rates receive an A–F letter grade.
A JVSD’s report-card results are displayed on the report card of its overseeing CTPD, though they do not receive any A–F letter grades. The report-card data for the two entities are practically equivalent in all cases; hence, it appears that the large majority of the vocational students in a CTPD with a JVSD receive their vocational education at that JVSD.
Point 3: Remarkably high graduation rates across CTPDs
Chart 1 shows that the average graduation rate among CTPDs was an exceptional 95 percent, above the statewide average of 84 percent. Odds are, then, that vocational students will cross the high-school finish line—and that’s great news. But due to the lack of variation across CTPDs, the graduation-rate indicator doesn’t help us compare the performance of one CTPD relative to another. In fact, when the graduation rates are converted into an A–F letter grade, eighty-one of the ninety-one CTPDs earned an A or B.
Chart 1: Five-year graduation rates are above state average and approach 100 percent
Source: Ohio Department of Education (also for Charts 2 and 3)
Notes: Chart displays the five-year graduation rate for each CTPD (represented as a bar). Note the different vertical scale for Chart 1 compared to Charts 2 and 3. (No CTPDs had graduation rates under 75 percent.)
Point 4: Solid post-high-school placement rates, but…
CTPDs are required to survey students who left their schools in the previous year. The survey asks the students whether they were in a job, an apprenticeship, post-secondary education, or the military within three to six months after leaving school. The survey response rate is strong: CTPDs report an average of 94 percent. That being said, a few CTPDs reported low response rates. Cleveland Municipal CTPD, for example, reported a dismal 32 percent response rate.
Chart 2 displays the percentage of concentrators who reported a post-high-school placement on these surveys. The average placement rate is a respectable 86 percent. There is one significant outlier: Lorain City CTPD reported a 26 percent placement rate (yikes!). Unfortunately, the report cards do not sort the survey data by whether concentrators landed in college, work, or the military, and we don’t know anything about whether the employment, if applicable, is “gainful” or not. (I surmise that a fast-food job could count as “placement.”)
Chart 2: Post-placement rates for CTPDs are high, but “placement” seems broadly defined
Notes: Chart displays the placement rates—the percentage of concentrator students who report on a survey that they are employed, in a post-secondary education program, or in the military—for each CTPD (represented as a bar). The average survey response or “status-known” rate is 94 percent.
Point 5: Few students receive an industry credential (with wide variation across CTPDs)
The CTPD report cards attempt to capture the percentage of students who earned an industry credential while in high school or shortly after graduation. The report-card data, however, indicate that just one in four vocational students earned a credential. Yet, as Chart 3 displays, there is tremendous variation in credentials earned across the CTPDs. Some CTPDs reported zero or close to zero students earning a credential, while others reported that upwards of 75 percent of their students earned one. U.S. Grant CTPD in Southwest Ohio and Trumbull County CTPD in Northeast Ohio topped the list at 92 and 93 percent, respectively.
The variation may be explained by the uncertainty around what constitutes a “credential,” since its definition is evidently a work in progress. (The credentialing standards still appear to be in draft form.) Perhaps Ohio’s CTPDs were unclear about how to report a credential earned during the first round of report cards in 2012–13. Or perhaps the variation is a matter of some CTPDs providing greater credentialing opportunities than others. The “industry credential” is in its infancy, but it’s an indicator worth keeping an eye on.
Chart 3: Percentage of vocational students earning an industry credential varies widely across CTPDs
Notes: Chart displays the percentage of concentrator students who earned an “industry credential” during high school or immediately after. Each CTPD is represented as a bar.
Conclusion: New CTPD report cards are a big step forward, but performance metrics need fine-tuning
Vocational education is a vital cog in Ohio’s education system, and the CTPD report cards greatly improve our understanding of the state’s vocational choices. When it comes to quality, we know this much for certain: CTPDs do a superb job ensuring that their students graduate high school.
Beyond graduation rates, however, things get murky. The results from the post-placement surveys are solid, but the response rates for a few CTPDs are low. If non-respondents are more apt to be unemployed or out of school, the post-placement rates for those CTPDs could be less rosy. Moreover, we have nary a clue from the surveys about the short-term labor-market outcomes for vocational students in terms of salary, number of hours worked, or whether the job is in their field of study. Meanwhile, the “industry credentials” metric remains ill-defined and is clearly in an embryonic stage.
Kudos to state policymakers for lifting up vocational education and for implementing a brand-new report card for CTPDs. However, as the state refocuses attention on vocational education—and perhaps spends more money on it—we cannot neglect improvements to how outcomes are tracked and reported for the state’s CTPDs and vocational centers.
Innovation Ohio’s broadside on charter schools—and, by extension, the parents who select them and the children who attend them—is outrageous. The report is flawed not because it critiques charters, per se, but because of the Swiss-cheese analysis that supposedly bolsters its conclusions. The report’s author makes two analytical faux pas; each is discussed in turn.
First, the report’s suggestion that most charter students land in a lower-performing school, relative to the district-run school they came from, is bunk because of the absence of analysis at the student level. When Community Research Partners, a Columbus-based research organization, analyzed student data from the Ohio Department of Education in a project supported by Fordham and ten other organizations, its analysts discovered that the majority of charter students transferred to a charter rated the same or better than the district school they came from. Of the charter students in Cincinnati, Cleveland, Columbus, Dayton, and Toledo—locations with relatively large concentrations of charters—39 percent went to a higher-rated charter school and 26 percent went to a charter school rated the same as the district school they had previously attended. (The students’ transfer data were taken from October 2009 to May 2011; at that time, the state issued school buildings an overall rating.) In the meantime, if we wanted to conduct an empirical evaluation of Ohio’s charter-school effectiveness relative to district schools, the richest analysis outside of a randomized experiment would be a student-to-student comparison, using achievement results from district and charter students from very similar backgrounds.
Second, the report’s fiscal analysis is egregiously incomplete. The report ignores the fact that traditional school districts, on average, generate roughly half their revenue via local property-tax dollars. For wealthier districts, local taxpayers’ share is well over half the district’s revenue—hence their relatively small amounts of state aid. All told, Ohio property owners paid nearly $8 billion, or about $4,000 per pupil, in taxes to their local district last year. But with just a few exceptions, charter schools receive zero revenue from local property taxes. (This even as charter-school parents pay their local taxes, whether directly as property owners or through higher rent.) In addition, the report ignores the fact that charter schools (a) receive virtually no state facility funding (the FY 2014–15 budget appropriated, for the first time, up to $100 per pupil for charter facilities); (b) receive no “guarantee” funding (typically, transitional state aid for shrinking districts); and (c) receive none of the $1.6 billion in property-tax reimbursements distributed to districts. If all revenue streams were accounted for, charter students would be funded approximately 15 to 20 percent less than their district-school peers.
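The accounting above can be made concrete with a minimal sketch. The $4,000-per-pupil local property-tax figure comes from this article; the state-aid amounts below are hypothetical placeholders chosen for illustration only, since per-pupil state aid varies by district and school.

```python
# Sketch of the per-pupil funding comparison described above.
# The $4,000 local-tax figure is from the article; the state-aid
# amounts are hypothetical placeholders for illustration.

district_state_aid = 6_000   # hypothetical state aid per district pupil
district_local_tax = 4_000   # local property taxes per pupil (from the article)
district_total = district_state_aid + district_local_tax

# Charters receive state aid but (with few exceptions) no local
# property-tax revenue, no guarantee funding, and almost no facility aid.
charter_state_aid = 8_000    # hypothetical: even with higher state aid...
charter_total = charter_state_aid

gap = 1 - charter_total / district_total
print(f"Charter pupils funded {gap:.0%} less than district pupils")
# ...the charter pupil still comes out behind once local revenue is counted.
```

With these placeholder numbers, the gap works out to 20 percent, in line with the article’s 15-to-20-percent estimate; the point of the sketch is simply that any comparison omitting local revenue, guarantee funding, and reimbursements overstates what charters receive.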
We at Fordham have been candid about the performance of Ohio’s charters. The evidence—both from our own experience and from the data—indicates that charters, as a group, must improve significantly. Moving forward, Ohio’s charters must demonstrate their worth to parents and taxpayers through student achievement that is better than that of the district schools from whence their students came. The starting point for such improvements, however, should be sound research; in the meantime, what Ohioans don’t need are fictional reports and the politically driven chatter based on them.