Deconstructing Ohio’s testing report recommendations
"Test mania" debunked - now for the real work to improve testing in Ohio
"Test mania" debunked - now for the real work to improve testing in Ohio
Chances are, you’ve heard something in the past year about test mania. Everyone from superintendents to parents to retired educators has an opinion; even Secretary of Education Arne Duncan has suggested that tests and test prep are dominating schools. Given all this attention, one might assume that students spend hundreds of hours each year taking tests—perhaps even more time than they spend actually learning. A recent report from Ohio Superintendent of Public Instruction Richard Ross paints a very different picture.
The report, required by state law, reveals that Ohio students spend, on average, almost twenty hours taking standardized tests during the school year. (This doesn’t include teacher-designed tests, but it does include state tests.) Twenty hours is a good chunk of time, but the school year in Ohio totals roughly 1,080 hours (it varies by district and grade level), so testing takes up only about 2 percent of the year. (The report also shows that students spend approximately fifteen additional hours practicing for tests, but that extra time raises the total to only about 3 percent.)
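A quick back-of-the-envelope check of those figures (roughly twenty hours of testing and fifteen hours of test practice out of a 1,080-hour school year) bears out the report’s percentages:

$$
\frac{20}{1{,}080} \approx 1.9\% \approx 2\%, \qquad \frac{20+15}{1{,}080} \approx 3.2\% \approx 3\%
$$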
Small as that percentage is, critics of standardized testing still make some valid points. No one wants quality, in-depth learning to be pushed aside for superficial test prep, and a strong accountability system doesn’t have to mean a test-saturated system. That’s why Superintendent Ross’s report is so valuable: While it reinforces testing’s role in monitoring and improving student achievement, it also recommends limiting the time spent taking and prepping for tests.
The idea of reducing testing is popular, but exactly which tests should be cut? The report recommends eliminating student learning objectives (SLOs) from the teacher evaluation system for teachers in grades pre-K–3 and for teachers of non-core subjects in grades 4–12, as well as eliminating the fall third-grade reading test. Let’s look at each of these.
The Ohio Teacher Evaluation System (OTES) requires that between 42 and 50 percent of a teacher’s evaluation be based on objective academic growth measures. Since state assessments and their resulting data cover only certain grades and subjects, other means of estimating a teacher’s impact on learning have been developed for teachers in subjects like art, music, and gym. One of the most common is the student learning objective. The Ohio Department of Education defines a student learning objective as a measurable, long-term academic growth target that a teacher sets for students at the beginning of the year. SLOs are more than a stated learning objective: They require a fall pretest and a spring post-test, and they detail how target growth will be measured over time and why that level of growth is appropriate. According to the report, SLOs contribute as much as 26 percent of total student test-taking time. Replacing student learning objectives makes sense. What’s harder to buy is the report’s recommendation to expand the use of shared attribution.
Shared attribution is the practice of evaluating teachers based on test scores from subjects other than those they teach. For example, ODE recommends using a building’s or district’s overall value-added rating as a shared attribution measure. In other words, the process assigns a non-core subject teacher (music, physical education, art, etc.) an evaluation score based on how well students perform in their core classes (like English and math) with other teachers. Ohio Federation of Teachers President Melissa Cropper made a valid point when she stated that shared attribution doesn’t determine whether all the “shared” teachers are effective—it only points to the effectiveness of the core teachers. The purpose of teacher evaluations is to distinguish effective teachers from ineffective ones; shared attribution blurs that distinction.
Of course, ODE and Superintendent Ross are constrained by state law, which—as mentioned previously—requires at least 42 percent of a teacher’s evaluation to be based on an objective academic growth measure. Objective is the key word here, since it suggests that growth must be measured by tests (objective) rather than by classroom evaluations (subjective, since they’re conducted by the principal). Perhaps the testing burden of SLOs and the trouble with shared attribution make it worth asking why we don’t trust principals to evaluate teachers the way we trust supervisors in other fields to evaluate their employees. The legislature should be open to changing the law. Teachers are right to express concerns about being evaluated based on a colleague’s test scores rather than their own.
The report also suggests eliminating the fall third-grade reading test. This test is the first of many opportunities for third graders to demonstrate their reading proficiency as part of the Third Grade Reading Guarantee. Ross notes that, although Ohio has administered the third-grade reading test twice a year for the past decade (long before students were required to read on grade level as a condition of moving on to fourth grade), it is “impractical” to administer Ohio's new tests (which have two parts instead of one) within the first two months of the school year. The practicality argument is a good one, particularly given how important it is to establish routines for students in those early months and how many other diagnostic tests occur in that time frame. Eliminating the fall third-grade reading test would reduce testing time by 4.75 hours. Students will still have an additional chance to pass the test in the summer, and districts will still have the option of using a state-approved alternative test throughout the year to gauge student progress. In other words, teachers won’t sacrifice vital performance data, students will still have more than one chance to pass, and overall testing time will decrease.
Furthermore, this elimination offers a solution to the misleading headlines that accompany the release of fall scores. When fall 2013 scores were released, the press greeted them with gloom-and-doom headlines. Unsurprisingly, when spring results came out, many media outlets focused on the increase in passage rates from fall to spring. What’s misleading about this emphasis is that it ignores the nature of learning during a school year: Of course more third graders pass the test in the spring than in the fall—by spring, they have several more months of schooling under their belts. Releasing fall score reports so that the media can whip up a premature fear-of-retention frenzy doesn’t do kids any favors; it only stirs up false fears. Teachers should, of course, assess their students along the way, monitor students’ progress and needs, and share that information with families. But a formal assessment only two months into the year, accompanied by data that measure students who are closer to the end of second grade than to the end of third grade? Unfair and unneeded.
***
In the coming months, it will be interesting to see how the legislature, educators, and other stakeholders react to Ross’s recommendations. If there’s a good-faith effort to maintain accountability while limiting redundant and unnecessary testing, students will benefit—and parents will be relieved. However, the growing undercurrent in favor of removing all testing raises questions about who would stand to benefit in such a scenario. Let’s not forget that while students deserve an education that isn’t consumed by standardized tests, they also deserve schools that are held accountable for living up to their responsibility of providing an excellent education.
In spring 2013, Ohio policymakers approved a two-year, $250 million investment aimed at spurring innovation in public schools. Known as the Straight A Fund, this competitive grant program has since catalyzed sixty new projects throughout the state, many of which are joint ventures between schools, vocational centers, ESCs, colleges, and businesses.
As a member of the grant advisory committee, I gained a firsthand view of the exciting projects happening around the state, everything from “fab labs” (computer centers outfitted with computer-aided drawing software and 3-D printers) to outdoor greenhouses and robotics workshops. Those who are interested in these projects should plan to attend this conference in Columbus on February 5.
In the upcoming legislative session, lawmakers should continue to invest in innovation by reauthorizing the Straight A Fund. At the same time, the legislature should also consider a few alterations that could give an even stronger boost to the most innovative project ideas. The suggestions are as follows:
Remove the cost-reduction mandate.
A small provision in the Straight A legislation required grantees to show “verifiable, credible, and permanent” cost reductions that would result from the grant. As a result, applications were judged heavily on cost-reduction criteria. (You can read applications online here.) Although well intended, this provision created two problems:
First, applicants clearly struggled to quantify the cost reductions attributable to their project proposals. Some proposals made half-baked or underwhelming cost-reduction claims; for example, a few described how a million-dollar project would decrease their copying and utility expenses. (Nice to do, but hardly the overhaul in cost structure that many districts need.) Others claimed future cost reductions that would have occurred regardless of the grant. One district, for instance, counted its expected teacher retirements as a cost saving.
The cost-reduction mandate also stifled innovative project ideas. It effectively prohibited anyone from proposing the construction of a new school or starting a brand-new program from scratch. In cases like these, it proved difficult (if not impossible) for applicants to demonstrate future cost reductions, as there were no present costs to speak of.[1]
Reducing avoidable costs is a great thing to do. But mandating cost reductions as part of the Straight A grant criteria proved impossible to implement with fidelity and only stifled the possibilities of forward-thinking entrepreneurs.
Simplify the program goal to advancing innovation and student achievement.
The legislature should also remove other mumbo-jumbo in the grant requirements that arose out of Straight A’s appropriation language (e.g., the ambiguous requirements that grantees show “sustainability” and “utilize a greater share of resources in the classroom”). Instead, each application should be judged solely on its “innovative” merits—and on how that innovation relates to student achievement.
By boiling down the goal of Straight A to innovation and achievement, the grant application could be greatly simplified. (The 2015 grant application had twenty-five questions designed around the goals set forth in legislation.) A simplified application could, in turn, strengthen the proposals—more focused on the innovative nature of the project and on achievement gains—and down the road, we may see projects that are more authentically innovative and academically impactful.
Create a separate competitive grant program within Straight A for charter schools (and potential charters).
In the coming year, the state should set aside a portion of the Straight A grant appropriation for a competitive program just for charter schools (and whomever they want to partner with). There are two reasons to do this:
First, a competitive grant program for charters would put them on a more level playing field for grant funding. Out of the 423 entities that won a grant, only twenty-six were charters.[2] (Seventeen of these won on a single charter-consortium grant.) Charters got the short end of the stick competing against their big-brother districts—alone or in consortia of multiple districts, ESCs, vocational centers, and institutions of higher education. This was truly a David-versus-Goliath scenario, and in this case, Goliath usually won.
Second, a fund geared toward charters could provide seed funding for those seeking to open (or replicate) a new charter. Starting a school from scratch is a challenging undertaking that takes much planning and thought; unfortunately, charter school entrepreneurs have precious few supports to help them create successful new schools. (For instance, Ohio failed to receive federal funding to nurture charter startups in the last round of grants for which it was eligible; as a result, virtually no federal funds are available to kick-start charters in Ohio.)
Ohio policymakers must help entrepreneurs during the startup phase, and a competitive grant fund would reward the charter proposals that are the most compelling, innovative, and likely to succeed. The state needs to help high-performing charters replicate and expand their reach—too many Buckeye students continue to languish in low-quality schools. Let’s use the Straight A Fund to hasten the growth of high-performing charters in Ohio.
Hats off to Governor Kasich and the state legislature for encouraging and emboldening Ohio’s innovators. The winds of innovation have begun to blow. But as the state legislature considers reauthorizing Straight A (which I hope it does), it should also make some changes that could strengthen the program’s implementation, which might just stoke the flames even more. As economist Joseph Schumpeter argued long ago, “Innovation by the entrepreneur leads to gales of ‘creative destruction’ as innovations cause old inventories, ideas, technologies, skills, and equipment to become obsolete.” May that happen in Ohio’s public school system.
[1] In an interesting twist, applicants were not allowed to show how the grant could increase revenue. It does not appear that this restriction was set forth in the legislation that enacted the Straight A Fund (it may be in another part of Ohio law or code that governs grant programs like this).
[2] This statistic double counts organizations that were awarded funding under more than one grant.
If you could redesign a city’s education system from scratch, what would it look like? In New Orleans, a terrible tragedy created the need to do just that. Today, education in the city bears very little resemblance to what existed ten years ago. School types, locations, information systems, and application processes are now almost entirely market-driven, designed to give parents the information they need and the schools they want. The unprecedented landscape change in New Orleans has also given rise to a unique opportunity to study school choice through “revealed preferences”: which schools parents actually choose when they must make a choice, not just what they claim to want in a survey. The new report from the Education Research Alliance for New Orleans compares choice data from immediately pre-Katrina with data collected in two different years post-Katrina, as additional information and options settled into place over time.

First, the good news: After Katrina, the lowest-income families had greater access to schools with high test scores, average test scores increased across all students in the city, and even school bus transportation systems expanded (there’s no choice if you can’t get there). However, very-low-income families were less likely to choose schools with high test scores—even when those schools were easier to access than in a typical district system.

But this is not entirely bad news; it is important, useful, and potentially game-changing for choice advocates. The New Orleans study shows that a number of non-academic considerations (bus transportation, afterschool care, etc.) were not only valued by families but were often seen as mandatory. Families would “trade off” academic quality in measurable degrees to find what they really needed in a school. Yet even some of these non-academic factors were overridden in favor of academic quality when 1) school-quality information became simply and prominently available to families, 2) schools of persistently low quality were closed, and 3) almost all schools were available to families through the OneApp application system. All three of these factors appeared only in the last time period from which data were taken, raising the question of how much more important academic quality would have been had those factors been part of the choice landscape earlier.

So the takeaway for choice advocates must be that quality can be made to matter more to more families than it does in a typical district/choice hybrid system, and that the non-academic factors involved in choosing schools can actually be leveraged in service of quality (e.g., universal transportation, non-school-based aftercare). It shouldn’t take a natural disaster to embrace this level of market-driven choice, especially here in Ohio, where we have academic disasters of our own to address.
SOURCE: Douglas N. Harris and Matthew F. Larsen, “What Schools Do Families Want (And Why)?” Education Research Alliance for New Orleans (January 2015).
Cheers to State Auditor Dave Yost. Ohio’s Auditor last week released the results of unannounced visits his staff made to thirty charter schools back in October, comparing reported student enrollment numbers with actual on-site counts. Nearly a quarter of the schools showed “unusually high” discrepancies between the two figures. Some will cry “witch hunt,” but this is really just one more bit of evidence that it’s time to review and revamp (as necessary) Ohio’s charter school laws.
Cheers to Ohio Representative Bill Hayes. In his first interview upon taking the chairmanship of the House Education Committee, Hayes was asked about the prospect of more Common Core repeal efforts in the General Assembly. His response was a study in open-minded fairness on an issue where lightning bolts and flames are expected. He expressed interest in hearing from both sides on the issue, while not equivocating on his position as “a supporter of local control for school districts.”
Jeers to Lorain City Schools’ new Board President Tony DiMacchia. Mr. DiMacchia is a proud native of Lorain and a cheerleader for his district, as you might expect from a school board president. But what message does it send when the leader of one of only two districts under the control of a state Academic Distress Commission demeans charter school quality (and families who choose them) with no supporting information and blames those families’ choices for financial woes in his district? Additionally, Mr. DiMacchia describes how, long before charter schools ever came on the scene, his parents gave him the option to choose a private school over Lorain City Schools when the district was forced to cut sports due to budget issues. He picked the sports and his parents got the bill. Folks who can’t afford tuition or to move out of Lorain get a district in academic distress.
Cheers to Cynthia C. Dungey, director of the Ohio Department of Job and Family Services. As an instructive counterpoint to the above, Director Dungey’s stroke of educational good fortune years ago—a scholarship in the inaugural class of what would become one of the most prestigious private schools in the Columbus area—is still a driving force behind her mission today. “Services to one person won’t necessarily look like services for another,” she says of her lifelong work in human services. “It needs to be individualized.” The same can be said of education and should be said more often.
Jeers to the “trust gap” that appears to exist in Akron City Schools. A survey of Akron teachers conducted by their union and published recently produced some fairly predictable responses on topics such as teacher morale and standardized testing. Teachers feel beaten down and dislike tests. But responses on the issues of trust in and collaboration with district administration were downright disturbing (84.4 percent of educators feel they do not have a voice in the direction of the district). With new standards and tests, teacher-evaluation implementation underway, and discipline policy changes ahead, such a lack of trust bodes ill for the district and its students.
Today marks the start of National School Choice Week. Across the country, over 11,000 events will take place, from the intimate (school open houses and homeschool how-to sessions) to the enormous (rallies at state capitols), from our own gathering to online events. It is one week of the year during which the focus is on the benefits parents and children gain from having the opportunity to choose the school that best fits their needs.
School choice in Ohio comes in many forms, including public charter schools, private schools (and voucher programs that help needy students pay private tuition), open enrollment, STEM schools, vocational centers, post-secondary enrollment options, and home schooling. Among these choice options, charter schools have clearly become the most prominent feature of Ohio’s school-choice environment; they educate over 120,000 students, many of whom come from low-income families.
Given the high profile of charter schools, it is worth pausing during School Choice Week to honor the very best of them. The table below is an honor roll of twenty-two Ohio charter schools that ranked in the top ten percent on either the state’s performance-index score (student achievement) or its value-added-index score (student growth over time). One school, Columbus Preparatory Academy, was rated in the top ten percent in both categories. An asterisk next to a school name indicates that the school also made our top-quality charters list in 2012–13 (fourteen of the twenty-two schools are second-time recipients).
Table: The top-performing charter schools in Ohio, 2013–14
[[{"fid":"114012","view_mode":"default","fields":{"format":"default"},"type":"media","attributes":{"height":"309","width":"626","class":"media-element file-default"},"link_text":null}]]
Source: Ohio Department of Education, Ohio School Report Cards (school ratings), and the National Alliance for Public Charter Schools (management organizations)
We are proud to recognize and honor these schools during School Choice Week and to offer them as proof that charter schools can and do succeed, adding to the number of high-quality seats available to parents, especially in Ohio’s urban areas. We need many more such seats, no matter what form of school provides them, and that’s what School Choice Week is all about.
A standard argument of those who downplay strong results among children in urban charters is that families motivated enough to exercise school choice are simply different, and their kids’ success is nearly preordained. This recent paper out of the National Bureau of Economic Research tests that assumption by studying the causal effect of takeover schools on student achievement in New Orleans’s Recovery School District (RSD). Specifically, it looks not at the impact of charter school admissions lotteries on the performance of kids who apply, but at the impact on the kids who don’t make a choice to apply—passive participants who are simply grandfathered into the newly constituted school.

The sample includes eleven middle schools in the RSD that were slated for closure (called “legacy schools”) and subject to a full charter takeover, meaning all grades were converted to a new school in a single academic year, typically in the same building. The comparison group consists of same-grade students enrolled in schools not yet closed who, in the prior grade, attended a school similar to the one the legacy-school students attended. Schools are “similar” if their performance scores are comparable to the legacy schools’, and students are matched on race, sex, age, poverty, and other demographics. The “pre-takeover trajectories” of both groups of students are quite similar.

The authors find that attending an RSD takeover charter substantially increases math and ELA scores (roughly 0.21 and 0.14 standard deviations, respectively, per year enrolled). Takeover effects are larger in seventh and eighth grade and in the first two years of takeover. The study was then replicated for a school in Boston where the authors also had lottery estimates, and the gains for grandfathered students were at least as large as the gains for those who got in via lottery.

The analysts sum up the gist of the study quite well: Conventional wisdom says that “urban charter lottery applicants enjoy an unusually large and therefore unrepresentative benefit from charter attendance because they’re highly motivated or uniquely primed to benefit from the education these schools offer. [Yet] Boston and RSD takeovers generate gains for their passively enrolled students that are broadly similar to, and sometimes even larger than the lottery estimates reported in [other research].” Very interesting. Charter takeovers of traditional schools are fraught with controversy, mostly among adults; this study says they are beneficial to kids.
SOURCE: Atila Abdulkadiroğlu, Joshua D. Angrist, Peter D. Hull, and Parag A. Pathak, "Charters Without Lotteries: Testing Takeovers in New Orleans and Boston," National Bureau of Economic Research, Working Paper 20792 (December 2014).