Ohio Senate maintains state's commitment to high-quality standards
MBR changes in the Senate maintain commitment to Common Core.
Last week, the Ohio Senate passed House Bill 487, also known as the education Mid-Biennium Review (MBR), with overwhelming support (by a vote of twenty-seven to five). The MBR contains a wide variety of education-policy changes, including some modifications that affect Ohio’s academic content standards and assessments.
Ohio’s current learning standards, adopted in 2010 by the State Board of Education, include standards for students in grades K–12 in English language arts, math, science, and social studies. When the standards were adopted four years ago, there was public input but little fanfare or controversy. That changed about a year ago, when critics began focusing on the math and English language arts standards, a.k.a. the Common Core State Standards (CCSS).
As opposition to the CCSS heated up all over the country (the standards were adopted by forty-five states), the focal point in Ohio was House Bill 237, which proposed repealing CCSS completely. The bill, sponsored by Representative Andy Thompson, received two hearings in the House Education Committee, with the last hearing in November 2013 drawing more than 500 people to the Statehouse.
The Senate’s changes in the MBR address some of the chief concerns raised at the November bill hearing. The key proposed changes are described below.
It’s too early to tell which of these changes will become law, as the MBR still has to go to conference committee so the House and Senate can work out their differences. However, with these changes, the Ohio Senate appears to have effectively threaded the needle. It has reasserted Ohio’s commitment both to high-quality standards designed to prepare our students for success after high school and to rigorous assessments aligned to those standards. Meanwhile, the Senate has rightly listened to the reasonable concerns of parents and teachers across the state. Hopefully, educators around the state can breathe a little easier knowing that the standards they’ve been working hard to implement over the past four years won’t be changed in the final hour.
Like the Cleveland Browns on a Sunday afternoon, the Ohio General Assembly is fumbling about with the state’s value-added system. One month ago, I described two bizarre provisions related to value-added measures (VAM) that the House tucked into the state’s mid-biennium budget bill (House Bill 487). The Senate has since stripped out one of the House’s bad provisions—and kudos for that—but, regrettably, has blundered on the second one.
To recap briefly, the House proposals would have (1) excluded certain students from schools’ value-added computations and (2) changed the computation of value-added estimates—the state’s measure of a school’s impact on student growth—from a three-year to a one-year calculation.
I argued then that the House’s student-exclusion provision would water down accountability and that reverting to one-year estimates would increase the uncertainty around schools’ value-added results.
The Senate has struck the House’s exclusion provision. Good. But it has failed to rectify the matter of the one-versus-three-year computation. In fact, it has made things worse.
Here’s the Senate’s amendment:
In determining the value-added progress dimension score, the department shall use either up to three years of value-added data as available or value-added data from the most recent school year available, whichever results in a higher score for the district or building.
Now, under the Senate proposal, schools would receive a rating based on whichever VAM estimate is higher—either the one-year or the three-year computation. (Naturally, schools that just recently opened would not have three years of data; hence, the “as available” and “up to” clauses.)
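In code, the proposal boils down to a one-line maximum. Here is a minimal sketch in Python; the function names are ours, and the cut points that map scores to letter grades are illustrative assumptions, not the department’s actual thresholds.

```python
# A sketch of the "higher score" provision as written. Function names are
# hypothetical; the cut points below are illustrative assumptions, not the
# department's actual grading thresholds.

def rated_score(one_year, multi_year=None):
    """Return the score a school is rated on under the Senate amendment."""
    if multi_year is None:  # e.g., a newly opened school with one year of data
        return one_year
    return max(one_year, multi_year)  # "whichever results in a higher score"

def letter_grade(score, cuts=(-2.0, -1.0, 1.0, 2.0)):
    """Map a value-added score to an A-F grade using assumed cut points."""
    for grade, cut in zip(("F", "D", "C", "B"), cuts):
        if score < cut:
            return grade
    return "A"

# A school with one strong recent year outruns its mediocre three-year record:
print(letter_grade(rated_score(one_year=2.3, multi_year=0.4)))  # "A"
```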
Huh? How is this rational accountability? The Senate seems to have fallen into the Oprah zone: “you get an A, you get an A, everybody gets an A!”
I exaggerate, of course. Not everyone would get an A under the “higher score” policy. But let’s consider what happens to school ratings under the three scenarios in play: the one-year value-added computation (House), the three-year computation (current policy), and the higher of the two scores (Senate).
Chart 1 compares the letter-grade distribution under the one-year versus the three-year (i.e., multi-year) estimates. As you can see, the three-year scores push schools toward the margins (As and Fs) while diminishing the number of schools in the middle (Cs). This is to be expected, given what we know about the greater imprecision of one-year value-added estimates. Greater imprecision tends to push schools toward the middle of the distribution, sans clear evidence to suggest they’ve had a significant impact, either positive (A) or negative (F). In short, when the data are “noisier”—as they are under the one-year estimates—we’re more likely to wind up with more schools in the mushy middle.[1]
Chart 1: Clearer view of value-added impact under multi-year scores: Multi-year scores push schools toward the margins (A or F); One-year scores push schools toward the middle (C)
Source: Ohio Department of Education. For 2012-13, multi-year VAM scores are available publicly; the author thanks the department for making schools’ one-year VAM scores accessible at his request. (One-year VAMs, school by school, are available here.) Notes: The one-year A-F ratings are simulated, based on schools’ one-year VAM scores for 2012-13. (The “cut points” for the ratings are here.) The multi-year A-F ratings for 2012-13 are actual letter grades, based on schools’ VAM scores (up to three years) from SY 2011, 2012, 2013. Chart displays the school-level (district and charter) distribution of letter grades (n = 2,558).
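The intuition behind chart 1 isn’t hard to simulate. In the rough sketch below, the spread of schools’ true impacts, the measurement noise, and the grading thresholds are all assumptions of ours rather than the department’s actual model; the point is simply that averaging three years of data shrinks the standard error (by roughly the square root of three) and pulls schools out of the mushy middle.

```python
# Rough simulation of why noisy one-year estimates crowd schools into the
# middle grade. Every distribution and threshold below is an illustrative
# assumption, not ODE's actual value-added model.
import random

random.seed(1)
N_SCHOOLS = 2558                     # school count from the charts above
SE_ONE = 1.0                         # assumed standard error of a one-year estimate
SE_MULTI = SE_ONE / 3 ** 0.5         # averaging three years shrinks the error

def grade(index):
    """A-F based on an assumed gain index (estimate / standard error)."""
    if index >= 2:
        return "A"
    if index >= 1:
        return "B"
    if index > -1:
        return "C"
    if index > -2:
        return "D"
    return "F"

for label, se in (("one-year", SE_ONE), ("multi-year", SE_MULTI)):
    grades = []
    for _ in range(N_SCHOOLS):
        true_impact = random.gauss(0, 1)              # assumed spread of real impacts
        estimate = true_impact + random.gauss(0, se)  # observed with noise
        grades.append(grade(estimate / se))
    print(label, {g: grades.count(g) for g in "ABCDF"})
```

Run it and the multi-year column piles up more As and Fs, the one-year column more Cs, mirroring the pattern in chart 1.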
Now, let’s look at the Senate’s “higher score” proposal—the real whopper of them all. Consider chart 2, which adds the higher of the two value-added scores (the green bar). What you’ll notice is that the number of As would likely increase under the proposal, so that virtually half the schools in the state would receive an A. On the other end of the spectrum, the number of Fs would be cut in half, so that just one in ten schools in the state would receive an F.
Chart 2: Roughly half of schools would get A under “higher score” provision
Are half the schools in Ohio making significant—meaningfully significant—gains for their students? And are just one in ten schools failing to move the achievement needle in a significant way? Let’s get real.
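Simple statistics also explains the drift: the maximum of two noisy estimates of the same quantity is biased upward. Extending the illustrative simulation above (same assumed parameters, not the department’s actual model) makes the upward pull visible.

```python
# Extending the same illustrative simulation to the Senate's rule: grade each
# school on whichever gain index is higher. Magnitudes here depend entirely on
# the assumed parameters, but the upward drift in grades is mechanical.
import random

random.seed(2)
N_SCHOOLS = 2558
SE_ONE, SE_MULTI = 1.0, 1.0 / 3 ** 0.5
counts = {g: 0 for g in "ABCDF"}
for _ in range(N_SCHOOLS):
    true_impact = random.gauss(0, 1)
    one_year = (true_impact + random.gauss(0, SE_ONE)) / SE_ONE
    multi_year = (true_impact + random.gauss(0, SE_MULTI)) / SE_MULTI
    index = max(one_year, multi_year)  # "whichever results in a higher score"
    if index >= 2:
        counts["A"] += 1
    elif index >= 1:
        counts["B"] += 1
    elif index > -1:
        counts["C"] += 1
    elif index > -2:
        counts["D"] += 1
    else:
        counts["F"] += 1
print(counts)  # As swell and Fs shrink, relative to either computation alone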
As I’ve maintained, current policy—the three-year computation—is the best course for policymakers. It gives us the clearest look at a school’s impact, both good and bad, on student performance. The finagling of value-added isn’t just an academic exercise, either—it has considerable implications for the state’s automatic charter school closure law, voucher eligibility, academic distress commissions, and a number of other accountability policies. Can Ohio’s policymakers rectify this value-added mess? As with the Browns’ playoff chances, here’s hoping!
[1] Of course, not all schools in the C range are there because of imprecision—some schools may well have had a genuinely middling impact on learning gains, one that is statistically indistinguishable from zero.
Last week, School Choice Ohio sued two Ohio school districts for their failure to comply with a public-records request. The organization is seeking directory information for students eligible for the EdChoice Scholarship Program from the Cincinnati and Springfield Public Schools. Actions to enforce public-records requests are rarely exciting, but the outcome of SCO’s effort could have important ramifications for tens of thousands of students and their families across the state.
Despite Ohio’s status as a national leader in providing private-school choice options to students—the state has five separate voucher programs—there is no established mechanism for informing families eligible for the EdChoice Scholarship program (Ohio’s largest voucher initiative) of their eligibility. The law doesn’t require school districts or the Ohio Department of Education to perform this vital function.
Enter School Choice Ohio (SCO), a Columbus-based nonprofit organization, which has worked tirelessly since the beginning of the EdChoice program to conduct outreach to families across the Buckeye State who are eligible to send their child to a private school via a voucher. SCO typically sends postcards and makes phone calls letting families know that their children may be eligible, giving them a toll-free number to call for an information packet and answering any questions families may have about eligibility and the private-school options in their area.
This is critical work, as the EdChoice Scholarship is designed to provide students in Ohio’s very lowest-performing schools the option to attend a private school.
To conduct this outreach, SCO makes a public-records request for directory information to superintendents of school districts whose students are eligible for the EdChoice Scholarship. “Directory information” can encompass a number of district-chosen parameters but typically includes a student’s name, address, phone number, grade level, and school-building assignment. It is the kind of information you might find in a student directory handed out to families along with a student handbook at the start of each school year.
The statutory language clearly states that if directory information is collected and distributed at all, it is a public record that can be requested by a nonprofit as long as it isn’t “for use in a profit-making plan or activity.” In other words, as long as it’s for a nonprofit exactly like SCO.
How do we know all this? We both worked at SCO for many years, making these public-records requests and helping interested families contacted via directory information.
In the main, districts grudgingly but professionally complied with public-records requests for directory information. One perennial resister to SCO’s requests was Cincinnati City Schools. Every year, it would send back a letter through its lawyer saying, in essence, “We know we’re supposed to collect directory information, but we don’t, so we can’t give it to you.”
That means that, every year, thousands of Cincinnati families whose children were eligible for a scholarship to a private school of their choice stayed put in their bottom-of-the-heap schools simply because they didn’t know another option existed.
Springfield, meanwhile, has collected directory information of the type SCO requests and has provided it to SCO in the past. However, as the Springfield News-Sun notes, the board adopted a policy change in 2013 that redefined “directory information” to exclude anything that would identify a student, leaving the district unable to comply with SCO’s latest request.
Unfazed by its own policy constraints, the district continued releasing identifiable directory information to what it calls its “partners” after the policy change. When asked about the legal action last week, Springfield Superintendent David Estrop said, “We are trying to protect our own students from false and inaccurate information.” The characterization of SCO’s work aside, such ad hoc provision of data is not allowed under Ohio’s public-records law. Hence, SCO’s legal action.
We commend SCO for taking this bold and much-needed action. The EdChoice Scholarship program provides opportunities to some of Ohio’s most disadvantaged students who, through no fault of their own, have been assigned to a school that is not, under the state’s criteria, effectively educating its students. To deny these students and their parents information about their private-school options strikes us as a particularly low blow. If SCO isn’t successful, we’d urge the legislature to say enough is enough and to step in and require either school districts or (preferably) ODE to notify these families of their eligibility. After all, in Ohio, when your local school isn’t performing well, you have a choice. You deserve to know about it.
A great deal of hand-wringing has occurred in recent years concerning the United States’ poor academic performance relative to other nations. The anxiety is no doubt justified, as students from countries like South Korea, Japan, and Hong Kong are beating the pants off American pupils on international exams. It’s not just the East Asian countries: even the Swiss, Canucks, and Aussies are cleaning our clocks. But what about Ohio’s students? How does their achievement look in comparison to other industrialized nations? Like that of most states, not well, according to this new PEPG/Education Next study.
To determine how states rank compared to the rest of the world, the researchers link 2012 PISA results—international exams administered in thirty-four OECD countries, including the U.S.—with state-level NAEP results for eighth graders in 2011. They discovered that Ohio’s students fall well short of the world’s highest performers. In math, Ohio’s proficiency rate (39 percent) falls 15 to 25 percentage points below those of the highest-achieving nations. (Korea, the worldwide leader in math, was at 65 percent proficiency; Japan was at 59 percent; Massachusetts, the U.S. leader, was at 51 percent.) In fact, Ohio’s proficiency rate places us somewhere between Norway’s and Portugal’s achievement rates in this grade and subject.
Moreover, Ohio’s weak international performance isn’t just a matter of our students having fewer family resources than their peers in other nations. For example, among students whose parents had a high level of education, Ohio’s math proficiency rate (50 percent) still fell twenty points below that of the international leaders (Korea, at 73 percent; Poland, at 71 percent). Ohio’s alarmingly mediocre achievement relative to the rest of the world only reinforces our need to raise educational standards so that students—from all family backgrounds—can compete with their international peers.
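For those keeping score at home, the gaps are simple subtraction; here is a trivial sketch using only the proficiency rates quoted above.

```python
# Eighth-grade math proficiency rates quoted above (percent), from the study.
rates = {"Korea": 65, "Japan": 59, "Massachusetts": 51, "Ohio": 39}

for place in ("Korea", "Japan", "Massachusetts"):
    print(f"{place} leads Ohio by {rates[place] - rates['Ohio']} percentage points")
```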
SOURCE: Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann, Not just the problem of other people’s children: U.S. student performance in global perspective (Program on Education Policy and Governance and Education Next, May 2014).