Last May, Achieve released a report showing that most states have created a false impression of student success in math and reading proficiency. Known as the “honesty gap” (or, as Fordham has long described it, The Proficiency Illusion), the discrepancy between reported and actual proficiency emerges when state test results are compared with NAEP results.[1] Achieve’s May report found, for example, that more than half of states had discrepancies of over thirty percentage points relative to NAEP’s gold standard. Ohio was one of the worst offenders: Our old state test scores (the OAA and OGTs) differed from NAEP by thirty percentage points or more in each of NAEP’s main test subjects, with a whopping forty-nine-point difference in fourth-grade reading.
Less than one year later, new state test scores and biennial NAEP results have created an opportunity to revisit the honesty gap. In its latest report, Achieve finds that the gap has significantly narrowed in nearly half of states. Ohio is one of twenty-six states that have earned the commendation “Significantly Improved” for closing the honesty gap in either fourth-grade reading or eighth-grade math by at least ten percentage points since 2013. Unfortunately, the Buckeye State still isn’t among the “Top Truth Tellers” of 2015 because we still have honesty gaps of more than five percentage points. Considering our recent status at the bottom of the barrel, however, the improvement is worth acknowledging.
Ohio’s improvement reflects the State Board of Education’s recent adoption of performance levels for the 2015–16 English Language Arts (ELA) and math state tests. Anyone who has followed the rollercoaster of testing and accountability changes over the last year knows that the adoption of a higher proficiency threshold was never a certainty. Just this past summer, when the state board adopted cut scores for the PARCC assessment that fell below the consortium’s recommendations, the national media reacted with a frenzy of accusations that Ohio was taking a page from Lake Wobegon and engaging in score inflation, since the adopted scores nearly doubled the number of students deemed proficient. Unfazed by the criticism, the board promised to raise cut scores each year until performance levels accurately depicted the achievement and readiness of Ohio students.
With the new performance levels, it appears the department has kept its word and is worthy of being deemed “Significantly Improved.” The newly adopted cut scores are higher in both ELA and math. The difference is most pronounced in ELA, where the percentage of students projected to score proficient decreased. Consistent with that decrease, the percentage of students projected to fall into the “limited” performance level (the designation beneath “basic”) increased. In an interesting twist, the percentage of students projected to score “advanced” actually increased on every test except English II.
A few caveats about these data: First, comparing the new performance levels to last year’s isn’t exactly an apples-to-apples comparison. Last year’s performance levels were based on PARCC, while this year’s are based on Ohio’s new state tests (which were developed by AIR and more than one hundred Ohio teachers and content experts).[2] In addition, the projected student performance percentages for last year’s test were based on PARCC’s 2014 field test, whereas this year’s projections come from student data in states where AIR had already administered ELA and math tests (a group that doesn’t include Ohio). Finally, as noted above, the only level where projected performance rises rather than falls is advanced. This could mean one of two things: Either PARCC was so difficult that scoring advanced was nearly impossible, or Ohio’s new state tests are easier, making advanced scores more attainable. Either way, it’s important to remember that the real results we’ll get this summer, rather than the projected numbers we have now, will be a far better indication of Ohio student achievement.
Although it’s not a perfect analysis, comparing the new cut scores to the old is a valuable measure of ODE’s transparency and honesty. Given the criticism the department took over last year’s low cut scores, not to mention months of controversy in other realms, its commitment to raising the bar in state testing is welcome news. Higher cut scores are a step in the right direction toward closing the honesty gap, and the state board and ODE deserve credit.
That being said, the percentage of students we deem proficient is still far above what NAEP and other assessments find, which means the department must remain committed to truth telling in the future. When early state test results arrive and performance levels have to be validated, the department must be careful not to balk if scores come in lower than the projections indicate. The same is true for next school year, when ODE will need to hold true to its word and raise cut scores again. Other states have set an example for framing low scores, and Achieve’s new report indicates that many more are closing the honesty gap. Ohio should follow suit. Students and their families deserve to know the truth about their performance and readiness long before they face remediation in college or difficulty in the workplace. In all, while the Buckeye State is moving in the right direction, there is still much work to be done. Ohio is on the path to telling the truth, even if it hurts, and policymakers and the public must be patient through the pain.
[1] There’s good evidence that NAEP’s proficiency levels in reading are predictive of whether students are ready to succeed in college without taking remedial courses.
[2] It’s worth noting that many educators think the new tests are already off to a good start.