Since our special Ohio Education Gadfly last month on changes to Ohio’s value-added methodology, we’ve heard from the media, local superintendents and school leaders, lawmakers, business leaders, and others interested in this issue. The Fordham Institute believes strongly in academic accountability based on high-quality standards, assessments, and data systems. We think Ohio has done well in this area, but acknowledge there is more to be done. Toward that end, we have engaged our colleagues at the education policy and research firm Public Impact to explore this issue for Gadfly readers and offer their insights and guidance about changes Ohio can make to how it measures student progress and, most importantly, how to use those data to evaluate students, teachers, and schools and help them improve.
Increasingly, states, districts, and parents are assessing educational quality using data. In Ohio, state-issued report cards contain a wealth of metrics and analyses designed to evaluate school performance. And since 2007, the state has included a value-added analysis in its report card, which measures how much schools are contributing to student learning.
Although Ohio’s value-added analysis can help us evaluate school quality, there are other ways to use value-added analysis and the data collected for it. Stakeholders want to know – and need to know – how teachers and students are doing, and the data Ohio already collects are rich with insights into teacher and student performance. Up until now, however, the state hasn’t taken full advantage of everything its data have to offer.
Just weeks ago, we learned that the Buckeye State won the federal Race to the Top competition and $400 million in federal funding to implement an ambitious set of education reforms. Ohio has budgeted seven percent of this funding, nearly $28 million, to develop more robust teacher evaluations, inform human resource decisions, accelerate student growth reporting at the classroom level, and improve access to student data – in part by utilizing value-added data. With those goals in mind, we take a fresh look at how Ohio can use education data to learn more about teachers and students, and how the findings can help the state make better decisions on behalf of its children.
What can education data reveal about teachers?
The state’s data systems can tell us a lot about a teacher. How many years has she been in the classroom? How many students does she teach? What credentials does she hold? Unfortunately, this information doesn’t tell us what we really want to know: Is she a good teacher? Is she raising student achievement and pushing students to new heights, or is she limping along, allowing her students’ academic potential to languish?
One of the best ways to determine how much impact a teacher has on her students’ education is by using a value-added model. Value-added models that assess teacher effectiveness work the same way as Ohio’s current value-added model for assessing schools. They compare the gains a teacher’s students make to the improvement made by similar students in other classes to evaluate how much “value” the teacher added, controlling for relevant factors. Information from a few students might not be enough to reach a definitive conclusion, but looking at an entire classroom or multiple years of results can give us a good sense of how effective she is.
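For readers curious about the mechanics, the sketch below shows the core idea in deliberately simplified form: predict each student’s end-of-year score from her starting point, then ask whether a teacher’s students, on average, beat that prediction. The teacher IDs, scores, and one-variable regression are our own illustrative assumptions; Ohio’s actual model is far more sophisticated, controlling for many more factors and reporting margins of error.

```python
import numpy as np

# Each record: (teacher_id, prior_year_score, current_year_score).
# All names and numbers are invented for illustration.
records = [
    ("T1", 410, 455), ("T1", 390, 430), ("T1", 450, 500),
    ("T2", 405, 420), ("T2", 395, 405), ("T2", 445, 460),
]

prior = np.array([r[1] for r in records], dtype=float)
current = np.array([r[2] for r in records], dtype=float)

# Step 1: estimate the score a typical student would earn given her
# starting point (a simple least-squares fit stands in for the far
# more sophisticated statistical model a state would actually use).
slope, intercept = np.polyfit(prior, current, 1)
expected = slope * prior + intercept

# Step 2: a teacher's value-added estimate is the average amount by
# which her students beat (or fell short of) their expected scores.
residuals = current - expected
by_teacher = {}
for (teacher, _, _), resid in zip(records, residuals):
    by_teacher.setdefault(teacher, []).append(resid)

for teacher, resids in sorted(by_teacher.items()):
    print(f"{teacher}: {np.mean(resids):+.1f} points vs. expected")
```

Because a handful of students can swing a simple average like this one, real systems pool several years of data and attach confidence ranges before drawing conclusions about any individual teacher.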
But how is this information useful? Value-added analysis allows us to identify the most effective teachers, the least effective teachers, and those who fall in between. By differentiating teacher quality, districts and schools can use value-added data to allocate resources in a way that selectively retains the best teachers, develops or dismisses low performers, and provides greater opportunities for all students to have access to great educators.
Retaining the best teachers
The best teachers propel their students to make dramatic gains in a single year, making them invaluable to students, families, and schools. Yet data show that our best teachers are just as likely to leave the profession as our worst. By one estimate, we lose approximately 64,000 of our best teachers nationwide each year. We can all agree that schools and districts ought to do their best to hold onto great teachers. Since 2007, the federal Teacher Incentive Fund has helped districts and states, including Chicago, Denver, and Ohio, reward high-performing teachers with financial bonuses, but these programs don’t go far enough. In other sectors, managers aggressively offer financial rewards and new work opportunities to target and retain their top performers. With value-added data that illuminate teacher effectiveness, school leaders can identify their best teachers and develop incentive programs to keep them in the classroom.
Expanding the influence of the best teachers
In addition to retaining the best teachers, we all stand to benefit when those teachers are able to extend their reach to more children. Currently, though, our best teachers reach about the same number of students each year as our worst. With value-added data, school leaders can identify top performers and make sure that as many children as possible have the opportunity to learn from them. Reach extension can take several forms, such as redesigning jobs so that the best teachers spend more of their time instructing a greater number of students (and less time on administrative duties), increasing class sizes for those teachers, or using technology to bring high-quality instruction into more classrooms via video and computer. Again, though, effective reach extension requires districts to identify the teachers capable of driving students to achieve at the highest levels.
Developing or exiting low performers
In addition to identifying top performers, value-added results can also identify the lowest performers. By differentiating between teachers, states and districts can flag low performers early on and either develop their skills or exit them from the system.
Data from value-added analyses are useful in this regard, but they may be imperfect, as illustrated by the corrections Ohio made to its value-added school analysis this year. Rather than immediately dismissing a teacher because of a single value-added rating, it is prudent to collect additional information from other sources, including observations and an analysis of classroom artifacts. Using value-added analysis to identify low performers, however, allows states and districts to provide additional support and mentoring to teachers who appear to be struggling, and eventually to determine whether those teachers are improving or should exit the system.
Assigning students
But what about the children assigned to teachers identified as low-performing? They only get one chance to be in first grade, or fifth, or a senior in high school. Although some teachers are bound to be more effective than others, should children knowingly be placed in classes with teachers identified as sub-par? At least one state – Rhode Island – has promised to use teacher value-added data to ensure that no student has an ineffective teacher two years in a row. By tracking students and the value-added ratings their teachers earn, the Buckeye State can find ways to ensure that no student finds himself or herself in the classroom of a low-performing teacher year after year.
Identifying great future teachers
Although a value-added analysis can tell us how good a teacher is, it can’t tell us why one teacher is better than another. To understand why Mr. Jones is an exceptional teacher, we need to watch him teach, ask him questions, and compare his competencies – the underlying characteristics that cause a person to be successful in a given job or position – with those of other top performers. Schools and districts can then use that information when selecting new teachers, to distinguish a promising candidate from one likely to flounder before either steps into the classroom.
What can education data tell us about individual students?
The goal of public education is ultimately to prepare students to become productive citizens. As students progress through school, then, we want to know whether they are on track toward that goal. And if students are not on track, we want to know why, so we can provide interventions that will get them where they need to be.
Presently, the state test provides a crude measure of student learning by indicating whether a child is on grade level, or proficient, in the appropriate skills. But there are many more aspects of student learning that serve as early indicators of student success, and that the state can evaluate using value-added data. Through Race to the Top reforms, Ohio has pledged to use value-added data to evaluate how much students are learning, to discern more precisely where they’re succeeding and where they’re struggling, and to share that information with parents and educators so they can intervene early on.
Are students learning enough?
We expect every student to improve: to learn more each year and to reach higher and higher levels of academic achievement. Students who don’t are unlikely to be where they need to be to succeed in college or the workforce. To measure how much students are learning, we generally look at the improvement a student makes on the state assessment. There are several ways to analyze improvement, however, and each reveals something different about student performance.
Is my child making as much growth as his peers?
It is unrealistic to expect every student to perform at the same level. Some children come to school far behind their peers, while others arrive far ahead. Even if they start in different places, however, we expect every child to learn and improve. In fact, we might expect a struggling student to learn more since she is starting from a lower point and has more room to grow.
One way to determine whether students are making as much progress as they should is to use a value-added model. As it does for schools and teachers, a value-added model uses a statistical analysis to consider whether an individual student is making as much growth as other students with similar starting points.
This analysis may show that a low-performing student is actually making larger-than-expected gains – suggesting that new interventions may be working. Or, a value-added analysis might indicate that a high-performing student is not performing as well as he could – or should. In the latter case, value-added analysis shines a light on a student need that many other analyses miss: the need to push even our top performers to ever higher levels of achievement.
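To make that latter case concrete, here is a minimal, purely illustrative check of one student against the growth made by similar students. The fitted slope, intercept, and scores are invented, standing in for a model estimated on many comparable students statewide.

```python
# Check one student's growth against what similar students achieved.
# The slope and intercept stand in for a model fit on many comparable
# students; all numbers are invented for illustration.
slope, intercept = 0.95, 65.0            # hypothetical fitted model
prior_score, actual_score = 520.0, 540.0

expected_score = slope * prior_score + intercept   # 559.0
gap = actual_score - expected_score                # -19.0

if gap < 0:
    print(f"Grew, but {abs(gap):.0f} points less than similar students")
else:
    print(f"Met or beat expected growth by {gap:.0f} points")
```

Here a student who gained twenty points still trails the growth of similar high performers – exactly the kind of shortfall a proficiency-only report would never surface.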
Is my child making enough growth to get "on track"?
Parents and educators often want to know whether struggling students, or those who start off behind their peers, are making enough improvement to get to where they need to be. That is, are students on track to reach the goals we have set for them? We can begin to answer this question using value-added data and what is called a growth-to-standard analysis.
The simplest growth-to-standard analyses look at the amount of progress a student makes over the course of the year and determine whether the student will reach the standard if she continues to make the same amount of progress each year. For example, if a third grader is performing at a second-grade reading level, one year of progress each year is not enough to catch her up by the end of fifth grade. To reach that benchmark, the student would have to make at least 1.5 years of progress in both fourth and fifth grades.
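The arithmetic behind that example is simple enough to show directly. A minimal sketch, using the hypothetical third grader above:

```python
# Growth-to-standard arithmetic for the hypothetical reader above: a
# student finishing third grade while reading at a second-grade level,
# who must reach grade level by the end of fifth grade.
current_grade = 3        # grade the student is finishing
reading_level = 2.0      # grade level at which she currently reads
target_grade = 5         # must be on grade level by end of this grade

years_remaining = target_grade - current_grade       # 2 school years
growth_needed = target_grade - reading_level         # 3.0 grade levels
required_per_year = growth_needed / years_remaining  # 1.5 per year

print(f"Required growth: {required_per_year} years of progress per year")
# At only 1.0 year of progress per year, she would still read a full
# grade level behind at the end of fifth grade.
```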
This information can be a powerful tool for educators and administrators. Schools that serve struggling students might not be able to get every student to grade level by the end of the year, but we would hope that every student in the school would make enough progress to reach proficiency before they graduate. A growth-to-standard analysis could give us that information. Additionally, with information about which students aren’t making enough progress to succeed, educators can target resources and interventions more effectively.
What do students know... and what don't they know?
It is not enough to identify under-performing students – Ohio must also be able to provide those students with the support needed to improve. But to do so, schools must be able to identify students’ strengths and weaknesses. In addition to knowing how much students are growing, schools and districts should also be interested in what students know – or don’t know.
Parents and schools need to look beyond test scores and analyze how each student does on specific items on the test. This kind of diagnostic information allows parents and educators to see where students are excelling and where they are struggling in considerable detail. In so doing, a diagnostic report provides a blueprint for providing targeted support. And if the state has information about other students with similar deficiencies, it can provide teachers with sample lesson plans, intervention methods, and other materials designed to meet those student needs.
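What might such a diagnostic report look like under the hood? A minimal sketch, assuming a simple item-to-skill mapping; the skills, items, and answers below are invented for illustration.

```python
# Tally a student's test performance by the skill each item measures.
# The item-to-skill mapping and the responses are invented examples.
item_skills = {1: "fractions", 2: "fractions", 3: "geometry",
               4: "geometry", 5: "measurement", 6: "measurement"}
answered_correctly = {1: True, 2: False, 3: True,
                      4: True, 5: False, 6: False}

by_skill = {}
for item, skill in item_skills.items():
    right, total = by_skill.get(skill, (0, 0))
    by_skill[skill] = (right + int(answered_correctly[item]), total + 1)

for skill, (right, total) in sorted(by_skill.items()):
    print(f"{skill}: {right}/{total} items correct")
# Output: fractions 1/2, geometry 2/2, measurement 0/2 -- pointing a
# teacher toward measurement as the area needing targeted support.
```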
Ohio already uses its value-added data to provide diagnostic reports for students in a small sub-set of schools – less than 10 percent of schools with value-added data. With the support of Race to the Top funding, however, Ohio plans to provide diagnostic reports to all schools for which it collects the appropriate data.
Looking forward
Despite significant progress, more work remains ahead for the Buckeye State. To date, Ohio, like many other states, has focused primarily on measuring student learning in reading and math in grades 3-8. Finding ways to evaluate student achievement before third grade and in high school, and developing new systems to monitor growth in all of the subjects and disciplines we expect students to master, will be difficult but essential. The state plans to address this challenge, but stakeholders must be vigilant to ensure that policymakers follow through on their promises.
As the state moves to spend its winnings from Race to the Top, Ohioans stand to benefit greatly, both from innovations and from the wealth of new information available to teachers and parents. Ultimately, however, the value Ohio derives from these efforts will depend on the state’s leaders and educators. If Ohio fails to mine its education data to answer the questions outlined above, or refuses to act on conclusions drawn from those data, the state will be no better off for its investment, which would, of course, be a shame. The state has taken the first step by developing the capacity to collect meaningful data. Now Ohio must ensure that it puts those data to good use.