Into the messy and political world of teacher-effectiveness research enter Susanna Loeb and colleagues, who examine whether math and English-language-arts (ELA) teachers differ in how they affect students' long-term knowledge. Specifically, they ask, among other questions, whether ELA and math teachers' effects on student performance persist into future years rather than fading after one, and whether those effects cross over, shaping knowledge not just in the teacher's own subject but in both. They draw on extensive student, teacher, and administrative data from the NYC school system covering roughly 700,000 third- through eighth-grade students from 2003–04 through 2011–12.

There are three key findings. First, a teacher's value added to ELA achievement has a crossover effect on long-term math performance: having a high-quality ELA teacher boosts not only future ELA performance but future math performance, too. Math teachers, by contrast, have minimal long-term impact on ELA performance. This may be due to the nature of ELA, since learning to read and think critically is likely to build general knowledge, whereas math knowledge pertains more directly to the subject itself, and math tests tend to be more aligned in content from year to year.

Second, teachers in schools serving disadvantaged kids show less "persistence" (i.e., enduring impact) than teaching peers with similar value-added scores in other schools, which could suggest that school-level curriculum choices make a difference, or perhaps that teachers in these schools prioritize short-term gains or teaching to the test.

Third, within subjects, teachers who attended a more competitive undergraduate college tend to foster longer-lasting knowledge in their students: more than a quarter of their value-added effects persist into the next year, compared with less than one-fifth for teachers from less competitive institutions.
So in the end, value-added scores remain a useful gauge of teacher quality, but let's not forget that subject area, the test itself, school type, and teacher background all make a difference in how we think about and interpret those scores.
SOURCE: Benjamin Master, Susanna Loeb, and James Wyckoff, “Learning that Lasts: Unpacking Variation in Teachers’ Effects on Students’ Long-Term Knowledge,” Working Paper 104 (Washington, D.C.: CALDER and AIR, January 2014).