Maryland students were said to have made impressive gains this year on their state test. Naturally, our first reaction was to wonder how that happened when the state's NAEP scores are stagnant. And sure enough, we find out that this year's Maryland School Assessment was shorter than last year's, and that this year's test questions hewed more closely to the state curriculum than they previously had. Do these changes make a real difference? Do they call into question the results' integrity? Deputy State School Superintendent Ronald A. Peiffer doesn't think so. He said that a shorter test simply meant students "weren't as tired this year." But that "doesn't mean [the test] wasn't as difficult." According to the Baltimore Sun, Harcourt Assessment Inc., which oversees Maryland's state testing, found this year's "test was equivalent to the one given in 2003, the year the test was first used, and to subsequent tests. But the panel also concluded that the changes in the test had contributed to the large increases in the fifth- and seventh-grade scores." So which is it: did the changes affect test scores or not? Either way, as our own Amber Winkler pointed out in the Sun, Maryland's definition of "proficiency" is among the lowest in the nation. Who knows how much its kids actually know?
"MSA changes may have raised scores," by Liz Bowie, Baltimore Sun, July 18, 2008