A recent report from Education Northwest extends previous research by the same lead researcher, drilling down into the same dataset to fine-tune the original findings. That earlier study (June 2016) set out to determine whether incoming University of Alaska freshmen were being placed in remedial courses when they were actually able to complete credit-bearing ones. It found that high school GPA was a stronger predictor of success in credit-bearing college courses in English language arts and math than college admissions test scores. The follow-up study deepens this examination by breaking down the results for students from urban versus rural high schools and for students who delay entry into college.
In general, the latest study’s findings were the same: except for students who delayed college entry, GPA was a better predictor of success in college coursework than standardized test scores. It stands to reason that admissions test scores would better represent the current abilities of students who delayed entry into college (call it the final “summer slide” of one’s high school career), and indeed the previous study showed that students who delayed entry were several times more likely to be placed into developmental courses than students who entered college directly after high school graduation. But does this mean that colleges err when they use such test scores to place incoming students? The Education Northwest researchers believe so, arguing that colleges should use high school GPAs in combination with test scores, with the former weighted more heavily, since GPAs more effectively capture the non-cognitive skills the researchers deem most relevant to college success.
But it is worth noting that both of their studies are limited by a few factors. First, there are only about 128,000 K–12 students in all of Alaska, and its largest city, Anchorage, is about the same size as Cincinnati. A larger, more diverse sample (Baltimore, New York, Atlanta, or even Education Northwest’s hometown of Portland, Oregon) could yield different results. Second, there is no indication that the University of Alaska students were admitted or placed solely on the basis of admissions test scores. Sure, those scores matter, but not every school puts Ivy League emphasis on them to weed out applicants. Third, the “college success” measured here is only a student’s first credit-bearing class in ELA and math. That seems like a limited definition of success for many students; depending on one’s major, math 102 is harder than math 101. Fourth, “success” in these studies merely means passing the class, not earning an A. If a student’s high school GPA of 2.5 was better at predicting his final grade in the college class (a D) than was his SAT score (in the 50th percentile), only Education Northwest’s statisticians should be happy about that. A more interesting and useful analysis would compare success rates between students with high versus low GPAs, between students with high versus low test scores, and between students who earned As and those who earned Ds in the college courses.
Previous studies have shown a correlation between high GPAs and high ACT scores. There’s lots of talk that test scores are (but shouldn’t be) the most important factor in college admissions decisions, and the “who needs testing?” backlash at the K–12 level appears to have reached upward to colleges. This study is not the silver bullet that’s going to slay the admissions-testing beast, but more care must be taken at the college level to avoid incorrect and money-wasting developmental placements. It is to be hoped that at least part of the answer is already in development at the high school level (high standards, quality curricula, well-aligned tests, remediation/mastery) and that colleges will be able to jump aboard, calibrating their admissions criteria to promote high levels of performance, persistence, and ultimately degree attainment.
SOURCE: Michelle Hodara and Karyn Lewis, “How well does high school grade point average predict college performance by student urbanicity and timing of college entry?,” Institute of Education Sciences, U.S. Department of Education (February 2017).