When classes moved abruptly online at Iowa State University in March 2020 as part of statewide Covid-mitigation measures, psychology professor Jason Chan expected big changes in student behavior. Specifically, he worried that his students would find it easy to cheat on unproctored online exams. But he saw little evidence of that: his students produced a fairly typical distribution of scores, comparable to the proctored, in-person exams he had given earlier the same semester. Intrigued, he and colleague Dahwi Ahn undertook a deeper analysis of university-wide test scores to see whether his experience was typical.
Chan and Ahn obtained data from eighteen courses offered during the spring 2020 semester: everything from large introductory lecture classes to smaller, specialized major courses for upperclassmen. For more than 2,000 students, they calculated an average score for the in-person exams (the first half of the semester) and an average score for the online exams (the second half). They then computed the correlation between the two halves using a meta-analytic approach that treated each course as an individual study.
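For readers curious how such pooling works in practice, here is a minimal Python sketch. It is an illustration rather than the authors' actual code: the function names are hypothetical, and the fixed-effect Fisher-z weighting is one common way to treat each course as its own study.

```python
import numpy as np
from scipy import stats

def course_correlation(in_person_avgs, online_avgs):
    """Pearson correlation between students' first-half (in-person)
    and second-half (online) exam averages within one course."""
    r, _ = stats.pearsonr(in_person_avgs, online_avgs)
    return r

def pooled_correlation(rs, ns):
    """Pool per-course correlations across courses using Fisher's z
    transform with inverse-variance weights, var(z) = 1/(n - 3)."""
    zs = np.arctanh(np.asarray(rs, dtype=float))
    weights = np.asarray(ns, dtype=float) - 3.0
    z_bar = np.sum(weights * zs) / np.sum(weights)
    return np.tanh(z_bar)  # back-transform the pooled z to a correlation

# Example with made-up numbers: three courses with enrollments 300, 150, 90.
print(pooled_correlation([0.62, 0.55, 0.70], [300, 150, 90]))
```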
Across the board, scores from the unproctored online exams closely tracked those from the traditional in-person exams. A positive correlation was observed for every course, and the correlations did not vary significantly by question type, field of study, course level, exam duration, or enrollment. A standardized effect-size analysis confirmed that the correlation was not being driven by score inflation from relaxed grading on the Covid-era online exams.
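A check of that kind is often performed with a standardized mean difference within each course. The sketch below is an assumed form of the analysis, with a hypothetical function name: it computes Cohen's d for online minus in-person scores, where a value near zero argues against grade inflation.

```python
import numpy as np

def standardized_difference(in_person_scores, online_scores):
    """Cohen's d for online minus in-person exam scores in one course."""
    x = np.asarray(online_scores, dtype=float)
    y = np.asarray(in_person_scores, dtype=float)
    # Pooled standard deviation across the two exam formats.
    pooled_sd = np.sqrt(((x.size - 1) * x.var(ddof=1) +
                         (y.size - 1) * y.var(ddof=1)) /
                        (x.size + y.size - 2))
    return (x.mean() - y.mean()) / pooled_sd
```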
The researchers also ran the same first-half/second-half comparison for a set of courses taught fully in person in the spring semesters of 2018, 2019, and 2021. Overall, these courses showed a stronger correlation between first-half and second-half exam scores than the split in-person/online courses of 2020. When the comparison was restricted to courses taught by the same instructors in both sets of semesters, the difference shrank and was no longer statistically significant; with just nine such courses, however, the comparison had limited statistical power.
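The paper's exact test for that between-semester difference isn't spelled out here, but a standard way to compare two independent correlations, shown below as an assumption rather than the authors' method, is Fisher's z comparison.

```python
import numpy as np
from scipy import stats

def compare_correlations(r1, n1, r2, n2):
    """Two-tailed z test for whether two independent Pearson
    correlations differ, via Fisher's z transform."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2 * stats.norm.sf(abs(z))  # two-tailed p value
    return z, p
```

With only nine matched courses in the restricted comparison, the effective sample is small and the standard error correspondingly large, which is one plausible reason the difference lost significance.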
The data indicate that cheating was likely uncommon when students had to pivot to online exams in that first Covid-disrupted semester, though it's possible that cheating was actually widespread but simply ineffective at boosting student outcomes beyond typical norms. The researchers liken the situation to the difference between a closed-book and an open-book proctored test: students who have missed lectures, paid little attention in class, or have a weak grasp of the material despite attending will likely not do well on a typical exam even with a textbook, notes, or the entire internet at their fingertips. The converse typically holds as well: students with good attendance and a decent understanding of the material will generally fare well on an exam, with or without notes and other materials in front of them.
This feels like good news as far as it goes. Whatever effort these college students were putting forth before the pandemic disruption appeared to continue immediately after their entire educational experience was turned upside down. However, this does not mean that extended use of online exams (and indeed virtual teaching and learning writ large) will produce the same behavior once it is the everyday norm. Chan and Ahn conclude from their research that online exams can provide an assessment of learning as valid and reliable as in-person exams, but that seems like too large a leap given the unique circumstances of spring 2020. Professors and institutions sticking with online exams in the absence of force majeure will need to leverage the technology to durably hinder cheating. After all, exams have traditionally been proctored for a reason.
SOURCE: Jason C. K. Chan and Dahwi Ahn, “Unproctored online exams provide meaningful assessment of student learning,” Proceedings of the National Academy of Sciences (July 2023).