Higher education programs increasingly use algorithms based on students' background, academic achievement, and other factors to predict whether they will complete a degree.
These tools can help direct resources to struggling students, but they can also give a biased picture of students' potential, according to a study published in AERA Open, a journal of the American Educational Research Association.
The study describes yet another way that predictive tools can be prone to "algorithmic bias," in which incomplete or missing data make such tools less accurate, or even misleading, when applied to certain demographic groups.
Hadis Anahideh, an assistant professor of industrial engineering at the University of Illinois, Chicago, and her colleagues analyzed federal longitudinal data on students who were 10th graders in 2002 and later entered four-year degree programs. They used a variety of "college success" models to predict the likelihood that students would complete a bachelor's degree within eight years of their high school graduation, and then compared those predictions to students' actual reported educational attainment.
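The study's exact model specifications aren't detailed here, but the basic workflow it describes, fitting a classifier to background and achievement variables and then comparing its predictions against actual attainment, can be sketched roughly as below. The file name, column names (hs_gpa, test_score, ses_index, completed_ba_in_8yrs, race), and the choice of logistic regression are all illustrative assumptions, not the study's actual setup.

```python
# Rough sketch of a "college success" model of the kind described above.
# All file and column names are hypothetical stand-ins, not the study's
# actual variables, and the model choice is illustrative only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("students.csv")  # hypothetical longitudinal dataset

features = ["hs_gpa", "test_score", "ses_index"]  # assumed predictors
target = "completed_ba_in_8yrs"                   # 1 = earned a BA within 8 years

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.3, random_state=0
)

# Fit the model, then compare predictions with actual reported attainment.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
predicted = model.predict(X_test)
print(f"Overall accuracy: {(predicted == y_test).mean():.2f}")
```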
"It makes [admissions officers'] job easier because they don't have to go through the data one by one," Anahideh said. "If they use these models, which are very powerful, they can estimate, OK, if this is the performance of the new student coming in, based on their high school variable and based on their background information, will they be a successful student? Can they graduate from the program or not?"
College-success algorithms falsely predicted failure for about 1 in 5 Black and Hispanic students, the researchers found. By contrast, only 12 percent of white students and 6 percent of Asian students were incorrectly flagged as likely to fail when they actually went on to complete a bachelor's degree. Such flags can be used to target interventions to struggling students, but Anahideh said they could also put students at a disadvantage in admissions and scholarships.
The models also tended to dramatically overestimate how well white and Asian students would do in college relative to other students. Seventy-three percent of Asian students and 65 percent of white students who did not earn a four-year degree in eight years had been predicted to do so. Only a third of Black students and 28 percent of Hispanic students were incorrectly tagged for success.
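The figures above are group-conditional error rates: the share of actual graduates flagged to fail, and the share of non-graduates tagged for success. Continuing the hypothetical sketch above, and again assuming a "race" column exists in the dataset, they could be computed like this:

```python
# Group-wise error rates of the kind reported above, continuing the sketch:
# "false failure" = predicted not to finish but actually earned the degree;
# "false success" = predicted to finish but did not. The "race" column is
# an assumed label in the hypothetical dataset.
results = pd.DataFrame({
    "race": df.loc[X_test.index, "race"],
    "actual": y_test.to_numpy(),
    "predicted": predicted,
})

for group, g in results.groupby("race"):
    graduates = g[g["actual"] == 1]
    non_graduates = g[g["actual"] == 0]
    false_failure = (graduates["predicted"] == 0).mean()
    false_success = (non_graduates["predicted"] == 1).mean()
    print(f"{group}: false failure {false_failure:.0%}, "
          f"false success {false_success:.0%}")
```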
"There is a bias right in the system," Anahideh said. "But the surprising thing in this study was, these common [bias]-mitigation techniques are not really effective. ... There isn't one unique solution to address the bias."
Prior research has found datasets used to train predictive tools often don't include enough diverse students to teach the tools how to estimate what a successful student of color looks like in the system. But Anahideh and her colleagues found that adding in more examples of successful students from different backgrounds wasn't enough to remove bias from the system.
That's because the predictors these models rely on are often closely linked to race, compounding the weight race carries in the model. For example, average ACT and SAT scores are highly predictive of later college achievement, but they are also closely linked to race, and low exam scores can be a less accurate predictor of earning a degree for Black and Hispanic students than for white students, the study found.
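One way to see that linkage concretely, in the same hypothetical setup as the sketches above, is to check how much demographic signal a single predictor carries on its own. If group membership can be recovered from test scores better than chance, the score acts as a proxy, so simply dropping race from the inputs, or adding more training examples, does not remove it from the model.

```python
# Proxy check, continuing the hypothetical setup above: can group membership
# be recovered from the test score alone? An AUC well above 0.5 means the
# score carries demographic information even when race is not an input.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

auc = cross_val_score(
    LogisticRegression(max_iter=1000),
    df[["test_score"]],                   # assumed ACT/SAT composite column
    (df["race"] == "Black").astype(int),  # illustrative one-vs-rest group label
    scoring="roc_auc",
    cv=5,
).mean()
print(f"Group membership predicted from score alone, AUC: {auc:.2f}")
```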
Educators who work with students transitioning to college can help buffer their students against the effects of algorithmic bias, she said, by "learning from this historical data and what the model estimates for these students ... and try to advise them accordingly to be more successful."