Assessing the Fairness of Course Success Prediction Models in the Face of (Un)equal Demographic Group Distribution

Stephen Downes

Knowledge, Learning, Community

This article looks at the fairness of AI models in cases where demographic groups are unevenly distributed. How fair, for example, would a prediction model be for women if women made up only five percent of the data sample used to train it? To test the models, the authors first ran them on an unbalanced distribution, then compared the results with those obtained by running them on a balanced data sample. They found that "none of the predictive models was consistently fair in all 3 courses." But more surprisingly, "attributing the unfairness to demographic group imbalance may cause the unfairness to persist even when the data becomes balanced." Assuming the model is unfair to women may obscure the fact that it is also unfair to, say, people without pockets. It's a small study, and before jumping to any conclusions we need to see the results replicated at a larger scale.
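To make the comparison concrete, here is a minimal sketch of one common fairness check, the demographic parity difference (the gap in positive-prediction rates between two groups), applied to an imbalanced and a balanced sample. The data and the prediction lists are invented for illustration; the study itself trained real predictive models on course data and used its own fairness measures.

```python
# Hypothetical sketch: demographic parity difference on an imbalanced
# vs. a balanced sample. All data below is fabricated for illustration.

def positive_rate(predictions):
    """Fraction of predictions that are positive (predicted success)."""
    return sum(predictions) / len(predictions)

def demographic_parity_diff(preds_a, preds_b):
    """Absolute gap in positive-prediction rates between two groups.
    A value near 0 suggests parity; larger values suggest unfairness."""
    return abs(positive_rate(preds_a) - positive_rate(preds_b))

# Imbalanced sample: group B is only ~5% of the data.
imbalanced_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1]
imbalanced_b = [0]  # a single member of the minority group

# Balanced sample: equal group sizes.
balanced_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
balanced_b = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]

print(round(demographic_parity_diff(imbalanced_a, imbalanced_b), 2))
print(round(demographic_parity_diff(balanced_a, balanced_b), 2))
```

Note how little the single minority-group member tells us in the imbalanced case: the metric can be computed, but it rests on one data point, which is one reason balancing the data alone does not guarantee a fair model.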



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Dec 22, 2024 01:17 a.m.

Creative Commons License.
