Downes.ca ~ Stephen's Web ~ Gender, Race, and Intersectional Bias in Resume Screening via Language Model Retrieval

Stephen Downes

Knowledge, Learning, Community

The topic of AI-based recruitment and hiring has been discussed here before, and research continues apace. This item (13-page PDF), despite the characterization in GeekWire, is a fairly narrow study. It looks at three text-embedding models based on Mistral-7B-v0.1 and tests for gender and racial bias on applications containing name and position only, and on applications containing name, position, and some content (the paper discusses removing the name but does not do it). The interesting bit is that intersectional bias (i.e., combining gender and race) is not merely a combination of the separate biases; while the separate biases exaggerated the discrimination, the "intersectional results, on the other hand, do correspond more strongly to real-world discrimination in resume screening." Via Lisa Marie Blaschke, who in turn credits Audrey Watters.
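To make the methodology concrete: studies like this one typically embed a job description and name-swapped variants of an otherwise identical resume, then compare the retrieval (cosine-similarity) scores. The sketch below illustrates that protocol only; the `embed()` function is a toy hashed bag-of-words stand-in, not the Mistral-7B-based embedders used in the paper, and the names and job text are hypothetical.

```python
# Sketch of embedding-based bias probing: does changing only the name on a
# resume change its similarity score against a job posting?
# NOTE: embed() is a toy deterministic stand-in, NOT the paper's models.
import hashlib
import math

DIM = 64  # dimensionality of the toy embedding space

def embed(text: str) -> list[float]:
    """Toy embedding: hashed bag-of-words counts in DIM buckets."""
    vec = [0.0] * DIM
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % DIM] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def score_gap(job: str, resume_template: str,
              name_a: str, name_b: str) -> float:
    """Difference in job-resume similarity when only the name changes.
    A nonzero gap on otherwise identical resumes indicates name bias."""
    job_vec = embed(job)
    sim_a = cosine(job_vec, embed(resume_template.format(name=name_a)))
    sim_b = cosine(job_vec, embed(resume_template.format(name=name_b)))
    return sim_a - sim_b

job_posting = "software engineer with python experience"
resume = "{name} software engineer five years python experience"
gap = score_gap(job_posting, resume, "Alice", "Bob")  # hypothetical names
print(f"similarity gap: {gap:+.4f}")
```

With a real embedding model in place of `embed()`, one would aggregate such gaps over many resumes and many demographically associated names, and intersectional bias would be measured by varying gendered and racially associated names jointly rather than one at a time.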



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Nov 12, 2024 5:18 p.m.

Creative Commons License.
