AI-aided candidate selection contends with gendered assessments of candidate fit in addition to the influence of AI recommendations. Whether biased or unbiased, AI candidate recommendations introduce an additional source of information that either feeds into stereotype-based decision-making or activates bias countering. In a choice-based conjoint experiment (n=873), we examine candidate selection between gender stereotype-fit (vs. qualifications-fit) candidates under biased (vs. unbiased) AI recommendations. Additionally, we investigate the moderating role of acceptance of AI hiring tools. We find that decision-makers prefer qualifications-fit (vs. gender stereotype-fit) candidates and candidates recommended by AI. When comparing biased and unbiased AI recommendations, decision-makers prefer candidates recommended by unbiased AI. We further find that the preference for AI-recommended candidates disappears among decision-makers with high acceptance of AI, indicating that acceptance of AI use is an important contextual factor. Our findings highlight that AI-aided candidate selection does not necessarily lead to biased hiring and that reflective engagement is key to equitable candidate selection.
Noon, M. F. A., van der Meer, T. G. L. A., Kroon, A. C., & Vliegenthart, R. (2026). Bias in AI-aided candidate selection: Investigating the influence of AI recommendations and gender stereotypical frames on candidate selection and hiring decision-making. Manuscript under review.