Abstract
Technology firms increasingly leverage artificial intelligence (AI) to enhance human decision-making in the rapidly evolving talent acquisition landscape. However, the ramifications of these advancements for workforce diversity remain a topic of intense debate. Drawing on Gilliland’s procedural justice framework, we explore how IT job candidates interpret the fairness of AI-driven recruitment systems. Gilliland’s model posits that an organization’s adherence to specific fairness principles, such as honesty and the opportunity to perform, profoundly shapes candidates’ self-perceptions, their judgments of the recruitment system’s equity, and the overall attractiveness of the organization. Through focus groups and interviews with 47 women, Black, and Latinx or Hispanic undergraduates specializing in computer and information science, we examined how gender, race, and ethnicity influence attitudes toward AI in hiring. Three procedural justice rules (consistency of administration, job-relatedness, and selection information) emerged as critical in shaping participants’ fairness perceptions. Although discussed less frequently, the propriety of questions held particular resonance for Black and Latinx or Hispanic participants. Our study underscores the critical role of fairness evaluations for organizations, especially those striving to diversify the tech workforce.
References
Anderson, N.: Applicant and recruiter reactions to new technology in selection: a critical review and agenda for future research. Int. J. Sel. Assess. 11(2–3), 121–136 (2003). https://doi.org/10.1111/1468-2389.00235
Arvey, R.D., Sackett, P.R.: Fairness in selection: Current developments and perspectives. In: Schmitt, N. and Borman, W. (eds.) Personnel Selection. Jossey-Bass, San Francisco, CA (1993)
Assarroudi, A., et al.: Directed qualitative content analysis: the description and elaboration of its underpinning methods and data analysis process. J. Res. Nurs. 23(1), 42–55 (2018). https://doi.org/10.1177/1744987117741667
Barocas, S., et al.: Big Data, Data Science, and Civil Rights. arXiv:1706.03102 [cs] (2017)
Bauer, T.N., et al.: Applicant reactions to different selection technology: face-to-face, interactive voice response, and computer-assisted telephone screening interviews. Int. J. Sel. Assess. 12(1–2), 135–148 (2004). https://doi.org/10.1111/j.0965-075X.2004.00269.x
Bauer, T.N., et al.: Applicant reactions to selection: development of the selection procedural justice scale (SPJS). Pers. Psychol. 54(2), 387–419 (2001). https://doi.org/10.1111/j.1744-6570.2001.tb00097.x
Bauer, T.N. et al.: Applicant reactions to technology-based selection: what we know so far. In: Technology-Enhanced Assessment of Talent, pp. 190–223. John Wiley & Sons, Ltd. (2011). https://doi.org/10.1002/9781118256022.ch6
Bies, R.J.: Beyond formal procedures: the interpersonal context of procedural justice. In: Carroll, J.S. (ed.) Applied Social Psychology and Organizational Settings, pp. 77–98. Erlbaum, Hillsdale, NJ (1990)
Bies, R.J.: Interactional justice: Communication criteria of fairness. Res. Negotiat. Organiz. 1, 43–55 (1986)
Bies, R.J., Shapiro, D.L.: Voice and justification: their influence on procedural fairness judgments. Acad. Manag. J. 31(3), 676–685 (1988)
Brockner, J.: Making sense of procedural fairness: how high procedural fairness can reduce or heighten the influence of outcome favorability. AMR. 27(1), 58–76 (2002). https://doi.org/10.5465/amr.2002.5922363
Buyl, M. et al.: Tackling algorithmic disability discrimination in the hiring process: an ethical, legal and technical analysis. In: 2022 ACM Conference on Fairness, Accountability, and Transparency, pp. 1071–1082 Association for Computing Machinery, New York (2022). https://doi.org/10.1145/3531146.3533169
Celani, A., et al.: In justice we trust: A model of the role of trust in the organization in applicant reactions to the selection process. Hum. Resour. Manag. Rev. 18(2), 63–76 (2008). https://doi.org/10.1016/j.hrmr.2008.04.002
Chambers, B.A.: Applicant reactions and their consequences: review, advice, and recommendations for future research. Int. J. Manag. Rev. 4(4), 317–333 (2002). https://doi.org/10.1111/1468-2370.00090
Cooper, J.: A Call for a Language Shift: From Covert Oppression to Overt Empowerment. https://education.uconn.edu/2016/12/07/a-call-for-a-language-shift-from-covert-oppression-to-overt-empowerment/ (Accessed 21 Jan 2022)
Danieli, O., et al.: How to hire with algorithms. Harvard Bus. Rev. 17 (2016)
De Vries, R.E., Van Gelder, J.-L.: Explaining workplace delinquency: the role of Honesty-Humility, ethical culture, and employee surveillance. Personality Individ. Differ. 86, 112–116 (2015). https://doi.org/10.1016/j.paid.2015.06.008
Denzin, N.K., Ryan, K.E.: Qualitative methodology (including focus groups). In: The SAGE Handbook of Social Science Methodology, pp. 578–594. SAGE Publications Ltd, London (2007). https://doi.org/10.4135/9781848607958.n32
Dineen, B.R., et al.: Perceived fairness of web-based applicant screening procedures: Weighing the rules of justice and the role of individual differences. Hum. Resour. Manage. 43(2–3), 127–145 (2004). https://doi.org/10.1002/hrm.20011
Elo, S., Kyngäs, H.: The qualitative content analysis process. J. Adv. Nurs. 62(1), 107–115 (2008). https://doi.org/10.1111/j.1365-2648.2007.04569.x
Florentine, S.: How artificial intelligence can eliminate bias in hiring. CIO Mag. (2016)
Folger, R., Greenberg, J.: Procedural justice: An interpretive analysis of personnel systems. Res. Pers. Hum. Resour. Manag. 3(1), 141–183 (1985)
Fried, I.: Exclusive: Many tech workers would quit if employer recorded them. Axios. https://www.axios.com/2022/05/31/tech-workers-quit-employer-recorded-surveillance (Accessed 20 June 2022)
Frith, H.: Focusing on sex: using focus groups in sex research. Sexualities 3(3), 275–297 (2000). https://doi.org/10.1177/136346000003003001
Gilliland, S.: The tails of justice: a critical examination of the dimensionality of organizational justice constructs. Hum. Resour. Manag. Rev. 18(4), 271–281 (2008). https://doi.org/10.1016/j.hrmr.2008.08.001
Gilliland, S.W.: Effects of procedural and distributive justice on reactions to a selection system. J. Appl. Psychol. 79(5), 691–701 (1994). https://doi.org/10.1037/0021-9010.79.5.691
Gilliland, S.W.: The perceived fairness of selection systems: an organizational justice perspective. AMR. 18(4), 694–734 (1993). https://doi.org/10.5465/amr.1993.9402210155
Greenberg, J.: Determinants of perceived fairness of performance evaluations. J. Appl. Psychol. 71(2), 340–342 (1986)
Hsieh, H.-F., Shannon, S.E.: Three Approaches to qualitative content analysis. Qual. Health Res. 15(9), 1277–1288 (2005). https://doi.org/10.1177/1049732305276687
Iles, P.A., Robertson, I.T.: The impact of personnel selection procedures on candidates. Assessment Select. Organiz., 257–271 (1989)
Jansen, B.J., et al.: Using the web to look for work: Implications for online job seeking and recruiting. Internet Res. 15(1), 49–66 (2005). https://doi.org/10.1108/10662240510577068
Kanara, K.: Council Post: Accelerating Through The Curve: How Value Creation Teams Help PE Firms Weather Economic Storms. https://www.forbes.com/sites/forbeshumanresourcescouncil/2020/05/21/accelerating-through-the-curve-how-value-creation-teams-help-pe-firms-weather-economic-storms/ (Accessed 19 Jan 2022)
Kim, P.T.: Data-driven discrimination at work. Wm. & Mary L. Rev. 58(3), 857–936 (2016)
Kirat, T. et al.: Fairness and Explainability in Automatic Decision-Making Systems. A challenge for computer science and law. (2022)
Kirkpatrick, K.: Battling algorithmic bias: how do we ensure algorithms treat us fairly? Commun. ACM 59(10), 16–17 (2016). https://doi.org/10.1145/2983270
Kitzinger, J.: The methodology of Focus Groups: the importance of interaction between research participants. Sociol. Health Illn. 16(1), 103–121 (1994). https://doi.org/10.1111/1467-9566.ep11347023
Köchling, A., Wehner, M.C.: Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development. Bus. Res. 13(3), 795–848 (2020). https://doi.org/10.1007/s40685-020-00134-w
Konradt, U., et al.: Fairness Perceptions in Web-based Selection: Impact on applicants’ pursuit intentions, recommendation intentions, and intentions to reapply. Int. J. Sel. Assess. 21(2), 155–169 (2013). https://doi.org/10.1111/ijsa.12026
Konradt, U., et al.: Patterns of change in fairness perceptions during the hiring process. Int. J. Sel. Assess. 24(3), 246–259 (2016). https://doi.org/10.1111/ijsa.12144
Kulkarni, S., Che, X.: Intelligent software tools for recruiting. J. Inter. Technol. Inform. Manag. 28(2), 2–16 (2019)
Langer, M., et al.: Highly automated job interviews: acceptance under the influence of stakes. Int. J. Sel. Assess. 27(3), 217–234 (2019). https://doi.org/10.1111/ijsa.12246
Langer, M., et al.: Information as a double-edged sword: the role of computer experience and information on applicant reactions towards novel technologies for personnel selection. Comput. Hum. Behav. 81, 19–30 (2018). https://doi.org/10.1016/j.chb.2017.11.036
Lee, M.K. et al.: Procedural justice in algorithmic fairness: leveraging transparency and outcome control for fair algorithmic mediation. In: Proceedings of ACM Human-Computer Interaction, CSCW, vol. 3, pp. 182:1–182:26 (2019). https://doi.org/10.1145/3359284
Lee, M.K.: Understanding perception of algorithmic decisions: fairness, trust, and emotion in response to algorithmic management. Big Data Soc. 5(1), 2053951718756684 (2018). https://doi.org/10.1177/2053951718756684
Leventhal, G.S.: What should be done with equity theory? In: Social Exchange, pp. 27–55. Springer US, New York (1980). https://doi.org/10.1007/978-1-4613-3087-5_2
Li, D. et al.: Hiring as Exploration (2020). https://papers.ssrn.com/abstract=3630630, https://doi.org/10.2139/ssrn.3630630.
Mann, G., O’Neil, C.: Hiring algorithms are not neutral. Harvard Bus. Rev. (2016)
Martin, C.L., Nagao, D.H.: Some effects of computerized interviewing on job applicant responses. J. Appl. Psychol. 74(1), 72–80 (1989). https://doi.org/10.1037/0021-9010.74.1.72
McCarthy, J.M., et al.: Applicant perspectives during selection: a review addressing “so what?”, “what’s new?”, and “where to next?” J. Manag. 43(6), 1693–1725 (2017). https://doi.org/10.1177/0149206316681846
Miller, C.C.: Can an algorithm hire better than a human? The New York Times, 25 June (2015)
Nyagadza, B., et al.: Emotions influence on customers’ e-banking satisfaction evaluation in e-service failure and e-service recovery circumstances. Soc. Sci. Humanities Open. 6, 1–14 (2022). https://doi.org/10.1016/j.ssaho.2022.100292
Oates, C.: Research training for social scientists. SAGE Publications, London (2022). https://doi.org/10.4135/9780857028051
O’Neil, C.: Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown, New York (2016)
Otterbacher, J. et al.: Competent men and warm women: gender stereotypes and backlash in image search results. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 6620–6631 Association for Computing Machinery, New York (2017). https://doi.org/10.1145/3025453.3025727
Powell, R.A., Single, H.M.: Focus Groups. Inter. J. Quality Health Care 8(5), 499–504 (1996). https://doi.org/10.1093/intqhc/8.5.499
Quillian, L., et al.: Meta-analysis of field experiments shows no change in racial discrimination in hiring over time. PNAS 114(41), 10870–10875 (2017). https://doi.org/10.1073/pnas.1706255114
Raub, M.: Bots, bias and big data: artificial intelligence, algorithmic bias and disparate impact liability in hiring practices comment. Ark. L. Rev. 71(2), 529–570 (2018)
Rooney, K., Khorram, Y.: Tech companies say they value diversity but reports show little change in last six years. CNBC (2020)
Rosenbaum, S., et al.: Focus groups in HCI. Presented at the CHI 2002 Extended Abstracts on Human Factors in Computing Systems - CHI 2002 (2002). https://doi.org/10.1145/506443.506554
Roth, P.L., et al.: Ethnic group differences in measures of job performance: a new meta-analysis. J. Appl. Psychol. 88(4), 694–706 (2003). https://doi.org/10.1037/0021-9010.88.4.694
RoyChowdhury, T., Srimannarayana, M.: Applicants’ perceptions on online recruitment procedures. Manag. Labour Stud. 38(3), 185–199 (2013). https://doi.org/10.1177/0258042X13509737
Ryan, A.M., Huth, M.: Not much more than platitudes? a critical look at the utility of applicant reactions research. Hum. Resour. Manag. Rev. 18(3), 119–132 (2008). https://doi.org/10.1016/j.hrmr.2008.07.004
Rynes, S.L., et al.: The importance of recruitment in job choice: a different way of looking. Pers. Psychol. 44(3), 487–521 (1991). https://doi.org/10.1111/j.1744-6570.1991.tb02402.x
Rynes, S.L., Barber, A.E.: Applicant attraction strategies: an organizational perspective. AMR. 15(2), 286–310 (1990). https://doi.org/10.5465/amr.1990.4308158
Sandvig, C., et al.: Auditing algorithms: Research methods for detecting discrimination on internet platforms. Data a Discriminat. Converting Critic. Concerns Productive Inquiry. 22, 4349–4357 (2014)
Schinkel, S., et al.: Selection fairness and outcomes: a field study of interactive effects on applicant reactions. Int. J. Sel. Assess. 21(1), 22–31 (2013). https://doi.org/10.1111/ijsa.12014
Schuler, H.: Social validity of selection situations: a concept and some empirical results (1993)
Sheppard, B.H., Lewicki, R.J.: Toward general principles of managerial fairness. Soc Just Res. 1(2), 161–176 (1987). https://doi.org/10.1007/BF01048014
Smithson, J.: Using and analysing focus groups: limitations and possibilities. Int. J. Soc. Res. Methodol. 3(2), 103–119 (2000). https://doi.org/10.1080/136455700405172
Stanton, J.M., Stam, K.R.: The visible employee: using workplace monitoring and surveillance to protect information assets--without compromising employee privacy or trust. Information Today, Medford, N.J (2006)
Stone, D.L., et al.: The influence of technology on the future of human resource management. Hum. Resour. Manag. Rev. 25(2), 216–231 (2015). https://doi.org/10.1016/j.hrmr.2015.01.002
Strohmeier, S.: Research in e-HRM: review and implications. Hum. Resour. Manag. Rev. 17(1), 19–37 (2007). https://doi.org/10.1016/j.hrmr.2006.11.002
Thibaut, J.W., Walker, L.: Procedural justice: a psychological analysis. L. Erlbaum Associates (1975)
Thielsch, M.T., et al.: E-recruiting and fairness: the applicant’s point of view. Inf. Technol. Manag. 13(2), 59–67 (2012). https://doi.org/10.1007/s10799-012-0117-x
Truxillo, D.M., et al.: Selection fairness information and applicant reactions: a longitudinal field study. J. Appl. Psychol. 87(6), 1020–1031 (2002). https://doi.org/10.1037/0021-9010.87.6.1020
Truxillo, D.M., et al.: The importance of organizational justice in personnel selection: defining when selection fairness really matters. Int. J. Sel. Assess. 12(1–2), 39–53 (2004). https://doi.org/10.1111/j.0965-075X.2004.00262.x
Tyler, T.R., Bies, R.J.: Beyond formal procedures: the interpersonal context of procedural justice. In: Applied Social Psychology and Organizational Settings, pp. 77–98 (1990)
Vaughn, S. et al.: Why use focus group interviews in educational and psychological research. Focus Group Interv. Educ. Psychol., 12–21 (1996)
Walker, H.J., et al.: Watch what you say: job applicants’ justice perceptions from initial organizational correspondence. Hum. Resour. Manage. 54(6), 999–1011 (2015). https://doi.org/10.1002/hrm.21655
Wang, R. et al.: Factors influencing perceived fairness in algorithmic decision-making: algorithm outcomes, development procedures, and individual differences. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. pp. 1–14 Association for Computing Machinery, New York (2020). https://doi.org/10.1145/3313831.3376813
West, S.M., et al.: Discriminating systems. AI Now (2019)
Wiechmann, D., Ryan, A.M.: Reactions to computerized testing in selection contexts. Int. J. Sel. Assess. 11(2–3), 215–229 (2003). https://doi.org/10.1111/1468-2389.00245
Willard, G., et al.: Some evidence for the nonverbal contagion of racial bias. Organ. Behav. Hum. Decis. Process. 128, 96–107 (2015). https://doi.org/10.1016/j.obhdp.2015.04.002
Williams, B.A., et al.: How Algorithms discriminate based on data they lack: challenges, solutions, and policy implications. J. Inf. Policy 8, 78–115 (2018). https://doi.org/10.5325/jinfopoli.8.2018.0078
Wilson, C., et al.: Building and auditing fair algorithms: a case study in candidate screening. In: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, pp. 666–677. Association for Computing Machinery, New York (2021). https://doi.org/10.1145/3442188.3445928
Woodruff, A. et al.: A qualitative exploration of perceptions of algorithmic fairness. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1–14 Association for Computing Machinery, New York (2018)
Yarger, L.K., et al.: Algorithmic equity in the hiring of underrepresented IT job candidates. Online Inf. Rev. (2020). https://doi.org/10.1108/oir-10-2018-0334
Zhang, L., Yencha, C.: Examining perceptions towards hiring algorithms. Technol. Soc. 68, 101848 (2022). https://doi.org/10.1016/j.techsoc.2021.101848
Zhang, T., et al.: Working from home: small business performance and the COVID-19 pandemic. Small Bus. Econ. 58(2), 611–636 (2022). https://doi.org/10.1007/s11187-021-00493-6
Zorn, T.E., et al.: Focus groups as sites of influential interaction: building communicative self-efficacy and effecting attitudinal change in discussing controversial topics. J. Appl. Commun. Res. 34(2), 115–140 (2006). https://doi.org/10.1080/00909880600573965
Zou, J., Schiebinger, L.: AI can be sexist and racist — it’s time to make it fair. Nature 559(7714), 324–326 (2018). https://doi.org/10.1038/d41586-018-05707-8
US Bureau of Labor Statistics: Monthly Labor Review: 2021. https://www.bls.gov/opub/mlr/2021/home.htm (Accessed 21 Jan 2022)
McKinsey & Company: The postpandemic workforce: responses to a McKinsey global survey of 800 executives. https://www.mckinsey.com/featured-insights/future-of-work/what-800-executives-envision-for-the-postpandemic-workforce (Accessed 19 Jan 2022)
CompTIA: Workforce and Learning Trends 2021. https://connect.comptia.org/content/research//workforce-learning-trends-2021 (Accessed 19 Jan 2022)
Acknowledgment
This material is based upon work supported by the National Science Foundation under Grant Number 1841368. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Girona, A.E., Yarger, L. (2024). To Impress an Algorithm: Minoritized Applicants’ Perceptions of Fairness in AI Hiring Systems. In: Sserwanga, I., et al. Wisdom, Well-Being, Win-Win. iConference 2024. Lecture Notes in Computer Science, vol 14597. Springer, Cham. https://doi.org/10.1007/978-3-031-57860-1_4
Print ISBN: 978-3-031-57859-5
Online ISBN: 978-3-031-57860-1