Do Perceived Gender Biases in Retrieval Results Affect Relevance Judgements?

  • Conference paper
Advances in Bias and Fairness in Information Retrieval (BIAS 2022)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1610)

Abstract

This work investigates the effect of gender-stereotypical biases in the content of retrieved results on the relevance judgements of users/annotators. In particular, since relevance in information retrieval (IR) is a multi-dimensional concept, we study whether the value and quality of the retrieved documents for bias-sensitive queries are judged differently when the content of the documents represents different genders. To this aim, we conduct a set of experiments in which the participants’ genders are known, as well as experiments in which they are not specified. The experiments comprise retrieval tasks in which participants give rated relevance judgements for various combinations of search queries and search result documents. The shown documents contain different gender indications and are either relevant or non-relevant to the query. The results show differences between the average judged relevance scores of documents with different gender contents. Our work initiates further research on the connection between users’ perception of gender stereotypes and their judgements, as well as its effects on IR systems, and aims to raise awareness of possible biases in this domain.


Notes

  1. Bias-sensitive refers to a gender-neutral query whose bias in its retrieval results is considered socially problematic [16, 23].

References

  1. Bajaj, P., et al.: MS MARCO: a human generated machine reading comprehension dataset. arXiv:1611.09268 [cs], October 2018

  2. Behm-Morawitz, E., Mastro, D.: The effects of the sexualization of female video game characters on gender stereotyping and female self-concept. Sex Roles 61(11–12), 808–823 (2009). https://doi.org/10.1007/s11199-009-9683-8

  3. Bonart, M., Samokhina, A., Heisenberg, G., Schaer, P.: An investigation of biases in web search engine query suggestions. Online Inf. Rev. 44(2), 365–381 (2019)

  4. Chang, K.W., Prabhakaran, V., Ordonez, V.: Bias and fairness in natural language processing. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): Tutorial Abstracts (2019)

  5. Chen, L., Ma, R., Hannák, A., Wilson, C.: Investigating the impact of gender on rank in resume search engines. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1–14 (2018)

  6. Cohen, G.L., Garcia, J.: “I am us”: negative stereotypes as collective threats. J. Pers. Soc. Psychol. 89(4), 566 (2005)

  7. Craswell, N., Mitra, B., Yilmaz, E., Campos, D., Voorhees, E.M.: Overview of the TREC 2019 deep learning track. arXiv preprint arXiv:2003.07820 (2020)

  8. Danks, D., London, A.J.: Algorithmic bias in autonomous systems. In: Proceedings of the 26th International Joint Conference on Artificial Intelligence, pp. 4691–4697 (2017)

  9. Fabris, A., Purpura, A., Silvello, G., Susto, G.A.: Gender stereotype reinforcement: measuring the gender bias conveyed by ranking algorithms. Inf. Process. Manag. 57(6), 102377 (2020)

  10. Gerhart, S.: Do web search engines suppress controversy? First Monday 9(1) (2004). https://doi.org/10.5210/fm.v9i1.1111

  11. Gezici, G., Lipani, A., Saygin, Y., Yilmaz, E.: Evaluation metrics for measuring bias in search engine results. Inf. Retrieval J. 24(2), 85–113 (2021). https://doi.org/10.1007/s10791-020-09386-w

  12. Glick, P., Fiske, S.T.: Sexism and other “isms”: independence, status, and the ambivalent content of stereotypes. In: Sexism and Stereotypes in Modern Society: The Gender Science of Janet Taylor Spence, pp. 193–221. American Psychological Association (1999). https://doi.org/10.1037/10277-008

  13. Hentschel, T., Heilman, M.E., Peus, C.V.: The multiple dimensions of gender stereotypes: a current look at men’s and women’s characterizations of others and themselves. Front. Psychol. 10, 11 (2019)

  14. Kay, M., Matuszek, C., Munson, S.A.: Unequal representation and gender stereotypes in image search results for occupations. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 3819–3828 (2015)

  15. Kordzadeh, N., Ghasemaghaei, M.: Algorithmic bias: review, synthesis, and future research directions. Eur. J. Inf. Syst. 31(3), 388–409 (2021)

  16. Krieg, K., Parada-Cabaleiro, E., Medicus, G., Lesota, O., Schedl, M., Rekabsaz, N.: Grep-BiasIR: a dataset for investigating gender representation-bias in information retrieval results. arXiv:2201.07754 [cs] (2022)

  17. Melchiorre, A.B., Rekabsaz, N., Parada-Cabaleiro, E., Brandl, S., Lesota, O., Schedl, M.: Investigating gender fairness of recommendation algorithms in the music domain. Inf. Process. Manag. (2021). https://doi.org/10.1016/j.ipm.2021.102666

  18. Moss-Racusin, C.A., Phelan, J.E., Rudman, L.A.: When men break the gender rules: status incongruity and backlash against modest men. Psychol. Men Masculinity 11(2), 140 (2010)

  19. Otterbacher, J., Bates, J., Clough, P.: Competent men and warm women: gender stereotypes and backlash in image search results. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 6620–6631 (2017)

  20. Otterbacher, J., Checco, A., Demartini, G., Clough, P.: Investigating user perception of gender bias in image search: the role of sexism. In: The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, pp. 933–936 (2018)

  21. Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., Granka, L.: In Google we trust: users’ decisions on rank, position, and relevance. J. Comput. Mediated Commun. 12(3), 801–823 (2007)

  22. Perez, C.C.: Invisible Women: Exposing Data Bias in a World Designed for Men. Random House (2019)

  23. Rekabsaz, N., Kopeinik, S., Schedl, M.: Societal biases in retrieved contents: measurement framework and adversarial mitigation of BERT rankers. In: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 306–316 (2021)

  24. Rekabsaz, N., Lesota, O., Schedl, M., Brassey, J., Eickhoff, C.: TripClick: the log files of a large health web search engine. In: Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 2507–2513. Association for Computing Machinery, New York, July 2021

  25. Rekabsaz, N., Schedl, M.: Do neural ranking models intensify gender bias? In: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 2065–2068 (2020)

  26. Rekabsaz, N., West, R., Henderson, J., Hanbury, A.: Measuring societal biases from text corpora with smoothed first-order co-occurrence. In: Proceedings of the Fifteenth International AAAI Conference on Web and Social Media, ICWSM 2021, Held Virtually, 7–10 June 2021, pp. 549–560. AAAI Press (2021)

  27. Sattler, K.M., Deane, F.P., Tapsell, L., Kelly, P.J.: Gender differences in the relationship of weight-based stigmatisation with motivation to exercise and physical activity in overweight individuals. Health Psychol. Open 5(1) (2018)

  28. Shah, H.: Algorithmic accountability. Philos. Trans. Roy. Soc. A Math. Phys. Eng. Sci. 376(2128), 20170362 (2018)

  29. Sherman, J.W.: Development and mental representation of stereotypes. J. Pers. Soc. Psychol. 70(6), 1126 (1996)

  30. Silva, S., Kenney, M.: Algorithms, platforms, and ethnic bias. Commun. ACM 62(11), 37–39 (2019)

  31. Simpson, J.A., Kenrick, D.T.: Evolutionary Social Psychology. Psychology Press (2014)

  32. Stangor, C., Jhangiani, R., Tarry, H., et al.: Principles of Social Psychology. BCcampus (2014)

  33. Steele, C.M., Aronson, J.: Stereotype threat and the intellectual test performance of African Americans. J. Pers. Soc. Psychol. 69(5), 797 (1995)

Acknowledgements

This work received financial support by the Austrian Science Fund (FWF): P33526 and DFH-23; and by the State of Upper Austria and the Federal Ministry of Education, Science, and Research, through grant LIT-2020-9-SEE-113 and LIT-2021-YOU-215. We thank Robert Bosch GmbH for providing financial support for the conference registration and travel costs of the first author.

Author information

Corresponding author

Correspondence to Klara Krieg.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Krieg, K., Parada-Cabaleiro, E., Schedl, M., Rekabsaz, N. (2022). Do Perceived Gender Biases in Retrieval Results Affect Relevance Judgements? In: Boratto, L., Faralli, S., Marras, M., Stilo, G. (eds) Advances in Bias and Fairness in Information Retrieval. BIAS 2022. Communications in Computer and Information Science, vol 1610. Springer, Cham. https://doi.org/10.1007/978-3-031-09316-6_10

  • DOI: https://doi.org/10.1007/978-3-031-09316-6_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-09315-9

  • Online ISBN: 978-3-031-09316-6

  • eBook Packages: Computer Science, Computer Science (R0)
