The Influence of Gender and Skin Colour on the Watchlist Imbalance Effect in Facial Identification Scenarios

  • Conference paper
  • In: Pattern Recognition, Computer Vision, and Image Processing. ICPR 2022 International Workshops and Challenges (ICPR 2022)

Abstract

It is well known that artificial intelligence systems can be biased. In biometric recognition this is a particularly sensitive issue, since biased algorithms may systematically discriminate against specific demographic groups, with severe consequences when criminal databases or blacklists are searched. In this context, the watchlist imbalance effect can induce additional performance differentials depending on the demographic composition of the target database. In this work, we utilise a demographically balanced subset of the FairFace database to evaluate the watchlist imbalance effect when combining the demographic attributes gender and skin colour. The results show that skin colour has a strong impact on differential performance, to the disadvantage of dark skin tones.
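
Since the abstract leans on the notion of differential performance in open-set search, a short illustration may help. The sketch below (illustrative names and data, not the paper's evaluation code) computes the false positive identification rate (FPIR) separately per demographic group: the watchlist imbalance effect predicts that a group's FPIR grows with the number of same-group subjects enrolled in the watchlist, so two groups can face different error rates at the same decision threshold.

```python
# Illustrative sketch (not the authors' code): per-group false positive
# identification rate (FPIR) in an open-set watchlist search, assuming
# precomputed L2-normalised face embeddings and demographic labels.
import numpy as np

def fpir_by_group(probe_emb, probe_group, watchlist_emb, threshold):
    """Fraction of (non-enrolled) probes per group whose best watchlist
    score still exceeds the threshold, i.e. a false positive."""
    scores = probe_emb @ watchlist_emb.T   # cosine similarity (unit vectors)
    best = scores.max(axis=1)              # rank-1 candidate score per probe
    return {g: float(np.mean(best[probe_group == g] >= threshold))
            for g in np.unique(probe_group)}

# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
unit = lambda x: x / np.linalg.norm(x, axis=1, keepdims=True)
probes = unit(rng.normal(size=(1000, 512)))
labels = rng.choice(["female-dark", "female-light", "male-dark", "male-light"], size=1000)
watchlist = unit(rng.normal(size=(100, 512)))
print(fpir_by_group(probes, labels, watchlist, threshold=0.1))
```

Repeating such a measurement while varying the demographic composition of the enrolled watchlist is, in essence, how the imbalance effect can be quantified.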

This research work has been funded by the German Federal Ministry of Education and Research and the Hessian Ministry of Higher Education, Research, Science and the Arts within their joint support of the National Research Center for Applied Cybersecurity ATHENE.


Notes

  1. Model: LResNet100E-IR,ArcFace@ms1m-refine-v2, https://github.com/deepinsight/insightface/wiki/Model-Zoo/6633390634bcf907c383cc6c90b62b6700df2a8e

  2. FaceQnet: https://github.com/uam-biometrics/FaceQnet
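
For orientation, the snippet below sketches how the two tools named above might be invoked today. It is a hedged illustration, not the authors' pipeline: it uses the insightface package's default ArcFace model pack rather than the exact LResNet100E-IR checkpoint, and assumes a FaceQnet_v1.h5 weights file obtained from the linked FaceQnet repository.

```python
# Hedged sketch of the tooling referenced in the notes; not the paper's
# exact setup. insightface ships a default ArcFace model pack (used here
# in place of the LResNet100E-IR checkpoint), and FaceQnet weights are
# assumed to be downloaded as FaceQnet_v1.h5 from the linked repository.
import cv2
import numpy as np
from insightface.app import FaceAnalysis
from tensorflow.keras.models import load_model

app = FaceAnalysis()        # bundles face detection + ArcFace recognition
app.prepare(ctx_id=-1)      # ctx_id=-1 runs on CPU

faceqnet = load_model("FaceQnet_v1.h5", compile=False)  # quality regressor

def arcface_embedding(path):
    """Return the L2-normalised ArcFace embedding of the largest face."""
    faces = app.get(cv2.imread(path))   # insightface expects BGR input
    if not faces:
        raise ValueError(f"no face detected in {path}")
    largest = max(faces, key=lambda f: (f.bbox[2] - f.bbox[0]) * (f.bbox[3] - f.bbox[1]))
    return largest.normed_embedding     # 512-dimensional unit vector

def faceqnet_quality(path):
    """Predict a scalar quality score for a 224x224 face crop; exact
    preprocessing may differ from the FaceQnet reference scripts."""
    crop = cv2.resize(cv2.imread(path), (224, 224)).astype(np.float32)
    return float(faceqnet.predict(crop[None, ...])[0, 0])
```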


Author information


Correspondence to Jascha Kolberg.


Copyright information

© 2023 Springer Nature Switzerland AG

About this paper


Cite this paper

Kolberg, J., Rathgeb, C., Busch, C. (2023). The Influence of Gender and Skin Colour on the Watchlist Imbalance Effect in Facial Identification Scenarios. In: Rousseau, JJ., Kapralos, B. (eds) Pattern Recognition, Computer Vision, and Image Processing. ICPR 2022 International Workshops and Challenges. ICPR 2022. Lecture Notes in Computer Science, vol 13643. Springer, Cham. https://doi.org/10.1007/978-3-031-37660-3_33

  • DOI: https://doi.org/10.1007/978-3-031-37660-3_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-37659-7

  • Online ISBN: 978-3-031-37660-3

  • eBook Packages: Computer Science, Computer Science (R0)
