It’s a Long Way to Neutrality. An Evaluation of Gendered Artificial Faces

  • Conference paper
  • In: Design, User Experience, and Usability (HCII 2023)

Abstract

Implementing gender-neutral virtual agents seems to be one possible solution to the problem of designing technologies that do not represent and convey gender stereotypes. Three tests were designed to select the faces of hypothetical male, female, and gender-neutral virtual agents. In each test, 30 participants assessed the gender and age of 9 hypothetical virtual agent faces through an online questionnaire. On the basis of these results, 3 faces were selected, one male, one female, and one neutral, and were rated in a further online questionnaire (N = 83) on feminine and masculine personality characteristics. Participants' willingness to interact with artificial agents bearing those faces, and the pleasantness of such interaction, were also assessed. The results highlight the difficulty of synthesizing faces that are perceived as completely neutral. Evaluations of the most gender-neutral stimulus were less likely to refer to a female stereotype. The gender-neutral face was also less accepted and liked than the male stimulus on all aspects considered and, on fewer aspects, than the female stimulus as well.
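The selection logic implied by the first phase of the study can be sketched as follows. This is only an illustrative reconstruction, not the authors' procedure: the 7-point rating scale, its midpoint, the face identifiers, and the function name are all assumptions introduced here for the example.

```python
# Hypothetical sketch of stimulus selection: each face receives gender
# ratings from raters (assumed scale: 1 = very masculine, 7 = very
# feminine). The faces with the most extreme mean ratings serve as the
# male and female stimuli; the face closest to the scale midpoint
# serves as the candidate gender-neutral stimulus.
from statistics import mean

def select_stimuli(ratings, midpoint=4.0):
    """ratings: dict mapping a face id to a list of gender ratings."""
    means = {face: mean(vals) for face, vals in ratings.items()}
    male = min(means, key=means.get)      # lowest mean -> most masculine
    female = max(means, key=means.get)    # highest mean -> most feminine
    neutral = min(means, key=lambda f: abs(means[f] - midpoint))
    return male, female, neutral

# Toy data for three faces rated by four hypothetical participants
ratings = {
    "face_a": [1, 2, 1, 2],
    "face_b": [6, 7, 7, 6],
    "face_c": [4, 4, 3, 5],
}
print(select_stimuli(ratings))  # -> ('face_a', 'face_b', 'face_c')
```

As the paper's results suggest, a face whose mean rating sits at the midpoint is not guaranteed to be *perceived* as neutral; the midpoint criterion used here is merely the simplest operationalization.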



Author information

Correspondence to Oronzo Parlangeli.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Parlangeli, O., Palmitesta, P., Masi, L., Tittarelli, M., Guidi, S. (2023). It’s a Long Way to Neutrality. An Evaluation of Gendered Artificial Faces. In: Marcus, A., Rosenzweig, E., Soares, M.M. (eds) Design, User Experience, and Usability. HCII 2023. Lecture Notes in Computer Science, vol 14033. Springer, Cham. https://doi.org/10.1007/978-3-031-35708-4_27

  • DOI: https://doi.org/10.1007/978-3-031-35708-4_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35707-7

  • Online ISBN: 978-3-031-35708-4

  • eBook Packages: Computer Science, Computer Science (R0)
