Abstract
Agent abuse is emerging as a significant concern in human-chatbot interaction. Despite its social and psychological relevance, the phenomenon has received relatively little published attention over the years, calling for a comprehensive understanding of the challenges posed by abusive behavior toward conversational agents. Following the PRISMA protocol, this review systematizes current scientific knowledge on chatbot abuse, identifying and evaluating research published between January 1989 and July 2023 across two databases. The review sheds light on the diverse studies that have contributed to defining and operationalizing chatbot abuse, and explores avenues for developing evidence-based interventions to discourage verbal mistreatment of conversational agents. Building on empirical, theoretical, and conceptual work, it raises awareness of chatbot mistreatment and advances the scientific community's understanding of the complexities surrounding chatbot abuse and its possible implications. In doing so, the study fosters a more ethical, respectful, and empathetic approach toward conversational agents in the digital landscape; given the cross-cutting and cross-cultural nature of the issue, the author calls for further empirical research on the topic.
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
De Cicco, R. (2024). Exploring the Dark Corners of Human-Chatbot Interactions: A Literature Review on Conversational Agent Abuse. In: Følstad, A., et al. Chatbot Research and Design. CONVERSATIONS 2023. Lecture Notes in Computer Science, vol 14524. Springer, Cham. https://doi.org/10.1007/978-3-031-54975-5_11
Print ISBN: 978-3-031-54974-8
Online ISBN: 978-3-031-54975-5