
Can Chatbot Anthropomorphism and Empathy Mitigate the Impact of Customer Anger on Satisfaction?

  • Conference paper
Wisdom, Well-Being, Win-Win (iConference 2024)

Abstract

When customers initiate inquiries with negative emotions following a service failure, whether chatbot service agents can alleviate the undesirable outcomes of those emotions poses a significant challenge for researchers and practitioners. Drawing upon the computers are social actors (CASA) framework, this study examines how chatbot anthropomorphism and empathy features function as emotion-focused service recovery to mitigate the adverse effect of customer anger on satisfaction. The model is validated through a three-way factorial between-subjects experiment. The results demonstrate that anger negatively affects satisfaction; chatbots with empathy features can mitigate this negative impact, whereas chatbots with anthropomorphism features cannot. The findings provide theoretical and practical implications for designing chatbot service agents to achieve emotion-focused service recovery.
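To make the reported design concrete, the sketch below simulates data for a hypothetical 2 (anger) x 2 (anthropomorphism) x 2 (empathy) between-subjects experiment and fits a full-factorial regression in which the anger-by-empathy interaction is the moderation test of interest. The abstract does not specify the factor levels, satisfaction measure, sample size, or analysis method, so all of those details (and the variable names anger, anthro, empathy, satisfaction) are assumptions made for illustration, not the authors' actual procedure or data.

    # Illustrative sketch only: assumes a 2x2x2 between-subjects design
    # with a continuous satisfaction score; the simulated effect pattern
    # merely mirrors the abstract's findings (anger lowers satisfaction,
    # empathy attenuates the drop, anthropomorphism does not).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n_per_cell = 40
    rows = []
    for anger in (0, 1):
        for anthro in (0, 1):
            for empathy in (0, 1):
                # Hypothetical cell means consistent with the reported results.
                mean = 5.5 - 1.2 * anger + 0.8 * anger * empathy
                for score in rng.normal(mean, 1.0, n_per_cell):
                    rows.append({"anger": anger, "anthro": anthro,
                                 "empathy": empathy, "satisfaction": score})
    df = pd.DataFrame(rows)

    # Full-factorial OLS model; the C(anger):C(empathy) coefficient is the
    # moderation (emotion-focused recovery) test of interest.
    model = smf.ols("satisfaction ~ C(anger) * C(anthro) * C(empathy)",
                    data=df).fit()
    print(model.summary())

In such a model, a significant positive anger-by-empathy interaction alongside a non-significant anger-by-anthropomorphism interaction would correspond to the pattern the abstract describes.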



Acknowledgement

This work was partially supported by the National Natural Science Foundation of China (71904215, 72072194), the Social Science Fund Research Base Project of Beijing (19JDGLB029), and the Young Talents Support Program from the Central University of Finance and Economics (No: QYP2211).

Author information


Correspondence to Jian Tang, Xinxue Zhou or Chenguang Li.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Tang, J., Wang, Y., Zhou, X., Guo, J., Li, C. (2024). Can Chatbot Anthropomorphism and Empathy Mitigate the Impact of Customer Anger on Satisfaction? In: Sserwanga, I., et al. Wisdom, Well-Being, Win-Win. iConference 2024. Lecture Notes in Computer Science, vol 14596. Springer, Cham. https://doi.org/10.1007/978-3-031-57850-2_7


  • DOI: https://doi.org/10.1007/978-3-031-57850-2_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-57849-6

  • Online ISBN: 978-3-031-57850-2

  • eBook Packages: Computer Science (R0)
