Empathy for Artificial Agents

International Journal of Social Robotics

Abstract

This paper has three goals. First, it introduces different notions of empathy and related capacities, such as emotional contagion, affective empathy, cognitive empathy, and sympathy. Second, it presents a case in point, the intelligent tutoring system Affective AutoTutor, whose affect-sensitive behavior seems to enhance the outcome of its interactions with its students. Affective AutoTutor appears to behave empathically within a well-defined learning environment. Third, attention is directed to the requirements artificial empathizers must meet to be judged empathizers tout court by their social interactants, even when acting in unspecified social situations. To be a convincing empathizer, an artificial agent would not only need to grasp the emotional states of its interaction partners and understand their situation with respect to an adequate world model, but also communicate its own affective states. Finally, an artificial empathizer should be ready to react appropriately to its interaction partner’s reciprocal empathy.
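
The abstract's four requirements can be made vivid with a deliberately toy sketch. Nothing below comes from the paper: the cue names, emotion labels, and rules are my own illustrative assumptions, standing in for whatever perception, world-modeling, and expression machinery a real artificial empathizer would need.

```python
# Toy sketch of the four requirements named in the abstract; all rules,
# cue names, and emotion labels are illustrative assumptions, not the
# paper's (unstated) implementation.

def grasp_affect(cues: dict) -> str:
    """Requirement 1: infer the partner's emotional state from observed cues."""
    if cues.get("frown") and cues.get("long_pause"):
        return "frustration"
    if cues.get("yawn"):
        return "boredom"
    return "neutral"

def appraise(emotion: str, world_model: dict) -> str:
    """Requirement 2: understand the partner's situation via a world model."""
    if emotion == "frustration" and world_model.get("task_difficulty") == "high":
        return "the partner struggles with a genuinely hard task"
    return f"the partner appears {emotion}"

def express_own_affect(appraisal: str) -> str:
    """Requirement 3: communicate the agent's own affective state."""
    return f"I feel for you; {appraisal}, and that concerns me too."

def handle_reciprocity(partner_reply: str) -> str:
    """Requirement 4: react appropriately to the partner's reciprocal empathy."""
    if "you too" in partner_reply.lower():
        return "Yes, thank you for noticing. Shall we try a simpler step together?"
    return "Let's keep going."

# One interaction cycle.
emotion = grasp_affect({"frown": True, "long_pause": True})
print(express_own_affect(appraise(emotion, {"task_difficulty": "high"})))
print(handle_reciprocity("Oh, you too?"))
```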


Notes

  1. In their elaborate introduction [3], Coplan and Goldie offer a comprehensive overview of both the history of the term and its more recent use in a large variety of fields such as phenomenology and hermeneutics, clinical psychology, developmental and social psychology, care ethics, ethology and neuroscience.

  2. In other areas of science, different notions of empathy might seem more appropriate. Preston and de Waal, for example, think that many distinctions, such as those provided above, have been “overemphasized to the point of distraction” [4, p. 2]. Instead, they define empathy broadly as “any process where the attended perception of the object generates a state in the subject that is more applicable to the object’s state or situation than to the subject’s own prior state or situation” [4, p. 4]. To discuss the prospects of artificial empathizers, however, we need “more specificity, not more generality”, as Coplan has convincingly stressed, albeit in a different context [5, p. 5].

  3. It is rather this performative idea of empathy that Ratcliffe [7] and Dullstein [1] are after, and to which I will return in the last section of this paper.

  4. See, however, Coplan, who explicitly argues that the empathic observer must experience the same type of emotion [5, pp. 6–7].

  5. With respect to specific mental disorders, psychiatric studies show remarkable dissociations between the capacities for cognitive and affective (here: emotional) empathy. While individuals with Narcissistic Personality Disorder display significant impairments in emotional empathy but show no deficits in cognitive empathy [11], individuals with Asperger Syndrome seem to be impaired in cognitive empathy yet do not differ from controls in emotional empathy [12].

  6. For an introduction, see http://www.autotutor.org/.

  7. For my purposes it suffices to consider the behavioral capacities of Affective AutoTutor. Those interested in the algorithmic and implementation levels of its architecture will find detailed information in [16], with further links to original sources.

  8. Currently, these two response patterns are fixed from the start. A future goal could be to provide Affective AutoTutor with adequate learning routines so that it can assess on its own when, and for whom, to use which pedagogical strategy, and refine these strategies further (for one way such a routine might look, see the illustrative sketch after these notes).

  9. Despite the undeniable difference between the Supportive and the Shakeup AutoTutor, even the latter’s responses sound rather supportive to the ears of a German tutor! This may be due to cultural differences in learning environments.

  10. Illustrations of such different facial responses can be found, e.g., in [16, p. 21].

  11. Actually, the original AutoTutor already passed a bystander Turing test [19].

  12. This is a different and even more demanding task than those already approached in recent studies, where the goal is to equip social robots with empathic and memory skills in order to foster long-term human-robot interactions [20, 21].

  13. This is also acknowledged for human-robot interactions in which the robot asks for assistance in order to better achieve one of its own goals [23].
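
To make note 8 concrete: the following is a minimal sketch, entirely my own assumption rather than anything implemented in Affective AutoTutor, of how a learning routine might decide when, and for whom, to use the Supportive or the Shakeup strategy. It uses a simple epsilon-greedy bandit that keeps a running average of observed learning outcomes per learner profile; the profile names and reward values are hypothetical.

```python
import random
from collections import defaultdict

# Hypothetical strategy names matching the two response patterns in the text.
STRATEGIES = ("supportive", "shakeup")

class StrategySelector:
    """Epsilon-greedy choice between pedagogical strategies, per learner profile."""

    def __init__(self, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.value = defaultdict(float)  # running mean reward per (profile, strategy)
        self.count = defaultdict(int)    # observations per (profile, strategy)

    def choose(self, profile: str) -> str:
        # Explore occasionally; otherwise exploit the best-scoring strategy so far.
        if random.random() < self.epsilon:
            return random.choice(STRATEGIES)
        return max(STRATEGIES, key=lambda s: self.value[(profile, s)])

    def update(self, profile: str, strategy: str, reward: float) -> None:
        # Incremental mean: fold the session outcome (e.g. a post-test gain) in.
        key = (profile, strategy)
        self.count[key] += 1
        self.value[key] += (reward - self.value[key]) / self.count[key]

# One hypothetical tutoring session.
selector = StrategySelector()
chosen = selector.choose(profile="low_prior_knowledge")
selector.update("low_prior_knowledge", chosen, reward=0.7)  # invented outcome score
```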

References

  1. Dullstein M (2013) Einfühlung und Empathie. In: Breyer T (ed) Grenzen der Empathie. Wilhelm Fink Verlag, München, pp 93–107

  2. Stueber K (2013) Empathy. In: Zalta EN (ed) Stanford encyclopedia of philosophy (Summer 2013 edition, PDF version), pp 1–67. http://plato.stanford.edu/archives/sum2013/entries/empathy/

  3. Coplan A, Goldie P (2011) Introduction. In: Coplan A, Goldie P (eds) Empathy: philosophical and psychological perspectives. Oxford University Press, Oxford, pp IX–XLVII

  4. Preston SD, de Waal FBM (2002) Empathy: its ultimate and proximate bases. Behav Brain Sci 25:1–72

  5. Coplan A (2011) Understanding empathy: its features and effects. In: Coplan A, Goldie P (eds) Empathy: philosophical and psychological perspectives. Oxford University Press, Oxford, pp 3–18

  6. Walter H (2012) Social cognitive neuroscience of empathy: concepts, circuits, and genes. Emot Rev 4:9–17

  7. Ratcliffe M (2015) Depression and empathy, chap 9. In: Experiences of depression: a study in phenomenology. Oxford University Press, Oxford

  8. Goleman D (2007) Three kinds of empathy: cognitive, emotional, compassionate. http://www.danielgoleman.info/three-kinds-of-empathy-cognitive-emotional-compassionate/. Accessed 2 June 2014

  9. Darwall S (1998) Empathy, sympathy, and care. Philos Stud 89:261–282

  10. Batson CD, Fultz J, Schoenrade P (1987) Distress and empathy: two qualitatively distinct vicarious emotions with different motivational consequences. J Pers 55:19–39

  11. Ritter K, Dziobek I, Preißler S, Rüter A, Vater A, Fydrich T, Lammers C-H, Heekeren HR, Roepke S (2011) Lack of empathy in patients with narcissistic personality disorder. Psychiatry Res 187:241–247

  12. Dziobek I, Rogers K, Fleck S, Bahnemann M, Heekeren HR, Wolf OT, Convit A (2008) Dissociation of cognitive and emotional empathy in adults with Asperger syndrome using the Multifaceted Empathy Test (MET). J Autism Dev Disord 38:464–473

  13. Lim MY, Aylett R, Jones CM (2005) Empathic interaction with a virtual guide. In: Proceedings of the joint symposium on virtual social agents (AISB 2005 Convention), pp 122–129

  14. Nagai Y, Tanioka T, Fuji S, Yasuhara Y, Sakamaki S, Taoka N, Locsin RC, Ren F, Matsumoto K (2010) Needs and challenges of care robots in nursing care setting: a literature review. In: International conference on natural language processing and knowledge engineering (NLP–KE). doi:10.1109/NLPKE.2010.5587815

  15. el Kaliouby R, Picard R, Baron-Cohen S (2006) Affective computing and autism. Ann NY Acad Sci 1093:228–248

  16. D’Mello S, Graesser A (2012) AutoTutor and affective AutoTutor: learning by talking with cognitively and emotionally intelligent computers that talk back. ACM Trans Interact Intell Syst 2(4):39. doi:10.1145/2395123.2395128

  17. Ekman P (1992) An argument for basic emotions. Cognit Emot 6:169–200

  18. Scarantino A, Griffiths P (2011) Don’t give up on basic emotions. Emot Rev 3:444–454

  19. Person N, Graesser A (2002) Human or computer? AutoTutor in a bystander Turing test. In: Cerri SA, Gouarderes G, Paraguacu F (eds) Proceedings of the 6th international conference on intelligent tutoring systems. Springer, London, pp 821–830

  20. Leite I, Martinho C, Paiva A (2013) Social robots for long-term interaction: a survey. Int J Soc Robot 5:291–308

  21. Leite I, Castellano G, Pereira A, Martinho C, Paiva A (2014) Empathic robots for long-term interaction: evaluating social presence, engagement and perceived support in children. Int J Soc Robot 6:329–341

  22. Niculescu A, van Dijk B, Nijholt A, Li H (2014) Making social robots more attractive: the effects of voice pitch, humor and empathy. Int J Soc Robot 5:171–191

  23. Kühnlenz B, Sosnowski S, Buß M, Wollherr D, Kühnlenz K, Buss M (2013) Increasing helpfulness towards a robot by emotional adaption to the user. Int J Soc Robot 5:457–476

  24. Ratcliffe M (2015) Empathy is exploration, not simulation (forthcoming)

  25. Stephan A, Walter S (2013) Handbuch Kognitionswissenschaft. Metzler Verlag, Stuttgart

  26. Stephan A, Walter S, Wilutzky W (2014) Emotions beyond brain and body. Philos Psychol 27(1):65–81

  27. Stephan A (2012) Emotions, existential feelings, and their regulation. Emot Rev 4(2):157–162

Author information

Correspondence to Achim Stephan.

Cite this article

Stephan, A. Empathy for Artificial Agents. Int J of Soc Robotics 7, 111–116 (2015). https://doi.org/10.1007/s12369-014-0260-0
