Why We Need Emotional Intelligence in the Design of Autonomous Social Robots and How Confucian Moral Sentimentalism Can Help

  • Conference paper
Social Robotics (ICSR 2021)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13086)

Included in the following conference series: International Conference on Social Robotics (ICSR)

Abstract

This paper argues for the need to develop emotion in social robots to enable them to become artificial moral agents. The paper considers four dimensions of this issue: what, why, which, and how. The main thesis is that we need to build not just emotional intelligence, but also ersatz emotions, in autonomous social robots. Moral sentimentalism and moral functionalism are employed as the theoretical models. However, this paper argues that the popularly endorsed moral sentiment of empathy is the wrong model to implement in social robots. In its stead, I propose the four moral sentiments (commiseration, shame/disgust, respect and deference, and the sense of right and wrong) in Confucian moral sentimentalism as our starting point for the top-down affective structure of robot design.
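To make the proposal concrete, the following is a purely illustrative sketch of what a top-down affective structure seeded with the four sentiments could look like. The situation features, appraisal rules, and every name in the code are invented placeholders, not anything specified in the paper.

```python
# Illustrative sketch only: a "top-down" affective layer in which four
# built-in Confucian moral sentiments each appraise a perceived situation.
# All features and rules below are assumptions made up for this example.
from typing import Callable, Dict

Situation = Dict[str, bool]  # e.g., {"someone_suffering": True}

# Each sentiment maps a perceived situation to a motivational signal in [0, 1].
SENTIMENTS: Dict[str, Callable[[Situation], float]] = {
    "commiseration":     lambda s: 1.0 if s.get("someone_suffering") else 0.0,
    "shame_disgust":     lambda s: 1.0 if s.get("own_act_shameful") else 0.0,
    "respect_deference": lambda s: 1.0 if s.get("elder_present") else 0.0,
    "sense_right_wrong": lambda s: 1.0 if s.get("norm_violated") else 0.0,
}

def moral_appraisal(situation: Situation) -> Dict[str, float]:
    """Top-down pass: every built-in sentiment evaluates the situation."""
    return {name: f(situation) for name, f in SENTIMENTS.items()}

if __name__ == "__main__":
    print(moral_appraisal({"someone_suffering": True}))
```

The point of the sketch is only structural: the sentiments are designed in from the top as fixed appraisers, rather than learned bottom-up from interaction data.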


Notes

  1. For example, Joseph Weizenbaum's computer program Eliza drew students into chatting with it; some even wanted to be alone with the program. This is called "the Eliza effect." Turkle also reports her own studies: "From 1997 to the present, I have conducted field research with these relational artifacts and also with Furbies, Aibos, My Real Babies, Paros, and Cog. What these machines have in common is that they display behaviors that make people feel as though they are dealing with sentient creatures that care about their presence" (Turkle 2018: 64). There is also a real-life example: since 2016, Georgia Tech has been employing an AI teaching-assistant program named Jill Watson, and some students even asked Jill out on a date. In 2019 a social-agent version of Jill Watson was introduced, and students engaged actively not only with Jill but also among themselves (Georgia Tech GVU Center News). These studies suggest that people are willing to engage with an artificial system, knowing full well that it is "artificial." However, we should also add that Gray and Wegner (2012) conducted a series of experiments based on Mori's (1970) "uncanny valley" theory and concluded that people feel scared and uneasy about robots that seem to have emotions. This problem may be resolved either by designing robots to be less humanlike, or in the future, when robots that can express emotions become commonplace.

  2. When Sony terminated maintenance support for AIBO in 2006, many owners were unable to let go, and a Japanese company even hosted a Buddhist farewell ceremony for AIBOs (White and Katsuno 2021).

  3. Affective Computing was originally published in 1997 and reprinted in 2000. Today there is a research group on emotional algorithms at the MIT Media Lab, and Picard is the leader of this group.

  4. "All living creatures between heaven and earth which have blood and breath must possess [understanding], and nothing that possesses [understanding] fails to love its own kind. If any of the animals or great birds happens to become separated from the herd or flock, though a month or a season may pass, it will invariably return to its old haunts, and when it passes its former home it will look about and cry, hesitate and drag its feet before it can bear to pass on…. Among creatures of blood and breath, none has greater understanding than man; therefore man ought to love his parents until the day he dies." ("A Discussion on Rites." Xunzi: Basic Writings. Watson 2003: 155).

  5. "The nature of man is such that he is born with a fondness for profit. If he indulges this fondness, it will lead him into wrangling and strife, and all sense of courtesy and humility will disappear. He is born with feelings of envy and hate, and if he indulges these, they will lead him into violence and crime, and all sense of loyalty and good faith will disappear." ("Human Nature is Evil." Xunzi: Basic Writings. Watson 2003: 226).

  6. "Man is born with the desires of the eyes and ears, with a fondness for beautiful sights and sounds. If he indulges these, they will lead him into license and wantonness, and all ritual principles and correct forms will be lost." ("Human Nature is Evil." Xunzi: Basic Writings. Watson 2003: 226). "Phenomena such as the eye's fondness for beautiful forms, the ear's fondness for beautiful sounds, the mouth's fondness for delicious flavors, the mind's fondness for profit, or the body's fondness for pleasure and ease—these are all products of the emotional nature of man. They are instinctive and spontaneous; man does not have to do anything to produce them." (Ibid. 231).

  7. Xunzi says, "The reason people despise [the tyrant] Jie, Robber Zhi, or the petty man is that they give free rein to their nature, follow their emotions, and are content to indulge their passions, so that their conduct is marked by greed and contentiousness. Therefore, it is clear that man's nature is evil, and that his goodness is the result of conscious activity." ("Human Nature is Evil." Xunzi: Basic Writings. Watson 2003: 237).

  8. According to Xunzi, "Man's emotions are very unlovely things indeed! What need is there to ask any further? Once a man acquires a wife and children, he no longer treats his parents as a filial son should. Once he succeeds in satisfying his cravings and desires, he neglects his duty to his friends. Once he has won a high position and a good stipend, he ceases to serve his sovereign with a loyal heart. Man's emotions, man's emotions—they are very unlovely things indeed!" ("Human Nature is Evil." Xunzi: Basic Writings. Watson 2003: 241).

  9. The "prototypical" forms of emotion are anger, disgust, fear, happiness, sadness, and surprise. Paul Ekman lists the seven "universal emotions" as anger, contempt, disgust, enjoyment, fear, sadness, and surprise, while Antonio Damasio's list also includes joy, shame, contempt, pride, compassion, admiration, and others. A universally accepted prototype set of natural emotions is unlikely to be reached. Using the Chinese categorization in this paper helps us better see the contrast between "natural emotions" and "moral sentiments" that dominated the discourse on human emotion in Chinese as well as Korean neo-Confucianism. More on this later.

  10. https://behavioralsignals.com/aboutus/.

  11. According to Breazeal's design philosophy, the emotions of robots are self-centered: these emotions prompt the robot to give positive or negative evaluations when encountering different environmental stimuli, and to adopt the appropriate behavioral responses to maintain its "well-being."
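A minimal sketch of such a self-centered appraisal loop, assuming a scalar "well-being" variable with a homeostatic setpoint; this illustrates the idea only and is not Breazeal's actual architecture (the valence table and all names are invented):

```python
# Sketch: stimuli receive positive or negative evaluations, and behavior is
# chosen to keep "well-being" near a homeostatic setpoint. Placeholder values.
WELL_BEING_SETPOINT = 0.0

def appraise(stimulus: str) -> float:
    """Return a positive or negative evaluation of an environmental stimulus."""
    valences = {"praise": +0.5, "toy": +0.3, "loud_noise": -0.7}  # placeholder table
    return valences.get(stimulus, 0.0)

def respond(well_being: float, stimulus: str):
    """Update well-being and pick a behavior that works to restore it."""
    well_being += appraise(stimulus)
    if well_being < WELL_BEING_SETPOINT:
        behavior = "withdraw_and_seek_comfort"  # corrective response
    else:
        behavior = "engage"                     # keep interacting
    return well_being, behavior

wb = 0.0
for s in ["toy", "loud_noise", "loud_noise"]:
    wb, act = respond(wb, s)
    print(f"{s}: well_being={wb:+.1f} -> {act}")
```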

  12. For instance, Leite et al. (2012) explain their strategies this way: "Currently, the empathic strategies implemented in the robot are the following:

      1. Encouraging comments, for example, "don't be sad, I believe you can still recover your disadvantage".

      2. Scaffolding, by providing feedback on the user's last move and, if the move is not good, let the user play again.

      3. Suggesting a good move for the user to play in his or her next turn.

      4. Intentionally playing a bad move, for example, playing a move that allows the user to capture an important piece of the robot."
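To make the quoted list concrete, here is a hypothetical sketch of how a robot might select among these four strategies during a turn-taking game. Leite et al. (2012) do not publish this selection logic, so GameState, pick_strategy, and the thresholds are all assumptions:

```python
# Hypothetical selector over the four quoted empathic strategies.
import random
from dataclasses import dataclass

@dataclass
class GameState:
    user_score: int
    robot_score: int
    last_move_quality: float  # 0.0 = blunder, 1.0 = best available move

def pick_strategy(state: GameState) -> str:
    deficit = state.robot_score - state.user_score  # how far behind the user is
    if state.last_move_quality < 0.3:
        return "scaffold: comment on the move and let the user play again"
    if deficit >= 3:
        return "intentionally play a bad move the user can exploit"
    if deficit >= 1:
        return random.choice([
            "encourage: 'don't be sad, you can still recover'",
            "suggest a good move for the user's next turn",
        ])
    return "no empathic intervention"

print(pick_strategy(GameState(user_score=2, robot_score=6, last_move_quality=0.8)))
```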

  13. Chinese Neo-Confucians of the Song-Ming era (11th–17th century) distinguished the moral value of the four moral sprouts from that of the seven natural emotions, attributing the former to "human moral essence" and the latter to "human emotion." This distinction was extensively discussed in the "Debate on the Distinction between the Four and the Seven" in Korean Confucianism. Scholars who particularly emphasize the moral distinction between the four sentiments and the seven emotions (such as the Chinese Confucians Zhu Xi and Wang Fuzhi, and the Korean Confucian Li Tuixi) argue that the seven emotions are natural emotions with no inherent value of good or evil, though their motivating force can promote ethical behavior; the four sentiments, in contrast, are purely good.

  14. Wang Yangming (1472–1529), for example, declared that the sense of right and wrong is nothing but the heart of like and dislike.

  15. As we have pointed out earlier, 'anger' could be the emotional basis for the moral sentiment of shame/disgust. Emotions are only "negative" when they are not moderate and balanced. Even a seemingly harmless emotion such as love can become a negative emotion when it is excessive and uncontrolled.

  16. Howard and Muntean (2017) define 'moral functionalism' as the theory that takes ethical properties to supervene on descriptive natural properties. On their version, "Moral functionalism adopted here emphasizes the role of the functional and behavioral nature of the moral agent: its decision, its output state, are functional in nature, individuated by its dependence on the input, the previous output (a form of "moral memory") and other, current, or previous, moral states" (Howard and Muntean 2017: 134). I shall adopt their version here.
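The functional dependence in the quotation can be illustrated with a minimal sketch in which the agent's output is a function of the current input, the previous output (the "moral memory"), and a further moral state; the concrete variables and update rule are invented for illustration, not Howard and Muntean's implementation:

```python
# Sketch: output = f(input, previous output, other moral states).
class MoralAgent:
    def __init__(self):
        self.previous_output = None   # the "moral memory"
        self.guilt = 0.0              # one example of a further moral state

    def decide(self, input_event: str) -> str:
        # Output is individuated by input, previous output, and moral state.
        if input_event == "someone_in_distress":
            output = "help"
        elif self.previous_output == "refuse" and self.guilt > 0.5:
            output = "make_amends"    # moral memory shapes the next decision
        else:
            output = "refuse"
        if output == "refuse":
            self.guilt += 0.3         # refusing raises guilt for later decisions
        self.previous_output = output
        return output

agent = MoralAgent()
print([agent.decide(e) for e in ["idle", "idle", "idle"]])
# -> ['refuse', 'refuse', 'make_amends']: accumulated guilt flips the third call
```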

  17. The "functionalism" they use is "causal-role functionalism": "The definition of mental state is partly based on its causal functional role in the psychological process" (Reisenzein et al. 2013: 248). In other words, certain mental states will trigger emotion x, and emotion x will in turn trigger another mental state y or drive behavior z. They also point out, however, that although the psychological literature generally agrees on the "causes" of emotions, there is less consensus on their "effects," that is, on how emotions lead to other emotions and behaviors.
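A toy rendering of this causal-role picture, with invented transition tables: each emotion is individuated by what triggers it (a mental state plus a stimulus) and by what it in turn triggers (a further mental state or a behavior):

```python
# Sketch of causal-role functionalism: emotions as nodes in a causal network.
TRIGGERS = {
    # (current mental state, stimulus) -> emotion x
    ("expecting_reward", "reward_denied"): "anger",
    ("sensing_danger", "danger_passed"):   "relief",
}

EFFECTS = {
    # emotion x -> (next mental state y, behavior z); as the footnote notes,
    # this "effects" side is where the literature has less consensus.
    "anger":  ("blaming", "protest"),
    "relief": ("calm", "relax"),
}

def transition(state: str, stimulus: str):
    emotion = TRIGGERS.get((state, stimulus))
    if emotion is None:
        return state, None, None          # no emotional episode triggered
    next_state, behavior = EFFECTS[emotion]
    return next_state, emotion, behavior

print(transition("expecting_reward", "reward_denied"))
# -> ('blaming', 'anger', 'protest')
```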

  18. They point out that, according to Strongman (2003), at least 150 theories of emotion have been proposed in the history of psychology and philosophy.

References

  • Allen, C., Wallach, W., Smit, I.: Why machine ethics? In: Anderson, M., Anderson, S.L. (eds.) Machine Ethics, pp. 51–61. Cambridge University Press, Cambridge (2018)

  • Anderson, M., Anderson, S.L. (eds.): Machine Ethics. Cambridge University Press, Cambridge (2018)

  • Breazeal, C.L.: Emotion and sociable humanoid robots. Int. J. Hum.-Comput. Stud. 59, 119–155 (2003)

  • Breazeal, C.L.: Designing Sociable Robots, 1st edn. The MIT Press, Cambridge (2002)

  • Calvo, R., D’Mello, S., Gratch, J., Kappas, A.: The Oxford Handbook of Affective Computing. Oxford University Press, Oxford (2015)

  • Chan, W.-T.: A Source Book in Chinese Philosophy. Princeton University Press, Princeton (1963)

  • Crawford, K.: Artificial intelligence is misreading human emotion. The Atlantic (2021)

  • Damasio, A., Carvalho, G.B.: The nature of feelings: evolutionary and neurobiological origins. Nat. Rev. Neurosci. 14(2), 143–152 (2013). https://doi.org/10.1038/nrn3403

  • Engelhart, K.: What robots can—and can’t—do for the old and lonely. The New Yorker (2021)

  • Gossett, S.: Emotion AI technology has great promise (when used responsibly) (2021). https://builtin.com/artificial-intelligence/emotion-ai. Accessed 2 Mar 2021

  • Gray, K., Wegner, D.M.: Feeling robots and human zombies: mind perception and the uncanny valley. Cognition 125, 125–130 (2012)

  • Howard, D., Muntean, I.: Artificial moral cognition: moral functionalism and autonomous moral agency. In: Powers, T.M. (ed.) Philosophy and Computing. PSS, vol. 128, pp. 121–159. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61043-6_7

  • Leben, D.: Ethics for Robots: How to Design a Moral Algorithm. Routledge, UK (2019)

  • Leite, I., Pereira, A., Mascarenhas, S., Martinho, C., Prada, R., Paiva, A.: The influence of empathy in human-robot relations. Int. J. Hum.-Comput. Stud. 71, 250–260 (2013)

  • Leite, I., Pereira, A., Castellano, G., Mascarenhas, S., Martinho, C., Paiva, A.: Modelling empathy in social robotic companions. In: Ardissono, L., Kuflik, T. (eds.) UMAP 2011. LNCS, vol. 7138, pp. 135–147. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-28509-7_14

  • Levenson, R.W.: Blood, sweat, and fears: the autonomic architecture of emotion. In: Ekman, P., Campos, J.J., Davidson, R.J., DeWaal, F.B.M. (eds.) Emotions Inside Out, vol. 1000, pp. 348–366. Annals of the New York Academy of Sciences, New York (2003)

  • Marsella, S., Gratch, J., Petta, P.: Computational models of emotion. In: Scherer, K.R., Bänziger, T., Roesch, E. (eds.) A Blueprint for Affective Computing: A Sourcebook, pp. 21–45. Oxford University Press, Oxford (2010)

  • Minsky, M.: The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind. Simon & Schuster. Reprint edition (2007)

  • Minsky, M.: The Society of Mind, 1st edn. Simon & Schuster (1988)

  • Mori, M.: The uncanny valley. Energy 7(4), 33–35 (1970). Originally published in Japanese. Authorized English translation by Karl F. MacDorman and Norri Kageki is available at IEEE site (https://spectrum.ieee.org/automaton/robotics/humanoids/the-uncanny-valley) (2012)

  • Nummenmaa, L., Glerean, E., Hari, R., Hietanen, J.K.: Bodily maps of emotions. Proc. Natl. Acad. Sci. 111(2), 646–651 (2014). https://doi.org/10.1073/pnas.1321664111

  • Picard, R.W., Vyzas, E., Healey, J.: Toward machine emotional intelligence: analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 23(10), 1175–1191 (2001)

  • Picard, R.W.: Affective Computing. The MIT Press, Reprint edition (2000)

  • Reisenzein, R., et al.: Computational modeling of emotion: toward improving the inter- and intradisciplinary exchange. IEEE Trans. Affect. Comput. 4(3), 246–266 (2013)

  • Rodogno, R.: Social robots, fiction, and sentimentality. Ethics Inf. Technol. 18, 257–268 (2016)

  • Slote, M.: A Sentimentalist Theory of the Mind. Oxford University Press, Oxford (2014)

  • Slote, M.: Moral Sentimentalism. Oxford University Press, Reprint edition (2013)

  • Sparrow, R.: The march of the robotic dogs. Ethics Inf. Technol. 4, 305–318 (2002)

  • Strongman, K.T.: The Psychology of Emotion: From Everyday Life to Theory, 5th edn. Wiley Publishing, Hoboken (2003)

  • Sugiyama, S., Vincent, J. (eds.): Social Robots and Emotion: Transcending the Boundary Between Humans and ICTs. Intervalla, vol. 1. Franklin University, Switzerland (2013). https://www.fus.edu/intervalla/volume-1-social-robots-and-emotion-transcending-the-boundary-between-humans-and-icts

  • Turkle, S.: Authenticity in the age of digital companions. In: Anderson, M., Anderson, S.L. (eds.) Machine Ethics, pp. 62–76. Cambridge University Press, Cambridge (2018). Originally published in Interaction Studies 8(3), 501–517 (2007)

  • Turkle, S., Taggart, W., Kidd, C.D., Dasté, O.: Relational artifacts with children and elders: the complexities of cybercompanionship. Connect. Sci. 18(4), 347–361 (2006)

  • Turkle, S.: Whither psychoanalysis in computer culture? Psychoanal. Psychol. 21(1), 16–30 (2004)

  • Turkle, S., Breazeal, C., Dasté, O., Scassellati, B.: Encounters with kismet and cog: children respond to relational artifacts (2004). https://www.researchgate.net/publication/251940996_Encounters_with_Kismet_and_Cog_Children_Respond_to_Relational_Artifacts

  • Vallverdú, J., Casacuberta, D.: Ethical and Technical Aspects of Emotions to Create Empathy in Medical Machines (2015)

  • Watson, B.: Xunzi: Basic Writings. Columbia University Press, New York (2003)

  • White, D., Katsuno, H.: Toward an affective sense of life: artificial intelligence, animacy, and amusement at a robot pet memorial service in Japan. Cult. Anthropol. 36(2) (2021). https://doi.org/10.1002/oarr.10000380.1


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Liu, J. (2021). Why We Need Emotional Intelligence in the Design of Autonomous Social Robots and How Confucian Moral Sentimentalism Can Help. In: Li, H., et al. (eds.) Social Robotics. ICSR 2021. Lecture Notes in Computer Science, vol. 13086. Springer, Cham. https://doi.org/10.1007/978-3-030-90525-5_20

  • DOI: https://doi.org/10.1007/978-3-030-90525-5_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90524-8

  • Online ISBN: 978-3-030-90525-5

  • eBook Packages: Computer Science, Computer Science (R0)
