Expanded linear dynamic affect-expression model for lingering emotional expression in social robot

  • Original Research Paper
  • Published:
Intelligent Service Robotics

Abstract

The service robot market is growing, and robots are replacing humans in many service-industry jobs. As people increasingly treat robots emotionally and socially, designing robot emotions that improve user satisfaction in human–robot interaction (HRI) has become crucial. Yet despite the importance of emotional expression, many robots respond only to the current stimulus when expressing emotions. Just as humans feel the lingering effects of a strong stimulus after it has passed, social robots could do the same. For example, if a user hits a robot and the robot becomes very angry, the anger will not dissipate for some time even if the user then praises the robot to soothe it. In this study, we propose the expanded linear dynamic affect-expression model (e-LDAEM) for expressing lingering emotion. In the e-LDAEM, stimuli of different intensities produce different dynamics even when they elicit the same emotion: the viscosity matrix that governs the emotion dynamics is positively correlated with the stimulus intensity vector. The model also enables robots with diverse personalities, since the lingering emotions can be tuned to suit a robot’s form or situation. User evaluation confirmed that, with the e-LDAEM, the robot’s emotion persists longer after a strong stimulus. We therefore expect the e-LDAEM to strengthen the emotional bond between humans and robots.
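The core idea in the abstract — a viscosity term that grows with stimulus intensity, so that strongly elicited emotions return to neutral more slowly — can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, state representation (a 2D valence–arousal point), and the specific coupling `c = 0.5 + 10.0 * intensity` are all illustrative assumptions, chosen only to show how an intensity-dependent damping term produces lingering emotion in second-order dynamics.

```python
import numpy as np

def residual_emotion(intensity, steps=200, dt=0.05):
    """Simulate the decay of an emotional state after a stimulus.

    Hypothetical sketch: the emotion state (a point in a
    valence-arousal plane) is pulled back toward neutral by
    spring-damper dynamics, and the damping ("viscosity") grows
    with the intensity of the stimulus that elicited the emotion.
    Returns the distance from neutral after the simulated interval.
    """
    e = np.array([1.0, 0.0])    # emotion state right after the stimulus
    v = np.zeros(2)             # rate of change of the emotion state
    k = 1.0                     # restoring "spring" toward neutral
    c = 0.5 + 10.0 * intensity  # viscosity correlates with intensity
    for _ in range(steps):
        a = -c * v - k * e      # overdamped return to neutral
        v += a * dt
        e += v * dt
    return float(np.linalg.norm(e))
```

Under these assumed parameters, a strong stimulus (`intensity=1.0`) leaves a noticeably larger residual emotion after the same interval than a weak one (`intensity=0.0`), which is the qualitative behavior the abstract attributes to the positive correlation between the viscosity matrix and the intensity vector.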





Funding

This work was partly supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2020R1F1A1066397) and by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea government (MOTIE) (P0012725, The Competency Development Program for Industry Specialist).

Author information


Corresponding author

Correspondence to Hui Sung Lee.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (mp4 77391 KB)

Appendix A: Questionnaire


Table 4 Questionnaire

See Table 4.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Park, H., Lee, J., Dzhoroev, T. et al. Expanded linear dynamic affect-expression model for lingering emotional expression in social robot. Intel Serv Robotics 16, 619–631 (2023). https://doi.org/10.1007/s11370-023-00483-5

