An Emotional Support Robot Framework Using Emotion Recognition as Nonverbal Communication for Human-Robot Co-adaptation

  • Conference paper
Proceedings of the Future Technologies Conference (FTC) 2022, Volume 3 (FTC 2022)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 561)

Abstract

Human emotion is an essential nonverbal communication tool. Granting machines the ability to perceive it would significantly improve our communication with technology, giving us a more natural experience when interacting with machines. Software systems should be able to adapt to such nonverbal cues. Our research focuses on incorporating human emotions into co-adaptive software systems, and specifically on how emotionally aware systems should react to human emotions. One of the numerous application areas for this promising technology is affective robotics. In this paper, we propose a framework for a co-adaptive emotional support robot. The framework adopts facial expression recognition as its main method for detecting emotions. This human-centric framework also places a strong emphasis on personalizing the user experience: we adopt a personalized emotion recognition approach, since not all humans express emotions in the same way, and we personalize the system's adaptive reactions through a reinforcement learning approach in which the system assesses its own actions.
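The abstract's core loop — detect an emotion, choose a supportive action, and let the system assess its own actions to personalize future reactions — can be sketched as a small tabular reinforcement-learning policy. This is an illustrative sketch, not the authors' implementation: the emotion labels, action set, and reward signal below are all hypothetical assumptions chosen to make the idea concrete.

```python
import random

# Illustrative sketch: the robot treats the detected emotion as state and
# learns, per user, which supportive action improves that user's mood.
EMOTIONS = ["sad", "angry", "neutral"]
ACTIONS = ["play_music", "tell_joke", "offer_silence"]

class SupportPolicy:
    """Epsilon-greedy tabular learner over (emotion, action) pairs for one user."""

    def __init__(self, epsilon=0.1, alpha=0.5, seed=0):
        self.q = {(e, a): 0.0 for e in EMOTIONS for a in ACTIONS}
        self.epsilon = epsilon   # exploration rate
        self.alpha = alpha       # learning rate
        self.rng = random.Random(seed)

    def act(self, emotion):
        # Mostly exploit the best-known action; sometimes explore.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[(emotion, a)])

    def update(self, emotion, action, reward):
        # Reward stands in for the observed change in the user's emotional
        # state after the action — the system "assessing its own actions".
        key = (emotion, action)
        self.q[key] += self.alpha * (reward - self.q[key])

def simulated_user(emotion, action):
    # Hypothetical user preferences: music helps when sad, silence when angry.
    prefs = {"sad": "play_music", "angry": "offer_silence", "neutral": "tell_joke"}
    return 1.0 if action == prefs[emotion] else -0.2

policy = SupportPolicy()
for _ in range(300):
    emotion = policy.rng.choice(EMOTIONS)   # stand-in for facial-expression recognition
    action = policy.act(emotion)
    policy.update(emotion, action, simulated_user(emotion, action))

best = {e: max(ACTIONS, key=lambda a: policy.q[(e, a)]) for e in EMOTIONS}
print(best)
```

Because the policy table is kept per user, two users who express the same emotion can end up with different learned reactions, which mirrors the personalization the framework emphasizes.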

Acknowledgment

This work was supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia [Project No. GRANT669].

Author information

Correspondence to Osamah M. Al-Omair.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Al-Omair, O.M., Huang, S. (2023). An Emotional Support Robot Framework Using Emotion Recognition as Nonverbal Communication for Human-Robot Co-adaptation. In: Arai, K. (eds) Proceedings of the Future Technologies Conference (FTC) 2022, Volume 3. FTC 2022 2022. Lecture Notes in Networks and Systems, vol 561. Springer, Cham. https://doi.org/10.1007/978-3-031-18344-7_30