
Affect Display Recognition Through Tactile and Visual Stimuli in a Social Robot

  • Conference paper
Social Robotics (ICSR 2022)

Abstract

New technologies are now an important part of human communication and interaction. While text, facial, and voice recognition have become increasingly fluid in recent years thanks to advances in machine learning, recognising and expressing sensations or moods through multimodal recognition remains a field the literature could explore further. This poses a new challenge for social robots. In this work, the authors study how a combination of visual and tactile stimuli influences people’s perception of affect display and seek to apply these findings to a social robot. In the experiments, subjects reported the valence and arousal they perceived while being exposed to both stimuli simultaneously. The analysis revealed that the combination of touch and facial expression significantly influences the valence and arousal perceived by users. Based on these findings, this work includes an application that lets the robot determine the user’s affect display in real time.
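As a rough illustration only (not the authors' implementation), the sketch below shows one way a facial-expression estimate and a touch-gesture estimate could be fused into a single point in Russell's valence-arousal space [13]. The cue-to-affect mappings, the modality weight, and all names are hypothetical assumptions.

```python
# Illustrative sketch: fuse two modality estimates into valence-arousal coordinates.
# The mappings and weights below are assumptions, not the values used in the paper.
from dataclasses import dataclass


@dataclass
class Affect:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm)     .. +1 (excited)


# Hypothetical mappings from recognised cues to the circumplex space [13].
FACE_TO_AFFECT = {
    "happy":    Affect(+0.8, +0.5),
    "sad":      Affect(-0.7, -0.4),
    "anger":    Affect(-0.6, +0.7),
    "surprise": Affect(+0.2, +0.8),
    "neutral":  Affect(0.0, 0.0),
}
TOUCH_TO_AFFECT = {
    "stroke": Affect(+0.6, -0.2),
    "tickle": Affect(+0.4, +0.6),
    "slap":   Affect(-0.8, +0.8),
    "hold":   Affect(+0.3, -0.5),
}


def fuse(face: str, touch: str, w_face: float = 0.5) -> Affect:
    """Weighted average of the two modality estimates (touch gets 1 - w_face)."""
    f, t = FACE_TO_AFFECT[face], TOUCH_TO_AFFECT[touch]
    w_touch = 1.0 - w_face
    return Affect(w_face * f.valence + w_touch * t.valence,
                  w_face * f.arousal + w_touch * t.arousal)


if __name__ == "__main__":
    # Conflicting cues (smiling face, aggressive touch) land between the two estimates.
    print(fuse("happy", "slap"))
```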

Notes

  1. In this work, we use the definition of affect display introduced by Yohanan et al. [20]. The authors acknowledge that such a display can be faked, but those nuances are beyond the scope of this paper.

  2. Emotion recognition network: https://docs.openvinotoolkit.org/latest/_models_intel_emotions_recognition_retail_0003_description_emotions_recognition_retail_0003.html (a usage sketch follows these notes).

  3. Working example video: https://youtu.be/jrv8bY0ssUI.
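In connection with note 2, the following is a minimal sketch, assuming the emotions-recognition-retail-0003 model has been downloaded locally as an OpenVINO IR file, of how such a network can be queried from the OpenVINO Runtime Python API. The file paths are placeholders and this is not the paper's code.

```python
# Minimal sketch (not the authors' code): run emotions-recognition-retail-0003
# with the OpenVINO Runtime Python API on a pre-cropped face image.
# Paths below are assumed; the model can be fetched with the Open Model Zoo tools.
import cv2
import numpy as np
from openvino.runtime import Core

LABELS = ("neutral", "happy", "sad", "surprise", "anger")  # documented output order

core = Core()
model = core.read_model("emotions-recognition-retail-0003.xml")  # assumed local path
compiled = core.compile_model(model, "CPU")

face = cv2.imread("face_crop.jpg")  # BGR crop of an already-detected face (assumed file)
# Model expects a 1x3x64x64 blob in B, G, R channel order.
blob = cv2.resize(face, (64, 64)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)

scores = compiled([blob])[compiled.output(0)].squeeze()  # shape (5,), softmax scores
print(LABELS[int(np.argmax(scores))], scores)
```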

References

  1. Altun, K., MacLean, K.E.: Recognizing affect in human touch of a robot. Pattern Recogn. Lett. 66, 31–40 (2015)

  2. Andreasson, R., Alenljung, B., Billing, E., Lowe, R.: Affective touch in human-robot interaction: conveying emotion to the NAO robot. Int. J. Soc. Robot. 10(4), 473–491 (2018)

  3. Beale, R., Peter, C.: The role of affect and emotion in HCI. In: Peter, C., Beale, R. (eds.) Affect and Emotion in Human-Computer Interaction. LNCS, vol. 4868, pp. 1–11. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-85099-1_1

  4. Breazeal, C., Aryananda, L.: Recognition of affective communicative intent in robot-directed speech. Auton. Robot. 12(1), 83–104 (2002)

  5. Calvo, M.G., Lundqvist, D.: Facial expressions of emotion (KDEF): identification under different display-duration conditions. Behav. Res. Methods 40(1), 109–115 (2008)

  6. Diekhof, E.K., Kipshagen, H.E., Falkai, P., Dechent, P., Baudewig, J., Gruber, O.: The power of imagination: how anticipatory mental imagery alters perceptual processing of fearful facial expressions. Neuroimage 54(2), 1703–1714 (2011)

  7. Ekman, P.: Basic emotions. In: Handbook of Cognition and Emotion, pp. 45–60 (1999)

  8. Gamboa-Montero, J.J., Alonso-Martin, F., Castillo, J.C., Malfaz, M., Salichs, M.A.: Detecting, locating and recognising human touches in social robots with contact microphones. Eng. Appl. Artif. Intell. 92, 103670 (2020)

  9. Gobron, S., Ahn, J., Paltoglou, G., Thelwall, M., Thalmann, D.: From sentence to emotion: a real-time three-dimensional graphics metaphor of emotions extracted from text. Vis. Comput. 26(6), 505–519 (2010)

  10. Henschel, A., Laban, G., Cross, E.S.: What makes a robot social? A review of social robots from science fiction to a home or hospital near you. Curr. Robot. Rep. 2(1), 9–19 (2021)

  11. Huang, Y., Yang, J., Liu, S., Pan, J.: Combining facial expressions and electroencephalography to enhance emotion recognition. Future Internet 11(5), 105 (2019)

  12. Paltoglou, G., Thelwall, M.: Seeing stars of valence and arousal in blog posts. IEEE Trans. Affect. Comput. 4(1), 116–123 (2012)

  13. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161 (1980)

  14. Salichs, M.A., et al.: Mini: a new social robot for the elderly. Int. J. Soc. Robot. 12(6), 1231–1249 (2020)

  15. Shapiro, S.S., Wilk, M.B.: An analysis of variance test for normality (complete samples). Biometrika 52(3–4), 591–611 (1965)

  16. Silvera-Tawil, D., Rye, D., Velonaki, M.: Interpretation of social touch on an artificial arm covered with an EIT-based sensitive skin. Int. J. Soc. Robot. 6(4), 489–505 (2014)

  17. Teyssier, M., Bailly, G., Pelachaud, C., Lecolinet, E.: Conveying emotions through device-initiated touch. IEEE Trans. Affect. Comput. 13, 1477–1488 (2020)

  18. Tsalamlal, M.Y., Amorim, M.A., Martin, J.C., Ammi, M.: Combining facial expression and touch for perceiving emotional valence. IEEE Trans. Affect. Comput. 9(4), 437–449 (2016)

  19. Vasconcelos, M., Dias, M., Soares, A.P., Pinheiro, A.P.: What is the melody of that voice? Probing unbiased recognition accuracy with the Montreal Affective Voices. J. Nonverbal Behav. 41(3), 239–267 (2017)

  20. Yohanan, S., MacLean, K.E.: The role of affective touch in human-robot interaction: human intent and expectations in touching the haptic creature. Int. J. Soc. Robot. 4(2), 163–180 (2012)


Acknowledgements

The research leading to these results has received funding from the projects: Robots Sociales para Estimulación Física, Cognitiva y Afectiva de Mayores (ROSES), RTI2018-096338-B-I00, funded by the Ministerio de Ciencia, Innovación y Universidades; Robots sociales para mitigar la soledad y el aislamiento en mayores (SOROLI), PID2021-123941OA-I00, funded by Agencia Estatal de Investigación (AEI), Spanish Ministerio de Ciencia e Innovación; the project PLEC2021-007819, funded by MCIN/AEI/10.13039/501100011033 and by the European Union NextGenerationEU/PRTR, and RoboCity2030-DIH-CM, Madrid Robotics Digital Innovation Hub, S2018/NMT-4331, funded by “Programas de Actividades I+D en la Comunidad de Madrid” and cofunded by the European Social Funds (FSE) of the EU.

Author information

Correspondence to Sara Marques-Villarroya.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Marques-Villarroya, S., Gamboa-Montero, J.J., Jumela-Yedra, C., Castillo, J.C., Salichs, M.A. (2022). Affect Display Recognition Through Tactile and Visual Stimuli in a Social Robot. In: Cavallo, F., et al. Social Robotics. ICSR 2022. Lecture Notes in Computer Science, vol. 13817. Springer, Cham. https://doi.org/10.1007/978-3-031-24667-8_12


  • DOI: https://doi.org/10.1007/978-3-031-24667-8_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-24666-1

  • Online ISBN: 978-3-031-24667-8

  • eBook Packages: Computer Science, Computer Science (R0)
