Abstract
Human-robot interaction (HRI) can be made considerably more immersive through the use of sound. To explore this, a study was carried out to identify which types of sounds are most suitable for HRI. Participants watched a video in which a social robot navigates an environment and expresses six basic emotions in specific situations, and then generated a sound for each emotion the robot displayed. Results revealed a preference for human-like sounds, especially onomatopoeia. Moreover, most participants judged the sounds produced by current robots to be inappropriate and lacking in empathy, primarily because of their resemblance to machine-like noises. The study provides specific insights into the perception of auditory expressions that can help designers create inclusive and emotionally compelling HRI, and the results underscore the importance of giving social robots more human-like auditory features.
Acknowledgments
Special thanks to the MonarCH project (Multi-Robot Cognitive Systems Operating in Hospitals) for granting permission to utilize their robotic platform. The project, funded under the reference FP7-ICT-2011-9-601033, has been instrumental in facilitating our exploration of human-robot interactions. More information about the MonarCH project can be found at https://welcome.isr.tecnico.ulisboa.pt/projects/multi-robot-cognitive-systems-operating-in-hospitals/.
Furthermore, this study received support from UNIDCOM/IADE under a grant from the Fundação para a Ciência e Tecnologia (FCT), with the reference UIDB/00711/2020 attributed to UNIDCOM/IADE - Unidade de Investigação em Design e Comunicação, Lisbon, Portugal.
Ethics declarations
Disclosure of Interests
The authors have no competing interests to declare that are relevant to the content of this article.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Andrade Pires, G., Tsvetcoff, R., Ayanoglu, H., Duarte, E. (2024). Emotive Acoustics: Sound Design in Robotic Emotion Expression. A Study on Participant Generated Sounds. In: Kurosu, M., Hashizume, A. (eds) Human-Computer Interaction. HCII 2024. Lecture Notes in Computer Science, vol 14685. Springer, Cham. https://doi.org/10.1007/978-3-031-60412-6_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-60411-9
Online ISBN: 978-3-031-60412-6
eBook Packages: Computer Science, Computer Science (R0)