ABSTRACT
As robots find their way into homes, workplaces, and public spaces, rich and effective human-robot interaction (HRI) will play an essential role in their success. While most sound-related research in HRI focuses on speech and semantic-free utterances, the potential of sound as an implicit, non-verbal channel of communication has only recently received attention and remains largely unexplored. This research will bring design approaches from the fields of sound design and spatial audio into the context of human-robot interaction, both to influence how humans perceive robot characteristics and to refine non-verbal auditory communication. It will implement sound design systems in various physical robots and evaluate their effects through user studies. By developing design principles for the sonic augmentation of robots, we aim to provide the HRI community with new tools to enrich the way robots communicate with humans.
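To make the idea of sonically augmenting a robot concrete, the sketch below illustrates one common approach from the movement-sonification literature: continuously mapping a robot's motion state to synthesis parameters of an external sound engine. This is a minimal illustration, not the system described in the abstract; the robot interface `read_joint_velocities` is a hypothetical placeholder, and the OSC address and port are assumed values for a synthesizer patch (e.g., in SuperCollider or Max) listening on the local machine.

```python
# Minimal sketch: map robot motion energy to pitch and amplitude of a
# continuous tone, sent via OSC to an external synthesis engine.
# Requires the python-osc package; the robot API call is hypothetical.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # assumed synth host/port


def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    x = max(in_lo, min(in_hi, x))
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)


def read_joint_velocities():
    """Hypothetical placeholder; replace with the robot platform's SDK."""
    return [0.0, 0.3, 1.2]  # joint velocities in rad/s (dummy values)


while True:
    velocities = read_joint_velocities()
    speed = max(abs(v) for v in velocities)           # overall motion energy
    pitch = map_range(speed, 0.0, 2.0, 100.0, 800.0)  # Hz
    amp = map_range(speed, 0.0, 2.0, 0.0, 0.8)        # linear gain
    client.send_message("/robot/sonification", [pitch, amp])
    time.sleep(0.02)  # ~50 Hz control-rate updates
```

In a study setting, the same mapping structure can be reused while swapping the sound design on the receiving synthesizer, which keeps the robot-side code unchanged across experimental conditions.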