ABSTRACT
Can we influence how a robot is perceived by designing the sound of its movement? Drawing from practices in film sound, we overlaid a video depicting a robot's movement routine with three types of artificial movement sound. In a between-subject study design, participants saw either one of the three designs or a quiet control condition and rated the robot's movement quality, safety, capability, and attractiveness. We found that, compared to our control, the sound designs both increased and decreased perceived movement quality. Coupling the same robotic movement with different sounds led to the motions being rated as more or less precise, elegant, jerky, or uncontrolled, among other qualities. We further found that the sound conditions decreased perceived safety and did not affect perceived capability and attractiveness. More unrealistic sound conditions led to larger differences in ratings, while the subtle addition of harmonic material was not rated differently from the control condition in any of the measures. Based on these findings, we discuss the challenges and opportunities of using artificial movement sound as an implicit channel of communication that may eventually be able to selectively target specific characteristics, helping designers create more refined and nuanced human-robot interactions.
SUPPLEMENTAL MATERIAL
This file contains the dataset and R commands used in the study "Smooth Operator: Tuning Robot Perception Through Artificial Movement Sound" by Robinson et al.