Implicit Communication through Distributed Sound Design: Exploring a New Modality in Human-Robot Interaction

Published: 01 April 2020
DOI: 10.1145/3371382.3377431

ABSTRACT

As robots find their way into homes, workplaces, and public spaces, rich and effective human-robot interaction will play an essential role in their success. While most sound-related research in the field of HRI focuses on speech and semantic-free utterances, the potential of sound as an implicit non-verbal channel of communication has only recently received attention and remains largely unexplored. This research will bring design approaches from the fields of sound design and spatial audio into the context of human-robot interaction to influence human perception of robot characteristics and refine non-verbal auditory communication. It will implement sound design systems into various physical robots and evaluate their effect through user studies. By developing design principles for the sonic augmentation of robots, we aim to provide the HRI community with new tools to enrich the way robots communicate with humans.


Published in

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020, 702 pages
ISBN: 9781450370578
DOI: 10.1145/3371382

Copyright © 2020 Owner/Author
                Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States

Qualifiers

• abstract

Acceptance Rates

Overall acceptance rate: 192 of 519 submissions (37%)
