DOI: 10.1145/3409120.3410667
research-article

Sound Decisions: How Synthetic Motor Sounds Improve Autonomous Vehicle-Pedestrian Interactions

Published: 20 September 2020

Abstract

Electric vehicles’ (EVs) nearly silent operation has proved dangerous for bicyclists and pedestrians, who often use the sound of an internal combustion engine as one of many signals to locate nearby vehicles and predict their behavior. Inspired by regulations now being implemented that will require EVs and hybrid vehicles (HVs) to play synthetic sound, we conducted a naturalistic field study using a Wizard-of-Oz setup to explore how adding synthetic engine sound to a hybrid autonomous vehicle (AV) influences how pedestrians interact with it. Pedestrians reported higher interaction quality and greater clarity of the vehicle’s intent to yield compared to a baseline condition without added sound. These findings suggest that synthetic engine sound not only helps pedestrians hear EVs, but may also help AV developers implicitly signal to pedestrians when a vehicle will yield.




Published In

AutomotiveUI '20: 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
September 2020
300 pages
ISBN:9781450380652
DOI:10.1145/3409120
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Autonomous vehicles
  2. Driverless cars
  3. External human-machine interfaces
  4. Ghostdriver
  5. Implicit interaction
  6. Pedestrian interaction
  7. Sound design
  8. Wizard-of-Oz

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Robert Bosch, LLC

Conference

AutomotiveUI '20

Acceptance Rates

Overall Acceptance Rate 248 of 566 submissions, 44%


Cited By

  • (2024) Music Mode: Transforming Robot Movement into Music Increases Likability and Perceived Intelligence. ACM Transactions on Human-Robot Interaction 14:1 (1-23). https://doi.org/10.1145/3686811. Online publication date: 7-Aug-2024.
  • (2024) Move, Connect, Interact: Introducing a Design Space for Cross-Traffic Interaction. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8:3 (1-40). https://doi.org/10.1145/3678580. Online publication date: 9-Sep-2024.
  • (2024) External Speech Interface: Effects of Gendered and Aged Voices on Pedestrians' Acceptance of Autonomous Vehicles in Shared Spaces. Adjunct Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (190-196). https://doi.org/10.1145/3641308.3685046. Online publication date: 22-Sep-2024.
  • (2024) Five Years of Automated Shuttles: Surveying Community Experiences and Road Conflicts for Future Development. Adjunct Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (184-189). https://doi.org/10.1145/3641308.3685045. Online publication date: 22-Sep-2024.
  • (2024) Honkable Gestalts: Why Autonomous Vehicles Get Honked At. Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (317-328). https://doi.org/10.1145/3640792.3675732. Online publication date: 22-Sep-2024.
  • (2024) Multi-Modal eHMIs: The Relative Impact of Light and Sound in AV-Pedestrian Interaction. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (1-16). https://doi.org/10.1145/3613904.3642031. Online publication date: 11-May-2024.
  • (2024) Sound Matters: Auditory Detectability of Mobile Robots. 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (ROMAN) (2233-2239). https://doi.org/10.1109/RO-MAN60168.2024.10731238. Online publication date: 26-Aug-2024.
  • (2024) Are the External Human-Machine Interfaces (eHMI) Accessible for People with Disabilities? A Systematic Review. 2024 IEEE 4th International Conference on Human-Machine Systems (ICHMS) (1-6). https://doi.org/10.1109/ICHMS59971.2024.10555703. Online publication date: 15-May-2024.
  • (2024) Not Always Good: Mitigating Pedestrians’ Less Careful Crossing Behavior by External Human-Machine Interfaces on Automated Vehicles. International Journal of Human–Computer Interaction (1-13). https://doi.org/10.1080/10447318.2024.2352212. Online publication date: 19-Jun-2024.
  • (2024) A preliminary study to identify critical factors for evaluating the effect of car-lock sounds on drivers. Ergonomics (1-17). https://doi.org/10.1080/00140139.2024.2379953. Online publication date: 18-Jul-2024.
