MovEcho: A Gesture-Sound Interface Allowing Blind Manipulations in a Driving Context

Published: 20 August 2021

Abstract

Most recent vehicles are equipped with touchscreens, which replace arrays of buttons that control secondary driving functions, such as temperature level, strength of ventilation, GPS, or choice of radio stations. Manipulating such interfaces while driving can be problematic in terms of safety, because they require the driver's sight. In this article, we develop an innovative interface, MovEcho, which is piloted with gestures and associated with sounds that are used as informational feedback. We compare this interface to a touchscreen in a perceptual experiment that took place in a driving simulator. The results show that MovEcho allows for better completion of a visual task related to traffic and is preferred by the participants. These promising results in a simulator condition have to be confirmed in future studies, in a real vehicle and with comparable expertise for both interfaces.
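The core idea described above, gestures that manipulate a virtual object whose state is reported back to the driver through sound rather than vision, can be sketched as a mapping from hand kinematics to sound-feedback parameters. The sketch below is purely illustrative and is not the authors' implementation: the gesture fields, the position-to-pitch and velocity-to-loudness mappings, and all names are assumptions made for this example.

```python
# Minimal illustrative sketch (not the MovEcho implementation): mapping a
# mid-air hand gesture to sound-feedback parameters for an eyes-free control,
# e.g. setting ventilation strength without looking at a screen.
from dataclasses import dataclass


@dataclass
class GestureSample:
    x: float   # lateral hand position, normalized to [-1, 1] (assumed)
    vx: float  # lateral hand velocity, normalized to [-1, 1] (assumed)


def gesture_to_sound(sample: GestureSample) -> dict:
    """Map hand position/velocity to feedback-sound parameters.

    Position controls pitch (so the driver can hear *where* the virtual
    object is); velocity controls loudness (so fast gestures sound more
    energetic). Both mappings are assumptions, not the paper's design.
    """
    base_pitch_hz = 440.0
    pitch_hz = base_pitch_hz * (2.0 ** sample.x)  # +/- one octave across range
    loudness = min(1.0, abs(sample.vx))           # clamp to [0, 1]
    return {"pitch_hz": pitch_hz, "loudness": loudness}


if __name__ == "__main__":
    # A slow gesture near the left edge vs. a fast one toward the right.
    print(gesture_to_sound(GestureSample(x=-0.8, vx=0.2)))
    print(gesture_to_sound(GestureSample(x=0.6, vx=0.9)))
```

In a real system the output dictionary would drive a continuous sound synthesizer rather than be printed; the point of the sketch is only that the auditory feedback is a deterministic function of the gesture state, which is what allows "blind" manipulation.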


Cited By

  • (2022) Eyes-Off Your Fingers: Gradual Surface Haptic Feedback Improves Eyes-Free Touchscreen Interaction. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–10. https://doi.org/10.1145/3491102.3501872. Online publication date: 29 April 2022.


Published In

ACM Transactions on Applied Perception, Volume 18, Issue 3 (July 2021), 148 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/3467015
© 2021 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 20 August 2021
Accepted: 01 May 2021
Revised: 01 April 2021
Received: 01 July 2020
Published in TAP Volume 18, Issue 3


Author Tags

  • Multisensory perception
  • Cognitive load
  • Virtual reality

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • French National Research Agency under the SoniMove project
  • Openlab PSA-AMU "Automotive Motion Lab", part of the StelLab network

