DOI: 10.1145/3490355.3490361
Research Article

Effectiveness of Multimodal Display in Navigation Situation

Published: 07 February 2022

Abstract

With the development of in-vehicle technology, the interactive experience in the car has become richer, and drivers have to process a large amount of information. In navigation situations, drivers need to receive navigation information in order to carry out the correct driving maneuvers, which may increase their workload. Current displays include visual, auditory, and haptic displays, but each has limitations. To address these problems and offer further suggestions for driving safety, this research explored the effectiveness of a multimodal display, and its advantages over unimodal displays, in navigation situations; the dependent variables were driving behavior performance, subjective mental workload, and eye-tracking behavior. We first conducted interviews to examine current navigation situations and found that lane changing and turning impose a higher workload than straight driving. A simulated driving experiment was then conducted with a 2 × 5 mixed design: the between-subjects factor was navigation situation (high load, low load), and the within-subjects factor was information display method (visual, auditory, haptic, multimodal, control). Eighteen participants were recruited and randomly divided into two groups, each experiencing every information display in turn. The results show that the multimodal display performs better than the visual modality under high-load situations, and that drivers' lateral speed control is more stable under the multimodal condition. Although drivers' mental workload under the multimodal condition did not differ significantly from the other conditions, its scores were still lower. The multimodal display therefore has the potential to support driving safety, but further research is needed.
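As a reading aid, the sketch below illustrates how a 2 × 5 mixed design of this kind is commonly analyzed. It is not the authors' analysis code: the data are simulated placeholders, the column names (participant, situation, display, workload) are hypothetical, and the choice of the pandas/pingouin packages is an assumption for illustration only.

```python
# A minimal sketch (not the authors' code) of a 2 (between) x 5 (within)
# mixed-design ANOVA, matching the experiment structure described in the
# abstract. All data and column names below are hypothetical placeholders.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
participants = [f"P{i:02d}" for i in range(1, 19)]            # 18 participants
situations = {p: ("high" if i < 9 else "low")                  # between-subjects factor
              for i, p in enumerate(participants)}
displays = ["visual", "auditory", "haptic", "multimodal", "control"]  # within-subjects factor

# Simulated subjective workload ratings used purely as placeholder data.
rows = [{"participant": p,
         "situation": situations[p],
         "display": d,
         "workload": rng.normal(50, 10)}
        for p in participants for d in displays]
df = pd.DataFrame(rows)

# 2 (situation, between) x 5 (display, within) mixed ANOVA on workload.
aov = pg.mixed_anova(data=df, dv="workload", within="display",
                     subject="participant", between="situation")
print(aov.round(3))
```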


Cited By

  • (2024) Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization. Computer Graphics Forum, 43(3). https://doi.org/10.1111/cgf.15114 (online: 10 June 2024)
  • (2024) The Effect of Individual-Level Factors and Task Features on Interface Design for Rule-Verification Crowdsourcing Tasks. International Journal of Human–Computer Interaction, 1–28. https://doi.org/10.1080/10447318.2024.2332031 (online: 16 April 2024)
  • (2023) Beyond Car Human-Machine Interface (HMI): Mapping Six Intelligent Modes into Future Cockpit Scenarios. Design, User Experience, and Usability, 75–83. https://doi.org/10.1007/978-3-031-35696-4_6 (online: 23 July 2023)
  • (2022) Designing an Interactive Communication Assistance System for Hearing-Impaired College Students Based on Gesture Recognition and Representation. Future Internet, 14(7), 198. https://doi.org/10.3390/fi14070198 (online: 29 June 2022)


            Published In

            Chinese CHI '21: Proceedings of the Ninth International Symposium of Chinese CHI
            October 2021
            166 pages
            ISBN:9781450386951
            DOI:10.1145/3490355

            Publisher

            Association for Computing Machinery

            New York, NY, United States


            Funding Sources

            • Beijing Normal University

            Conference

            Chinese CHI 2021

            Acceptance Rates

            Overall Acceptance Rate 17 of 40 submissions, 43%
