Abstract
We studied two interaction techniques for performing secondary tasks in a driving simulator environment, with a focus on driving safety. In both techniques, the participants (N = 20) used gaze pointing to select virtual task buttons. Toggling the controls was achieved either by mid-air gestures with haptic feedback or by physical buttons located on the steering wheel. To evaluate each technique, we compared several measures, including mean task times, pedestrian detections, lane deviations, and task complexity ratings.
The results showed that both techniques allowed operation without severely compromising driving safety. However, interaction using gestures was rated as more complex, caused more fatigue and frustration, and led to longer delays in noticing pedestrians than the physical buttons. The results suggest that gaze pointing accuracy was not always sufficient, and that mid-air gestures require more robust recognition algorithms before they can offer functionality comparable to interaction with physical buttons.
Acknowledgements
This research was carried out as part of the Adaptive Multimodal In-Vehicle Interaction project (AMICI), which was funded by Business Finland (grant 1316/31/2021).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Spakov, O., Venesvirta, H., Lylykangas, J., Farooq, A., Raisamo, R., Surakka, V. (2023). Multimodal Gaze-Based Interaction in Cars: Are Mid-Air Gestures with Haptic Feedback Safer Than Buttons? In: Marcus, A., Rosenzweig, E., Soares, M.M. (eds.) Design, User Experience, and Usability. HCII 2023. Lecture Notes in Computer Science, vol. 14032. Springer, Cham. https://doi.org/10.1007/978-3-031-35702-2_24
DOI: https://doi.org/10.1007/978-3-031-35702-2_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-35701-5
Online ISBN: 978-3-031-35702-2