
Multimodal Gaze-Based Interaction in Cars: Are Mid-Air Gestures with Haptic Feedback Safer Than Buttons?

  • Conference paper
Design, User Experience, and Usability (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14032)


Abstract

We studied two interaction techniques for performing secondary tasks in a driving simulator, with a focus on driving safety. In both techniques, the participants (N = 20) used gaze pointing to select virtual task buttons. The selected control was then toggled either by a mid-air gesture with haptic feedback or by a physical button located on the steering wheel. To evaluate the techniques, we compared several measures, including mean task times, pedestrian detections, lane deviations, and task complexity ratings.
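
The paper page itself contains no code, but the gaze-point-then-confirm pattern described above is easy to illustrate. The sketch below is a minimal, hypothetical rendering of that selection logic: the gaze position picks the virtual button, and a confirmation event from either modality (a recognized mid-air gesture or a steering-wheel button press) toggles it. All names, geometry, and thresholds are invented for illustration and are not taken from the study.

```python
from dataclasses import dataclass

# Hypothetical sketch of the gaze-point-then-confirm pattern described in
# the abstract. Names, geometry, and thresholds are invented here; this is
# not the study's implementation.

@dataclass
class Control:
    name: str
    x: float             # centre of the virtual button on the display
    y: float
    radius: float        # acceptance radius around the centre
    state: bool = False  # current on/off state of the control

def gaze_target(controls, gx, gy):
    """Return the control the gaze point (gx, gy) currently falls on, if any."""
    for c in controls:
        if (gx - c.x) ** 2 + (gy - c.y) ** 2 <= c.radius ** 2:
            return c
    return None

def on_confirm(controls, gx, gy):
    """Handle a confirmation event from either modality: a recognized
    mid-air gesture or a steering-wheel button press. The gaze-selected
    control, if any, is toggled."""
    target = gaze_target(controls, gx, gy)
    if target is not None:
        target.state = not target.state
    return target

controls = [Control("fan", 100, 50, 40), Control("ac", 200, 50, 40)]
toggled = on_confirm(controls, 105, 60)  # gaze near "fan" + a confirm event
print(toggled)  # Control(name='fan', x=100, y=50, radius=40, state=True)
```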

The results showed that both techniques allowed operation without severely compromising driving safety. However, gesture interaction was rated as more complex, caused more fatigue and frustration, and led to longer delays in noticing pedestrians than interaction with the physical buttons. The results suggest that gaze pointing accuracy was not always sufficient, and that mid-air gestures require more robust recognition algorithms before they can offer functionality comparable to interaction with physical buttons.
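
As a rough illustration of how the driving-safety measures named in the abstract are typically derived from simulator logs, the sketch below computes mean task time, the standard deviation of lateral position (a common lane-deviation measure), and pedestrian-detection delay. The function names and numbers are hypothetical, not the study's own analysis pipeline.

```python
import statistics

# Hypothetical post-processing of the measure types listed in the abstract.
# Field names and data are invented for illustration only.

def mean_task_time(task_log):
    """Mean completion time in seconds over (start, end) timestamp pairs."""
    return statistics.mean(end - start for start, end in task_log)

def sdlp(lateral_positions):
    """Standard deviation of lateral position: a common lane-keeping
    (lane-deviation) measure computed from simulator samples in metres."""
    return statistics.stdev(lateral_positions)

def detection_delay(appear_time, response_time):
    """Delay between a pedestrian appearing and the driver's response."""
    return response_time - appear_time

# Example: comparing the two techniques on pedestrian-detection delay.
gesture_delays = [detection_delay(t0, t1) for t0, t1 in [(0.0, 1.2), (5.0, 6.5)]]
button_delays = [detection_delay(t0, t1) for t0, t1 in [(0.0, 0.9), (5.0, 6.0)]]
print(statistics.mean(gesture_delays) - statistics.mean(button_delays))  # positive => gestures slower
```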



Acknowledgements

This research was carried out as part of the Adaptive Multimodal In-Vehicle Interaction project (AMICI), which was funded by Business Finland (grant 1316/31/2021).

Author information


Corresponding author

Correspondence to Ahmed Farooq.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Spakov, O., Venesvirta, H., Lylykangas, J., Farooq, A., Raisamo, R., Surakka, V. (2023). Multimodal Gaze-Based Interaction in Cars: Are Mid-Air Gestures with Haptic Feedback Safer Than Buttons?. In: Marcus, A., Rosenzweig, E., Soares, M.M. (eds) Design, User Experience, and Usability. HCII 2023. Lecture Notes in Computer Science, vol 14032. Springer, Cham. https://doi.org/10.1007/978-3-031-35702-2_24


  • DOI: https://doi.org/10.1007/978-3-031-35702-2_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35701-5

  • Online ISBN: 978-3-031-35702-2

  • eBook Packages: Computer Science, Computer Science (R0)
