
Interactive gaze and finger controlled HUD for cars

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Modern infotainment systems in automobiles facilitate driving, but at the cost of secondary tasks performed alongside the primary task of driving. These secondary tasks can distract the driver from the primary driving task, reducing safety and increasing cognitive workload. This paper presents an intelligent interactive head-up display (HUD) on the windscreen that does not require the driver to take their eyes off the road while undertaking secondary tasks such as playing music, operating vent controls, or watching a navigation map. The interactive HUD supports pointing and selection just like a traditional graphical user interface, but by tracking the operator's eye gaze or finger movements. Additionally, the system can estimate the driver's cognitive load and level of distraction. User studies show that the system improves driving performance, measured as mean deviation from the lane in an ISO 26022 lane-change task, compared to a touchscreen system, and that participants completed ISO 9241 pointing tasks in less than 2 s on average inside a Toyota Etios car.
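The pointing-and-selection idea described in the abstract can be sketched in a few lines of code. The snippet below is a minimal, hypothetical illustration rather than the authors' implementation: it assumes a tracker that already delivers (x, y) pointer samples in HUD screen coordinates, whether those samples come from eye-gaze or finger tracking, and it uses a dwell-time rule as one possible selection trigger (the actual system may confirm selections differently). A small helper for the mean-deviation-from-lane measure reported for the ISO 26022 lane-change evaluation is included as well. All names (HudTarget, dwell_select, mean_lane_deviation) and parameter values are illustrative assumptions.

    import statistics
    from dataclasses import dataclass

    # Hypothetical rectangular HUD target (music, vent control, navigation map, ...).
    @dataclass
    class HudTarget:
        name: str
        x: float       # left edge in HUD screen coordinates (pixels)
        y: float       # top edge
        width: float
        height: float

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

    def dwell_select(pointer_samples, targets, dwell_time_s=1.0, sample_rate_hz=30):
        # pointer_samples: iterable of (x, y) positions from the gaze or finger tracker,
        # already projected onto HUD coordinates. A target counts as selected once the
        # pointer stays inside it, uninterrupted, for dwell_time_s seconds.
        needed = int(dwell_time_s * sample_rate_hz)
        counts = {t.name: 0 for t in targets}
        for px, py in pointer_samples:
            for t in targets:
                if t.contains(px, py):
                    counts[t.name] += 1
                    if counts[t.name] >= needed:
                        return t
                else:
                    counts[t.name] = 0  # leaving the target resets the dwell
        return None

    def mean_lane_deviation(lateral_positions, reference_positions):
        # Mean absolute deviation of the car's lateral position from the reference
        # lane path, the driving-performance measure used for the ISO 26022 task.
        return statistics.mean(abs(p - r) for p, r in zip(lateral_positions, reference_positions))

As a usage example under these assumptions, dwell_select([(410, 220)] * 40, [HudTarget('music', 400, 200, 120, 80)]) would report the music button as selected after roughly one second of steady pointing at 30 Hz.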





Acknowledgements

The funding was provided by Faurecia India Private Ltd. (Grant No. PC99309).

Author information

Corresponding author

Correspondence to Pradipta Biswas.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Prabhakar, G., Ramakrishnan, A., Madan, M. et al. Interactive gaze and finger controlled HUD for cars. J Multimodal User Interfaces 14, 101–121 (2020). https://doi.org/10.1007/s12193-019-00316-9

