Abstract
Modern automotive infotainment systems require drivers to undertake secondary tasks in addition to the primary task of driving. These secondary tasks can significantly distract drivers from driving, thereby reducing safety and increasing cognitive workload. This paper presents an intelligent interactive head-up display (HUD) on the windscreen that does not require drivers to take their eyes off the road while undertaking secondary tasks such as playing music, operating vent controls, or viewing a navigation map. The interactive HUD supports pointing and selection, as in traditional graphical user interfaces, but by tracking the operator's eye gaze or finger movements. Additionally, the system can estimate drivers' cognitive load and distraction level. User studies show that the system improves driving performance, measured as mean deviation from lane in the ISO 26022 lane-change task, compared to a touchscreen system, and that participants could complete ISO 9241 pointing tasks in less than 2 s on average inside a Toyota Etios car.
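The abstract describes selection of on-screen targets by tracked eye gaze. A common way to realise selection without a physical click is dwell-time activation: a target fires when consecutive gaze samples stay inside its bounds for a threshold duration. The sketch below is illustrative only, not the authors' implementation; the 1000 ms dwell threshold, target layout, and class interface are assumptions for demonstration.

```python
# Minimal dwell-based gaze selection sketch (illustrative; not the
# paper's implementation). A target is "selected" when gaze samples
# remain inside its rectangle for at least `dwell_ms` milliseconds.

class DwellSelector:
    def __init__(self, targets, dwell_ms=1000):
        # targets: name -> (x, y, width, height); dwell_ms is an
        # assumed threshold, not a value reported in the paper.
        self.targets = targets
        self.dwell_ms = dwell_ms
        self._current = None      # target the gaze currently rests on
        self._entered_at = None   # timestamp when gaze entered it

    def _hit(self, x, y):
        # Return the name of the target containing (x, y), if any.
        for name, (tx, ty, w, h) in self.targets.items():
            if tx <= x <= tx + w and ty <= y <= ty + h:
                return name
        return None

    def feed(self, x, y, t_ms):
        """Feed one gaze sample; return the target name on selection,
        otherwise None."""
        name = self._hit(x, y)
        if name != self._current:
            # Gaze moved to a new target (or off all targets): restart timer.
            self._current, self._entered_at = name, t_ms
            return None
        if name is not None and t_ms - self._entered_at >= self.dwell_ms:
            # Dwell threshold reached: fire once, then restart the timer
            # so the target does not re-fire on every subsequent sample.
            self._entered_at = t_ms
            return name
        return None
```

In practice such a selector would be fed by the eye tracker's sample stream (e.g. at 60 Hz), with the dwell threshold tuned against the sub-2 s selection times the study reports.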
Acknowledgements
The funding was provided by Faurecia India Private Ltd. (Grant No. PC99309).
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Prabhakar, G., Ramakrishnan, A., Madan, M. et al. Interactive gaze and finger controlled HUD for cars. J Multimodal User Interfaces 14, 101–121 (2020). https://doi.org/10.1007/s12193-019-00316-9