DOI: 10.1145/3448018.3457993 · ETRA Conference Proceedings · Short paper

55 Rides: attention annotated head and gaze data during naturalistic driving

Published: 25 May 2021

Abstract

Trained eye-movement patterns are essential for safe driving. Whether exploring the surrounding traffic or making sure a lane is clear with a shoulder check, quick and effective perception is key to driving safety. Surprisingly though, freely and openly accessible data on gaze behavior during driving remain extremely sparse. The environment inside a vehicle is challenging for eye-tracking technology: illumination changes rapidly (e.g., exiting a tunnel into bright sunlight), proper calibration is difficult, and safety must not be compromised. Moreover, the data available so far were recorded in environments that likely influence viewing behavior, sometimes dramatically (e.g., driving simulators without mirrors or with a limited field of view).
We present crowd-sourced eye-tracking data collected during real-world driving using NIR cameras and illuminators placed inside the drivers’ cabins. We analyze this data with a deep-learning approach to appearance-based gaze estimation; the raw videos are not part of the data set due to legal restrictions. Our data set covers four different drivers in their habitual cars and 55 rides with an average length of 30 minutes. At least three human raters continuously rated each ride with regard to driver attention and vigilance level on a ten-point scale. From the recorded videos we extracted the drivers’ head and eye movements as well as the eye opening angle. We apply a normalization that compensates for the different placements of the driver monitoring camera and demonstrate a baseline for driver attention monitoring based on eye-gaze and head-movement features.
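The camera-placement normalization and the fusion of the per-ride rater scores described above can be sketched as follows. This is a minimal illustration under assumptions, not the authors’ implementation: the function names, the availability of a per-vehicle camera-to-car rotation matrix, and the raters-by-frames rating layout are all hypothetical.

```python
import numpy as np

def normalize_gaze(gaze_cam: np.ndarray, r_cam_to_car: np.ndarray) -> np.ndarray:
    """Rotate unit gaze (or head-direction) vectors from the camera frame
    into a common car-fixed frame, compensating for the driver monitoring
    camera being mounted differently in each vehicle."""
    rotated = gaze_cam @ r_cam_to_car.T  # row vectors: v' = R v
    return rotated / np.linalg.norm(rotated, axis=-1, keepdims=True)

def fuse_ratings(ratings: np.ndarray) -> np.ndarray:
    """Fuse continuous per-rater attention scores (shape: raters x frames,
    ten-point scale) into one label track; with at least three raters the
    median is robust to a single outlier rater."""
    return np.median(ratings, axis=0)

# Hypothetical camera mounted with a 90-degree yaw offset to the car frame.
yaw = np.pi / 2
r = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
gaze = normalize_gaze(np.array([[1.0, 0.0, 0.0]]), r)

labels = fuse_ratings(np.array([[4, 8], [6, 8], [5, 2]]))
```

A baseline attention monitor along these lines would then regress the fused label track from windowed statistics of the normalized head and gaze features.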


Cited By

  • (2024) Gaze Zone Classification for Driving Studies Using YOLOv8 Image Classification. Sensors 24(22), 7254. DOI: 10.3390/s24227254. Online publication date: 13-Nov-2024
  • (2022) On the Use of Distribution-based Metrics for the Evaluation of Drivers’ Fixation Maps Against Spatial Baselines. 2022 Symposium on Eye Tracking Research and Applications, 1–7. DOI: 10.1145/3517031.3529629. Online publication date: 8-Jun-2022
  • (2022) Estimation of Older Driver’s Cognitive Performance and Workload Using Features of Eye Movement and Pupil Response on Test Routes. 2022 26th International Conference Information Visualisation (IV), 155–160. DOI: 10.1109/IV56949.2022.00033. Online publication date: Jul-2022


          Published In

          ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
          May 2021
          232 pages
ISBN: 9781450383455
DOI: 10.1145/3448018

          Publisher

          Association for Computing Machinery

          New York, NY, United States


          Author Tags

          1. datasets
          2. driver attention
          3. gaze detection
          4. neural networks

          Qualifiers

          • Short-paper
          • Research
          • Refereed limited

          Funding Sources

          • Bundesministerium für Bildung und Forschung

          Conference

ETRA '21

          Acceptance Rates

          Overall Acceptance Rate 69 of 137 submissions, 50%


