DOI: 10.1145/3490100.3516463
Work in Progress

Distraction Detection in Automotive Environment using Appearance-based Gaze Estimation

Published: 22 March 2022

ABSTRACT

Distraction detection systems are of great importance in automotive environments because of the primacy of passenger safety. Earlier approaches were confined to indirect methods that inferred visual distraction from driving performance metrics. Recent methods developed dedicated classification models for gaze zone estimation, but their cross-domain performance was not investigated. We adopt a more generic appearance-based gaze estimation approach that makes no assumptions about the setting or the participant. We propose MAGE-Net, which has fewer parameters than state-of-the-art techniques while achieving on-par performance on the MPIIGaze dataset. Using the proposed MAGE-Net, we performed a cross-domain evaluation in an automotive setting with 10 participants. We observed that the gaze region error using MAGE-Net for the interior regions of a car is 15.61 cm and 15.13 cm in the x and y directions, respectively. Building on these results, we demonstrated the capability of the proposed system to detect visual distraction using a driving simulator.
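The abstract's final step, turning estimated gaze coordinates into a distraction flag, can be illustrated with a minimal eyes-off-road dwell check. This is a hedged sketch of a common heuristic, not the paper's actual pipeline: the region bounds, the 30 Hz sampling rate, and the 2-second dwell threshold are illustrative assumptions, and `is_on_road`/`detect_distraction` are hypothetical helper names.

```python
# Sketch: flag visual distraction when estimated gaze stays outside an
# assumed "road" region longer than a dwell threshold. All constants are
# illustrative assumptions, not values reported in the paper.

ROAD_REGION = ((-30.0, 30.0), (-10.0, 20.0))  # assumed x/y bounds in cm
SAMPLE_PERIOD_S = 1 / 30                      # assumed 30 Hz gaze stream
DWELL_THRESHOLD_S = 2.0                       # assumed eyes-off-road cutoff

def is_on_road(gaze_xy):
    """Check whether a gaze point (x, y) falls inside the road region."""
    (xmin, xmax), (ymin, ymax) = ROAD_REGION
    x, y = gaze_xy
    return xmin <= x <= xmax and ymin <= y <= ymax

def detect_distraction(gaze_samples):
    """Return True if any continuous off-road dwell exceeds the threshold."""
    off_road_time = 0.0
    for xy in gaze_samples:
        if is_on_road(xy):
            off_road_time = 0.0  # glance back to the road resets the timer
        else:
            off_road_time += SAMPLE_PERIOD_S
            if off_road_time >= DWELL_THRESHOLD_S:
                return True
    return False
```

In practice, the reported gaze region errors (roughly 15 cm in each direction) suggest the region bounds would need margins of at least that size to avoid spurious off-road classifications.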


Supplemental Material

Distraction_Detection_Demo_SupplementaryVideo.mp4 (MP4, 63.6 MB)


Published in

IUI '22 Companion: Companion Proceedings of the 27th International Conference on Intelligent User Interfaces
March 2022, 142 pages
ISBN: 9781450391450
DOI: 10.1145/3490100

        Copyright © 2022 Owner/Author

        Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


        Qualifiers

        • Work in Progress
        • Research
        • Refereed limited

Acceptance Rates

Overall Acceptance Rate: 746 of 2,811 submissions, 27%
