ABSTRACT
Distraction detection systems are of great importance in the automotive domain due to their role in passenger safety. Earlier approaches were confined to indirect methods, using driving performance metrics to detect visual distraction. Recent methods developed dedicated classification models for gaze zone estimation, but their cross-domain performance was not investigated. We adopt a more generic appearance-based gaze estimation approach that makes no assumption about the setting or the participant. We propose MAGE-Net, which has fewer parameters than state-of-the-art techniques while achieving on-par performance on the MPIIGaze dataset. Using the proposed MAGE-Net, we performed a cross-domain evaluation in an automotive setting with 10 participants. We observed that the gaze region error of MAGE-Net for the interior regions of the car is 15.61 cm and 15.13 cm in the x and y directions, respectively. Using these results, we demonstrated the capability of the proposed system to detect visual distraction in a driving simulator.
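The abstract reports a per-axis gaze region error (in cm) on the car interior. As a hypothetical illustration (not the authors' code), such a per-axis error can be computed as the mean absolute deviation between predicted and ground-truth gaze points on the interior plane:

```python
# Hypothetical sketch: per-axis gaze region error as mean absolute error
# between predicted and ground-truth gaze points (coordinates in cm).
def per_axis_error(predicted, ground_truth):
    """Return (x_error, y_error) as mean absolute deviations in cm."""
    n = len(predicted)
    ex = sum(abs(p[0] - g[0]) for p, g in zip(predicted, ground_truth)) / n
    ey = sum(abs(p[1] - g[1]) for p, g in zip(predicted, ground_truth)) / n
    return ex, ey

# Example with made-up gaze points (cm):
pred = [(10.0, 5.0), (20.0, 15.0)]
true = [(12.0, 4.0), (18.0, 18.0)]
print(per_axis_error(pred, true))  # (2.0, 2.0)
```

The metric names and data here are assumptions for illustration; the paper itself reports 15.61 cm (x) and 15.13 cm (y) over its 10-participant automotive evaluation.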