DOI: 10.1145/3408127.3408179
ICDSP Conference Proceedings · Research article

Eye Tracking in Driving Environment Based on Multichannel Convolutional Neural Network

Published: 10 September 2020

Abstract

Gaze is the most important way for humans to obtain information from the outside world, and it is the most direct and significant cue for analyzing human behavior and intention. In driving environments, eye tracking is commonly used to model a driver's fixations and gaze allocation, which is important for advanced driver assistance systems (ADAS). In this paper, we propose a new eye tracking method for the driving environment based on a multichannel convolutional neural network. First, we build a dataset for driver eye tracking in which each sample includes a left-eye region image, a right-eye region image, and a face region image. The multichannel convolutional neural network is then trained on this dataset. Finally, the driver's gaze zone is estimated with the pre-trained network. Experimental results show that the proposed method achieves 94.60% accuracy for seven-zone gaze estimation, and it can be used in ADAS to analyze driver behavior and detect driver distraction.
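The abstract describes a three-branch design: separate channels process the left-eye, right-eye, and face images, and their features are fused to classify one of seven gaze zones. The paper's actual architecture and weights are not given here, so the following is only a minimal NumPy sketch of that fusion-and-classify idea; the branch feature extractors, image sizes, and feature dimension are hypothetical stand-ins for the trained convolutional channels.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ZONES = 7    # seven gaze zones, as evaluated in the paper
FEAT_DIM = 32  # hypothetical per-branch feature size

def branch_features(img, w):
    """Stand-in for one convolutional channel: flatten, linear map, ReLU."""
    return np.maximum(img.reshape(-1) @ w, 0.0)

def softmax(z):
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical weights; a real system would learn these during training.
w_left  = rng.standard_normal((36 * 60, FEAT_DIM)) * 0.01
w_right = rng.standard_normal((36 * 60, FEAT_DIM)) * 0.01
w_face  = rng.standard_normal((64 * 64, FEAT_DIM)) * 0.01
w_out   = rng.standard_normal((3 * FEAT_DIM, N_ZONES)) * 0.01

def predict_zone(left_eye, right_eye, face):
    # Fuse the three channels by concatenating their features,
    # then classify the fused vector into one of the gaze zones.
    fused = np.concatenate([
        branch_features(left_eye, w_left),
        branch_features(right_eye, w_right),
        branch_features(face, w_face),
    ])
    probs = softmax(fused @ w_out)
    return int(probs.argmax()), probs

# Random stand-ins for cropped eye and face regions.
left  = rng.random((36, 60))
right = rng.random((36, 60))
face  = rng.random((64, 64))

zone, probs = predict_zone(left, right, face)
print(zone, probs.shape)
```

The key point the sketch illustrates is the fusion step: each input region contributes its own feature vector, and the classifier sees the concatenation, so the network can weigh eye appearance and head pose (from the face crop) jointly.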


Cited By

  • (2024) Driver Gaze Zone Estimation Based on Three-Channel Convolution-Optimized Vision Transformer With Transfer Learning. IEEE Sensors Journal, 24(24), 42064–42078. DOI: 10.1109/JSEN.2024.3486373. Online publication date: 15 Dec 2024.
  • (2021) Driver Gaze Zone Estimation via Head Pose Fusion Assisted Supervision and Eye Region Weighted Encoding. IEEE Transactions on Consumer Electronics, 67(4), 275–284. DOI: 10.1109/TCE.2021.3127006. Online publication date: 1 Nov 2021.


    Published In

    ICDSP '20: Proceedings of the 2020 4th International Conference on Digital Signal Processing
    June 2020, 383 pages
    ISBN: 9781450376877
    DOI: 10.1145/3408127
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    In-Cooperation

    • University of Electronic Science and Technology of China

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Deep learning
    2. advanced driver assistance system
    3. driver distraction
    4. eye tracking
    5. gaze zone estimation

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • the Shandong Provincial Key Research & Development Plan
    • the National Natural Science Foundation of China

    Conference

    ICDSP 2020


    Article Metrics

    • Downloads (Last 12 months)7
    • Downloads (Last 6 weeks)0
    Reflects downloads up to 23 Feb 2025
