
Eye Movement and Visual Target Synchronization Level Detection Using Deep Learning

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13151)

Abstract

In recent years, deep learning has been widely used in eye tracking. Eye tracking has been studied as a route to the early diagnosis of neurological and psychological diseases, since it provides a simple, non-invasive, and objective proxy measurement of cognitive function. This project aims to develop a system that automatically assesses the synchronisation between eye movement data and its visual target. To achieve this goal, we employ deep learning models (Points-CNN and Head-CNN) to detect the eye centre location and classify the synchronisation level between the eye movement and the visual target. Moreover, we modify the EYEDIAP dataset to suit our research objective: the video data are used to track the eye movement trajectory, while the visual target movement data are used to extract direction-change windows. Movement feature vectors are then extracted from the eye movement data and the visual target movement data within each direction-change window. Euclidean distance, cosine similarity, and the Jaccard similarity coefficient are used to measure the agreement between the eye and visual target movement vectors. For synchronisation level detection, K-Nearest Neighbors, Support Vector Machine, and Logistic Regression classifiers are investigated.
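The abstract's pipeline (similarity features between paired movement vectors, then a conventional classifier) can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the generalized Jaccard formulation for real-valued vectors, and the synthetic stand-in data are assumptions for illustration only; the paper's exact feature definitions and training setup are not reproduced here.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

def similarity_features(eye_vec, target_vec):
    """Euclidean distance, cosine similarity, and a generalized Jaccard
    coefficient between an eye-movement vector and a target-movement vector."""
    eye_vec = np.asarray(eye_vec, dtype=float)
    target_vec = np.asarray(target_vec, dtype=float)
    euclidean = np.linalg.norm(eye_vec - target_vec)
    cosine = eye_vec @ target_vec / (
        np.linalg.norm(eye_vec) * np.linalg.norm(target_vec) + 1e-12)
    # Generalized Jaccard for real-valued vectors: sum(min)/sum(max) of magnitudes.
    # The paper's exact Jaccard variant is not stated here; this is one common choice.
    num = np.minimum(np.abs(eye_vec), np.abs(target_vec)).sum()
    den = np.maximum(np.abs(eye_vec), np.abs(target_vec)).sum() + 1e-12
    return np.array([euclidean, cosine, num / den])

# Synthetic stand-in for the per-window movement vectors that the real pipeline
# derives from the tracked eye centre and the visual-target trajectory in EYEDIAP.
rng = np.random.default_rng(0)
target = rng.normal(size=(200, 10))
eye_sync = target + 0.1 * rng.normal(size=target.shape)   # well synchronised
eye_desync = rng.normal(size=target.shape)                # poorly synchronised
X = np.vstack([[similarity_features(e, t) for e, t in zip(eye_sync, target)],
               [similarity_features(e, t) for e, t in zip(eye_desync, target)]])
y = np.array([1] * len(target) + [0] * len(target))       # synchronisation labels

# The three classifier families named in the abstract.
for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf"),
            LogisticRegression(max_iter=1000)):
    clf.fit(X, y)
    print(type(clf).__name__, round(clf.score(X, y), 3))  # training accuracy, illustration only
```

In practice the feature rows would come from real direction-change windows and be evaluated with a held-out split rather than training accuracy; the sketch only shows how the three similarity measures feed the three classifiers.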



Author information


Corresponding author

Correspondence to Liuchun Yao.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Yao, L., Park, M., Garg, S., Bai, Q. (2022). Eye Movement and Visual Target Synchronization Level Detection Using Deep Learning. In: Long, G., Yu, X., Wang, S. (eds) AI 2021: Advances in Artificial Intelligence. AI 2022. Lecture Notes in Computer Science, vol 13151. Springer, Cham. https://doi.org/10.1007/978-3-030-97546-3_54


  • DOI: https://doi.org/10.1007/978-3-030-97546-3_54


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-97545-6

  • Online ISBN: 978-3-030-97546-3

  • eBook Packages: Computer Science, Computer Science (R0)
