DOI: 10.1145/3314111.3319829

Getting (more) real: bringing eye movement classification to HMD experiments with equirectangular stimuli

Published: 25 June 2019

Abstract

The classification of eye movements is a fundamental part of eye-tracking research and has been studied since its early days. In recent years, experiments have increasingly shifted towards more immersive scenarios that use eye-tracking-enabled glasses and head-mounted displays (HMDs). In these new scenarios, however, most existing eye movement classification algorithms can no longer be applied robustly because they were designed for monitor-based experiments with regular 2D images and videos. In this paper, we describe two approaches that reduce the artifacts of eye movement classification for 360° videos shown in head-mounted displays. For the first approach, we discuss how decision criteria have to change in 360° space, and we use these criteria to modify five popular algorithms from the literature. The modified algorithms are publicly available at https://web.gin.g-node.org/ioannis.agtzidis/360_em_algorithms. For cases where an existing algorithm cannot be modified, e.g. because it is closed source, we present a second approach that maps the data, instead of the algorithm, to the 360° space. An empirical evaluation of both approaches shows that they significantly reduce the artifacts of the initial algorithm, especially in areas farther from the horizontal midline.
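
The first approach replaces the classifiers' 2D decision criteria with their spherical counterparts; the paper's actual modified implementations are in the repository linked above. Purely as an illustrative sketch (the function names, frame conventions, and haversine formulation below are our own, not code from the paper), gaze speed can be computed from great-circle distances on the viewing sphere rather than Euclidean distances in equirectangular pixel space, which are otherwise inflated towards the poles:

    import numpy as np

    def equirect_to_lonlat(x, y, width, height):
        # Map equirectangular pixel coordinates to longitude/latitude (radians),
        # assuming the frame spans 360 deg horizontally and 180 deg vertically.
        lon = x / width * 2.0 * np.pi - np.pi       # [-pi, pi)
        lat = np.pi / 2.0 - y / height * np.pi      # [+pi/2, -pi/2]
        return lon, lat

    def angular_speed(x, y, t, width, height):
        # Gaze speed in deg/s from great-circle (haversine) distances between
        # consecutive samples, instead of raw pixel distances.
        lon, lat = equirect_to_lonlat(np.asarray(x, float),
                                      np.asarray(y, float), width, height)
        a = (np.sin(np.diff(lat) / 2.0) ** 2
             + np.cos(lat[:-1]) * np.cos(lat[1:])
             * np.sin(np.diff(lon) / 2.0) ** 2)
        dist = 2.0 * np.arcsin(np.sqrt(np.clip(a, 0.0, 1.0)))  # radians
        return np.degrees(dist) / np.diff(np.asarray(t, float))

Speeds computed this way could then drive a standard velocity-threshold classifier such as I-VT, avoiding the spurious saccades that pixel-space velocities produce away from the horizontal midline.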
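
The second approach leaves the algorithm untouched and transforms the recording instead. One plausible mapping, again only a hypothetical sketch and not necessarily the authors' exact procedure, rotates each window of gaze samples so that it is centred on the equator, where the equirectangular projection is least distorted, before the re-projected coordinates are handed to the unmodified 2D algorithm:

    import numpy as np

    def lonlat_to_vec(lon, lat):
        # Unit direction vectors on the viewing sphere.
        return np.stack([np.cos(lat) * np.cos(lon),
                         np.cos(lat) * np.sin(lon),
                         np.sin(lat)], axis=-1)

    def recenter_on_equator(lon, lat):
        # Rotate a window of gaze samples so its mean direction lands on
        # lon = lat = 0, minimizing equirectangular distortion in the window.
        v = lonlat_to_vec(lon, lat)
        m = v.mean(axis=0)
        m /= np.linalg.norm(m)
        target = np.array([1.0, 0.0, 0.0])          # lon = 0, lat = 0
        axis = np.cross(m, target)
        s, c = np.linalg.norm(axis), float(m @ target)
        if s < 1e-12:                               # already (anti-)aligned
            return lon, lat
        k = axis / s
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        R = np.eye(3) + s * K + (1.0 - c) * (K @ K)  # Rodrigues' formula
        w = v @ R.T
        return np.arctan2(w[:, 1], w[:, 0]), np.arcsin(np.clip(w[:, 2], -1, 1))

The rotated longitudes and latitudes can then be scaled back to pixels and passed through, e.g., a closed-source fixation filter, with the resulting event labels transferred back to the original samples.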



      Published In

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019, 623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111

      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. 360° content
      2. event detection
      3. eye movement classification

      Qualifiers

      • Research-article

      Conference

      ETRA '19

      Acceptance Rates

Overall Acceptance Rate: 69 of 137 submissions, 50%
