
Practical Perception-Based Evaluation of Gaze Prediction for Gaze Contingent Rendering

Published: 18 May 2023

Abstract

This paper proposes a novel evaluation framework, termed "critical evaluation periods," for continuous gaze prediction models. The framework emphasizes prediction performance at the moments when accurate gaze prediction matters most to user perception. Grounded in perceptual characteristics of the human visual system such as saccadic suppression, it provides a more practical assessment of gaze prediction performance for gaze-contingent rendering than the dominant sample-by-sample evaluation strategy employed in the literature, which overemphasizes performance during easy-to-predict periods of fixation. Using a case study with a lightweight deep learning gaze prediction model, we observe a significant discrepancy between the prediction accuracy reported under the proposed critical evaluation periods and under the dominant evaluation strategy. Based on these findings, we suggest that the proposed framework is more suitable for evaluating continuous gaze prediction models intended for gaze-contingent rendering applications.
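The paper's precise definition of the critical evaluation periods appears in the full text; the abstract only sketches the contrast. As a rough, hypothetical illustration of that contrast, the NumPy sketch below compares the dominant sample-by-sample error with an error computed only in short windows after each saccade offset, when saccadic suppression has ended and gaze-contingent rendering errors become perceptible again. The 100 ms window, the 250 Hz sampling rate, and all function names here are illustrative assumptions, not the authors' published parameters.

```python
import numpy as np

def angular_error(pred, true):
    # Per-sample error in degrees of visual angle; pred and true are
    # (N, 2) arrays of horizontal and vertical gaze angles.
    return np.linalg.norm(pred - true, axis=1)

def sample_by_sample_error(pred, true):
    # Dominant strategy in the literature: average over every sample,
    # so easy-to-predict fixation samples dominate the score.
    return angular_error(pred, true).mean()

def critical_period_error(pred, true, is_saccade, fs=250, window_ms=100):
    # Hypothetical critical-period variant: average only over a short
    # window after each saccade offset, when saccadic suppression has
    # ended and rendering errors are perceptible again.
    err = angular_error(pred, true)
    win = int(round(window_ms * fs / 1000))
    # Saccade offsets: True -> False transitions in the saccade mask.
    offsets = np.flatnonzero(is_saccade[:-1] & ~is_saccade[1:]) + 1
    critical = np.zeros(err.size, dtype=bool)
    for off in offsets:
        critical[off:off + win] = True
    return err[critical].mean() if critical.any() else float("nan")

# Toy demo: 1 s of 250 Hz gaze with one 40 ms, 10-degree saccade.
fs = 250
true = np.zeros((fs, 2))
true[115:125, 0] = np.linspace(0.0, 10.0, 10)  # saccade ramp
true[125:, 0] = 10.0                           # new fixation
pred = true + np.random.default_rng(0).normal(0.0, 0.3, true.shape)
pred[115:135] += 2.0                           # model lags near the saccade
is_sacc = np.zeros(fs, dtype=bool)
is_sacc[115:125] = True
print(sample_by_sample_error(pred, true))          # fixation-dominated, small
print(critical_period_error(pred, true, is_sacc))  # post-saccade window, larger
```

On the toy data, where the model errs mainly around the saccade, the two metrics diverge in the direction the abstract reports: the fixation-dominated sample-by-sample average looks considerably better than the critical-period average.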

Supplemental Material

MP4 File: Presentation video


Cited By

  • (2025) Oculomotor Plant Mathematical Model in Kalman Filter Form With Peak Velocity-Based Neural Pulse for Continuous Gaze Prediction. IEEE Access, Vol. 13 (2025), 11544-11559. https://doi.org/10.1109/ACCESS.2025.3528104

Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue ETRA
May 2023, 234 pages
EISSN: 2573-0142
DOI: 10.1145/3597645
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 18 May 2023
Published in PACMHCI Volume 7, Issue ETRA


Author Tags

  1. eye tracking
  2. foveated rendering
  3. gaze prediction
  4. machine learning

Qualifiers

  • Research-article

Article Metrics

  • Downloads (last 12 months): 128
  • Downloads (last 6 weeks): 18

Reflects downloads up to 16 Feb 2025.
