Research article
DOI: 10.1145/3206343.3206351

3D gaze estimation in the scene volume with a head-mounted eye tracker

Published: 15 June 2018

Abstract

Most applications involving gaze-based interaction rely on estimation techniques that map gaze data to corresponding targets on a 2D surface. However, in Virtual Reality (VR) and Augmented Reality (AR) environments, interaction occurs mostly in a volumetric space, which poses a challenge to such techniques. Accurate point-of-regard (PoR) estimation is of particular importance to AR applications, since most known setups are prone to parallax error and target ambiguity. In this work, we expose the limitations of widely used techniques for PoR estimation in 3D and propose a new calibration procedure that uses an uncalibrated head-mounted binocular eye tracker coupled with an RGB-D camera to track 3D gaze within the scene volume. We conducted a study to evaluate our setup with real-world data using a geometric and an appearance-based method. Our results show that accurate estimation in this setting is still a challenge, though some gaze-based interaction techniques in 3D should be possible.
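
The paper evaluates both a geometric and an appearance-based method for PoR estimation. As a rough illustration of the geometric (vergence-based) idea only, and not the calibration procedure proposed by the authors, the sketch below estimates a 3D point of regard as the midpoint of the shortest segment connecting the two eyes' gaze rays. The function name, the assumed inputs (eye centers and gaze directions already expressed in a common scene coordinate frame), and the NumPy dependency are illustrative assumptions.

import numpy as np

def point_of_regard_from_vergence(o_left, d_left, o_right, d_right, eps=1e-9):
    """Midpoint of the shortest segment between the two eyes' gaze rays.

    o_*: 3D ray origins (eye centers) in scene coordinates.
    d_*: 3D gaze directions (need not be unit length).
    Returns None when the rays are nearly parallel, i.e. depth is undefined.
    """
    o_l, d_l = np.asarray(o_left, float), np.asarray(d_left, float)
    o_r, d_r = np.asarray(o_right, float), np.asarray(d_right, float)
    w0 = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b              # approaches 0 for parallel rays
    if abs(denom) < eps:
        return None
    s = (b * e - c * d) / denom        # parameter along the left ray
    t = (a * e - b * d) / denom        # parameter along the right ray
    p_l = o_l + s * d_l                # closest point on the left ray
    p_r = o_r + t * d_r                # closest point on the right ray
    return 0.5 * (p_l + p_r)           # estimated 3D point of regard

# Example: eyes 6 cm apart, both fixating a target 1 m straight ahead.
por = point_of_regard_from_vergence([-0.03, 0, 0], [0.03, 0, 1],
                                    [0.03, 0, 0], [-0.03, 0, 1])
# por is approximately [0, 0, 1]

Because small angular errors in the gaze directions translate into large depth errors when the rays are nearly parallel, purely geometric estimates of this kind degrade quickly with target distance, which is consistent with the difficulty the abstract reports.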



Published In

COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction
June 2018
69 pages
ISBN: 9781450357906
DOI: 10.1145/3206343
  • General Chairs:
  • Carlos Morimoto,
  • Thies Pfeiffer
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. 3D dataset
  2. calibration
  3. gaze estimation
  4. head-mounted eye tracking

Qualifiers

  • Research-article

Conference

ETRA '18

Article Metrics

  • Downloads (last 12 months): 89
  • Downloads (last 6 weeks): 13
Reflects downloads up to 10 Feb 2025

Cited By

  • (2025) Gaze depth estimation using vestibulo-ocular reflex and GDENet for 3D target disambiguation. Displays, 102978. https://doi.org/10.1016/j.displa.2025.102978
  • (2024) Towards Adapting CLIP for Gaze Object Prediction. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 155-164. https://doi.org/10.1145/3698132
  • (2024) Self-Calibrating Gaze Estimation With Optical Axes Projection for Head-Mounted Eye Tracking. IEEE Transactions on Industrial Informatics 20(2), 1397-1407. https://doi.org/10.1109/TII.2023.3276322
  • (2024) A Hybrid Gaze Distance Estimation via Cross-Reference of Vergence and Depth. IEEE Access 12, 182618-182626. https://doi.org/10.1109/ACCESS.2024.3510357
  • (2024) A Real-Time Eye Gaze Tracking Based Digital Mouse. Innovative Mobile and Internet Services in Ubiquitous Computing, 39-46. https://doi.org/10.1007/978-3-031-64766-6_5
  • (2023) GlassesValidator: A data quality tool for eye tracking glasses. Behavior Research Methods 56(3), 1476-1484. https://doi.org/10.3758/s13428-023-02105-5
  • (2023) An Improved Unscented Kalman Filtering Combined with Feature Triangle for Head Position Tracking. Electronics 12(12), 2665. https://doi.org/10.3390/electronics12122665
  • (2023) Driver Gaze Fixation and Pattern Analysis in Safety Critical Events. 2023 IEEE Intelligent Vehicles Symposium (IV), 1-8. https://doi.org/10.1109/IV55152.2023.10186718
  • (2022) Estimating 3D spatiotemporal point of regard: a device evaluation. Journal of the Optical Society of America A 39(8), 1343. https://doi.org/10.1364/JOSAA.457663
  • (2022) Gaze-Vergence-Controlled See-Through Vision in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 28(11), 3843-3853. https://doi.org/10.1109/TVCG.2022.3203110
