DOI: 10.1145/3450341.3458880
Short Paper

Sub-centimeter 3D gaze vector accuracy on real-world tasks: an investigation of eye and motion capture calibration routines

Published: 25 May 2021

Abstract

Measuring where people look in real-world tasks has never been easier, but analyzing the resulting data remains laborious. One solution integrates head-mounted eye tracking with motion capture, but no best practice exists regarding what calibration data to collect. Here, we compared four ~1 min calibration routines used to train linear regression gaze vector models and examined how the coordinate system, the eye data used, and the location of fixation affected gaze vector accuracy on three trial types: calibration, validation (static fixations to task-relevant locations), and task (naturally occurring fixations during object interaction). Impressively, predicted gaze vectors show ~1 cm of error when looking straight ahead toward objects during natural arm's-length interaction. This result was achieved by predicting fixations in a Spherical coordinate frame from the best monocular eye data and, surprisingly, depends little on the calibration routine.
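For readers who want to prototype the modeling step described above, the sketch below fits a linear regression from monocular pupil coordinates to a gaze direction expressed as spherical angles, and then converts the prediction to a unit 3D gaze vector. It is a minimal illustration only: the synthetic calibration data, variable names, and the pupil-to-angle mapping are assumptions for demonstration, not the authors' pipeline, which trains on motion-capture-derived fixation targets collected during ~1 min calibration routines.

```python
# Minimal sketch (not the paper's code): linear regression mapping
# monocular pupil coordinates to gaze direction in spherical angles
# (azimuth, elevation), then conversion to a unit 3D gaze vector.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical calibration data: normalized pupil positions (x, y) from
# one eye, and the "true" gaze direction to each calibration target in
# head-centred spherical coordinates (radians). In the paper, the targets
# would come from motion-capture marker positions, not synthetic noise.
pupil_xy = rng.uniform(-1, 1, size=(200, 2))
true_azimuth = 0.6 * pupil_xy[:, 0] + 0.02 * rng.normal(size=200)
true_elevation = 0.5 * pupil_xy[:, 1] + 0.02 * rng.normal(size=200)
targets = np.column_stack([true_azimuth, true_elevation])

# One multi-output linear model: [azimuth, elevation] ~ [pupil_x, pupil_y]
model = LinearRegression().fit(pupil_xy, targets)

# Predict spherical gaze angles for a new pupil sample, then convert to a
# unit gaze vector that could be intersected with tracked object positions.
az, el = model.predict(np.array([[0.1, -0.2]]))[0]
gaze_vec = np.array([np.cos(el) * np.sin(az),   # x (rightward)
                     np.sin(el),                 # y (upward)
                     np.cos(el) * np.cos(az)])   # z (forward)
print("predicted gaze vector:", gaze_vec)
```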

Information

Published In

ETRA '21 Adjunct: ACM Symposium on Eye Tracking Research and Applications
May 2021
78 pages
ISBN: 9781450383578
DOI: 10.1145/3450341
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 25 May 2021

Author Tags

  1. calibration
  2. eye tracking
  3. gaze vector
  4. linear regression
  5. motion capture

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • Canadian Foundation for Innovation
  • NSERC Discovery Grant

Conference

ETRA '21

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

