DOI: 10.1145/2638728.2641693

Compensation of head movements in mobile eye-tracking data using an inertial measurement unit

Published: 13 September 2014

Abstract

Analysis of eye movements recorded with a mobile eye-tracker is difficult because the eye-tracking data are severely affected by simultaneous head and body movements. Automatic analysis methods developed for remote- and tower-mounted eye-trackers do not account for these movements and are therefore unsuitable for data in which head and body movements are also present. As a result, data recorded with a mobile eye-tracker are often analyzed manually. In this work, we investigate how simultaneous recordings of eye and head movements can be employed to isolate the motion of the eye in the eye-tracking data. We recorded eye-in-head movements with a mobile eye-tracker and head movements with an Inertial Measurement Unit (IMU). Preliminary results show that compensating the eye-tracking data with the estimated head orientation reduced the standard deviation of the data during vestibulo-ocular reflex (VOR) eye movements from 8.0° to 0.9° in the vertical direction and from 12.9° to 0.6° in the horizontal direction. This suggests that a head-compensation algorithm based on IMU data can be used to isolate the movements of the eye and thereby simplify the analysis of data recorded with a mobile eye-tracker.
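The compensation idea described above can be sketched in a few lines. This is a hypothetical one-dimensional illustration, not the authors' implementation: it simulates a VOR episode in which the eye counter-rotates an oscillating head while gaze stays fixed on a static target, then adds the (here, perfectly known) head orientation back onto the eye-in-head signal. All signal names and parameters are invented for the example; in practice the head orientation would come from an IMU orientation filter and the rotation would be three-dimensional.

```python
import numpy as np

# Hypothetical sketch (not the paper's implementation): during VOR the eye
# counter-rotates the head, so the raw eye-in-head signal varies strongly
# even though gaze in the world is stable.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)          # 2 s of samples

head_yaw = 15.0 * np.sin(2.0 * np.pi * 1.0 * t)   # oscillating head (deg)
gaze_world = 0.5 * rng.standard_normal(t.size)    # fixation on a static target
eye_in_head = gaze_world - head_yaw               # VOR: eye counters the head

# Compensation step: eye-in-head + head orientation = gaze in world coordinates.
compensated = eye_in_head + head_yaw

print(f"raw horizontal std:         {eye_in_head.std():.1f} deg")
print(f"compensated horizontal std: {compensated.std():.1f} deg")
```

After compensation the signal variance reflects only the (small) gaze jitter, mirroring the kind of standard-deviation reduction reported in the abstract.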


    Published In

    UbiComp '14 Adjunct: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication
    September 2014
    1409 pages
    ISBN:9781450330473
    DOI:10.1145/2638728

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. eye-tracking
    2. head movement measurement
    3. inertial measurement unit
    4. signal processing

    Qualifiers

    • Research-article

    Conference

    UbiComp '14
    UbiComp '14: The 2014 ACM Conference on Ubiquitous Computing
    September 13 - 17, 2014
    Seattle, Washington, United States

    Acceptance Rates

    Overall Acceptance Rate 764 of 2,912 submissions, 26%

