Research Article · DOI: 10.1145/2638728.2641693

Compensation of head movements in mobile eye-tracking data using an inertial measurement unit

Published: 13 September 2014

ABSTRACT

Analysis of eye movements recorded with a mobile eye-tracker is difficult because the eye-tracking data are severely affected by simultaneous head and body movements. Automatic analysis methods developed for remote- and tower-mounted eye-trackers do not take this into account and are therefore not suitable for data in which head and body movements are also present. As a result, data recorded with a mobile eye-tracker are often analyzed manually. In this work, we investigate how simultaneous recordings of eye and head movements can be used to isolate the motion of the eye in the eye-tracking data. We recorded eye-in-head movements with a mobile eye-tracker and head movements with an inertial measurement unit (IMU). Preliminary results show that compensating the eye-tracking data with the estimated head orientation reduced the standard deviation of the data during vestibulo-ocular reflex (VOR) eye movements from 8.0° to 0.9° in the vertical direction and from 12.9° to 0.6° in the horizontal direction. This suggests that a head-compensation algorithm based on IMU data can be used to isolate the movements of the eye and thereby simplify the analysis of data recorded with a mobile eye-tracker.
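As a rough illustration of the kind of compensation described in the abstract (not the authors' actual implementation), the sketch below rotates eye-in-head gaze angles into a world-fixed frame using a head-orientation quaternion estimated from the IMU. The function name, the axis convention (x forward, y left, z up), and the quaternion order (w, x, y, z) are assumptions made for this example.

```python
import numpy as np

def compensate_head_motion(gaze_yaw_deg, gaze_pitch_deg, head_quat):
    """Rotate eye-in-head gaze directions into a world-fixed frame.

    gaze_yaw_deg, gaze_pitch_deg : gaze angles in the head frame (degrees), shape (N,)
    head_quat : head orientation in the world frame as unit quaternions
                (w, x, y, z), shape (N, 4), e.g. estimated from the IMU
    Returns (world_yaw_deg, world_pitch_deg), each of shape (N,).
    """
    yaw = np.radians(np.asarray(gaze_yaw_deg, dtype=float))
    pitch = np.radians(np.asarray(gaze_pitch_deg, dtype=float))
    q = np.asarray(head_quat, dtype=float)

    # Unit gaze vector in the head frame (assumed axes: x forward, y left, z up).
    g_head = np.stack([np.cos(pitch) * np.cos(yaw),
                       np.cos(pitch) * np.sin(yaw),
                       np.sin(pitch)], axis=-1)

    # Convert each head quaternion to a rotation matrix and rotate the gaze vector.
    w, x, y, z = q[:, 0], q[:, 1], q[:, 2], q[:, 3]
    R = np.empty((len(w), 3, 3))
    R[:, 0, :] = np.stack([1 - 2*(y**2 + z**2), 2*(x*y - w*z), 2*(x*z + w*y)], axis=-1)
    R[:, 1, :] = np.stack([2*(x*y + w*z), 1 - 2*(x**2 + z**2), 2*(y*z - w*x)], axis=-1)
    R[:, 2, :] = np.stack([2*(x*z - w*y), 2*(y*z + w*x), 1 - 2*(x**2 + y**2)], axis=-1)
    g_world = np.einsum('nij,nj->ni', R, g_head)

    # Back to yaw/pitch angles, now expressed in the world frame.
    world_yaw = np.degrees(np.arctan2(g_world[:, 1], g_world[:, 0]))
    world_pitch = np.degrees(np.arcsin(np.clip(g_world[:, 2], -1.0, 1.0)))
    return world_yaw, world_pitch
```

With head-fixed gaze angles and synchronized IMU orientation estimates, the returned world-frame angles should stay approximately constant during VOR episodes, which is the effect reflected in the reported reduction in standard deviation.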


Published in

UbiComp '14 Adjunct: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication
September 2014
1409 pages
ISBN: 9781450330473
DOI: 10.1145/2638728

      Copyright © 2014 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Acceptance Rates

Overall Acceptance Rate: 764 of 2,912 submissions, 26%

