DOI: 10.1145/2168556.2168561
Research Article

Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion-capture systems

Published: 28 March 2012

Abstract

We implemented a system, the VICON-EyeTracking Visualizer, that combines mobile eye-tracking data with motion-capture data to calculate and visualize the 3D gaze vector within the motion-capture coordinate system. To ensure that both devices were temporally synchronized, we used software we had previously developed. Reflective markers placed on objects in the scene make their positions known to the motion-capture system, and spatially synchronizing the eye tracker with the motion-capture system allows us to compute automatically how often and where fixations occur, overcoming the time-consuming and error-prone nature of traditional manual annotation. We evaluated our approach by comparing its output for a simple looking task and a more complex grasping task against the average results produced by manual annotation. Preliminary data reveal that, for the looking task, the program differed from the average manual annotation by only about 3 percent with regard to the number of fixations and the cumulative fixation duration at each point in the scene. For the more complex grasping task, the results depend on object size: for larger objects agreement was good (differences below 16 percent, or 950 ms), but it degraded for smaller objects, where more saccades land near object boundaries. The advantages of our approach are easy user calibration, unrestricted body movement (thanks to the mobile eye-tracking system), and compatibility with any wearable eye tracker and marker-based motion-tracking system. Extending existing approaches, our system can also monitor fixations on moving objects. The automatic analysis of gaze and movement data in complex 3D scenes is applicable to a variety of research domains, e.g., human-computer interaction, virtual reality, and grasping and gesture research.
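The paper itself includes no code, but the core computation the abstract describes can be sketched: transform a calibrated eye-in-head gaze direction into the motion-capture (world) frame using the head pose from the mocap system, then test which marked object the resulting gaze ray points at. The sketch below is a minimal illustration under assumed conventions (head orientation as a 3x3 rotation matrix, gaze as a unit vector in the head frame); the function names, the 3-degree angular tolerance, and the eye-offset calibration are illustrative assumptions, not details from the paper.

```python
import numpy as np

def gaze_ray_world(head_pos, head_rot, gaze_dir_head, eye_offset_head):
    """Transform an eye-in-head gaze direction into the motion-capture
    (world) frame.

    head_pos        : (3,) head position in world coordinates (from mocap)
    head_rot        : (3, 3) rotation matrix, head frame -> world frame
    gaze_dir_head   : (3,) unit gaze direction in the head frame (from the
                      calibrated eye tracker)
    eye_offset_head : (3,) eye position relative to the head markers,
                      expressed in the head frame (from user calibration)

    Returns the gaze ray origin and unit direction in world coordinates.
    """
    origin = head_pos + head_rot @ eye_offset_head
    direction = head_rot @ gaze_dir_head
    return origin, direction / np.linalg.norm(direction)

def fixated_object(origin, direction, objects, max_angle_deg=3.0):
    """Return the name of the marked object closest to the gaze ray,
    or None if no object lies within the angular tolerance.

    objects : dict mapping object name -> (3,) marker position in world
              coordinates (from the mocap system)
    """
    best, best_angle = None, np.deg2rad(max_angle_deg)
    for name, pos in objects.items():
        to_obj = pos - origin
        dist = np.linalg.norm(to_obj)
        if dist == 0:
            continue
        # Angle between the gaze ray and the direction to the object.
        cos_a = np.clip(direction @ (to_obj / dist), -1.0, 1.0)
        angle = np.arccos(cos_a)
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Hypothetical single frame: head at 1.6 m, looking down the world x-axis.
head_pos = np.array([0.0, 0.0, 1.6])
head_rot = np.eye(3)
origin, direction = gaze_ray_world(
    head_pos, head_rot,
    gaze_dir_head=np.array([1.0, 0.0, 0.0]),
    eye_offset_head=np.array([0.08, 0.0, 0.0]),
)
objects = {"cup": np.array([1.5, 0.02, 1.6]), "ball": np.array([1.0, 0.5, 1.2])}
print(fixated_object(origin, direction, objects))  # -> "cup"
```

With the two data streams temporally synchronized, counting fixations on an object then reduces to grouping consecutive frames for which the same object is returned and keeping groups whose total duration exceeds a minimum fixation threshold (commonly on the order of 100-200 ms); because the object positions come from the mocap markers at every frame, the same test works for moving objects.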




Published In

ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2012, 420 pages
ISBN: 9781450312219
DOI: 10.1145/2168556

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. 3D gaze vector
      2. eye tracking
      3. gaze tracking
      4. motion tracking


Conference

ETRA '12: Eye Tracking Research and Applications
March 28-30, 2012
Santa Barbara, California

      Acceptance Rates

      Overall Acceptance Rate 69 of 137 submissions, 50%



