DOI: 10.1145/3450341.3458486
Public Access

VEDBViz: The Visual Experience Database Visualization and Interaction Tool

Published: 25 May 2021

ABSTRACT

Mobile, simultaneous tracking of both the head and eyes is typically achieved by integrating separate head and eye tracking systems, because off-the-shelf combined solutions do not yet exist. Similarly, joint visualization and analysis of head and eye movement data are not possible with standard software packages, because these were designed to support either head or eye tracking in isolation. Thus, there is a need for software that supports joint analysis of head and eye data to characterize and investigate topics such as head-eye coordination and reconstruction of how the eye moves in space. To address this need, we have begun developing VEDBViz, which supports simultaneous graphing and animation of head and eye movement data recorded with the Intel RealSense T265 and Pupil Core, respectively. We describe current functionality as well as features and applications that are still in development.
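The core computation behind reconstructing gaze in world coordinates from these two data streams can be illustrated with a minimal sketch. The Python snippet below is hypothetical: quat_rotate and gaze_in_world are illustrative names, not part of the VEDBViz, RealSense, or Pupil APIs, and it assumes the head-tracker and eye-tracker frames have already been aligned by an extrinsic calibration. It simply rotates an eye-in-head gaze vector by a head-pose quaternion of the kind the T265 reports.

    import numpy as np

    def quat_rotate(q, v):
        """Rotate vector v by unit quaternion q = (w, x, y, z)."""
        w, x, y, z = q
        u = np.array([x, y, z])
        v = np.asarray(v, dtype=float)
        # Expansion of q * v * conj(q) for a unit quaternion.
        return (2.0 * np.dot(u, v) * u
                + (w * w - np.dot(u, u)) * v
                + 2.0 * w * np.cross(u, v))

    def gaze_in_world(head_quat, gaze_in_head):
        """Map a gaze direction from the head frame to world coordinates.

        head_quat: world-from-head rotation as (w, x, y, z), e.g. a
        T265 pose sample. gaze_in_head: gaze direction in the head
        frame, e.g. a Pupil Core gaze normal after extrinsic alignment.
        """
        g = np.asarray(gaze_in_head, dtype=float)
        g = g / np.linalg.norm(g)  # normalize to a unit direction
        return quat_rotate(np.asarray(head_quat, dtype=float), g)

    # Example: head yawed 90 degrees about the vertical (y) axis while
    # the eye looks straight ahead in the head frame (-z convention).
    head_quat = (np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0)
    print(gaze_in_world(head_quat, [0.0, 0.0, -1.0]))  # approx. [-1, 0, 0]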


Published in

ETRA '21 Adjunct: ACM Symposium on Eye Tracking Research and Applications
May 2021, 78 pages
ISBN: 9781450383578
DOI: 10.1145/3450341

Copyright © 2021 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States

Qualifiers

• Short paper
• Research
• Refereed limited

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions, 50%

