ABSTRACT
Mobile, simultaneous tracking of both the head and eyes is typically achieved by integrating separate head and eye tracking systems, because off-the-shelf combined solutions do not yet exist. Similarly, joint visualization and analysis of head and eye movement data is not possible with standard software packages, which were designed to support either head or eye tracking in isolation. There is therefore a need for software that supports joint analysis of head and eye data to characterize and investigate topics such as head-eye coordination and reconstruction of eye movement in space. To address this need, we have begun developing VEDBViz, which supports simultaneous graphing and animation of head and eye movement data recorded with the Intel RealSense T265 and Pupil Core, respectively. We describe current functionality as well as features and applications that are still in development.
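Joint graphing and animation of the two streams presupposes putting head samples (from the RealSense T265) and gaze samples (from Pupil Core) on a common timeline, since the devices record at different rates. The paper does not specify VEDBViz's alignment method; the sketch below is only an illustration of one common approach, nearest-timestamp matching, with hypothetical sample rates and variable names:

```python
import numpy as np

def align_streams(head_ts, gaze_ts):
    """For each gaze timestamp, return the index of the nearest head timestamp.

    head_ts, gaze_ts: 1-D arrays of timestamps in seconds, assumed sorted
    and already expressed on a common clock.
    """
    idx = np.searchsorted(head_ts, gaze_ts)      # insertion points into head_ts
    idx = np.clip(idx, 1, len(head_ts) - 1)      # keep both neighbors in bounds
    left = head_ts[idx - 1]
    right = head_ts[idx]
    # Step back one index wherever the left neighbor is closer in time.
    idx -= (gaze_ts - left) < (right - gaze_ts)
    return idx

# Example: a 200 Hz head-pose stream and a 120 Hz gaze stream over 1 s.
head_ts = np.arange(0.0, 1.0, 1 / 200)
gaze_ts = np.arange(0.0, 1.0, 1 / 120)
nearest = align_streams(head_ts, gaze_ts)
```

With this pairing, each gaze sample can be animated against the head pose recorded closest to it in time; the residual mismatch is bounded by half the head stream's sample period (2.5 ms at 200 Hz). Interpolating head pose to the gaze timestamps is an alternative when smoother alignment is needed.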