ABSTRACT
Gaze tracking in psychological, cognitive, and user interaction studies has recently evolved toward mobile solutions, as they enable direct assessment of users' visual attention in natural environments and in augmented and virtual reality (AR/VR) applications. Productive analysis and prediction of user actions from gaze data require a multidisciplinary effort involving experts in cognitive and behavioral sciences, machine vision, and machine learning. This workshop brings together a cross-domain group of researchers and practitioners to (i) discuss and contribute to the problem of using mobile gaze tracking to infer user action, (ii) advance the sharing of data, analysis algorithms, and device solutions, and (iii) increase understanding of the behavioral aspects of gaze-action sequences in natural environments and AR/VR applications.