ABSTRACT
Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes, independently of the user's position and orientation relative to the display. In a user study with 12 participants we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
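The core mechanism the abstract describes, relating the eye tracker's scene-camera view of a display to the display's own coordinate system via natural feature tracking, amounts to estimating a planar homography from point correspondences and projecting the gaze point through it. The sketch below is not the authors' implementation; it is a minimal illustration of that idea using a plain direct linear transform (DLT) in NumPy, and the function names `homography_dlt` and `project` are illustrative assumptions (a real system would obtain the correspondences from a feature matcher such as SIFT or FREAK and robustify the estimate with RANSAC).

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src via the direct
    linear transform. src, dst: (N, 2) arrays of >= 4 correspondences
    (e.g. matched natural features: scene-camera pixels -> display pixels)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on the
        # 9 entries of H (written as a vector h, with A @ h = 0).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary scale

def project(H, pt):
    """Map a 2D point (e.g. the estimated gaze point in the scene-camera
    image) into display coordinates via the homography H."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]  # dehomogenize
```

With such a mapping in place, a gaze estimate in the head-mounted tracker's scene image can be projected onto any tracked display without re-calibrating the eye tracker per display, which is the property the paper evaluates.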