DOI: 10.1145/2807442.2807479

GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays

Published: 05 November 2015

ABSTRACT

Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy in such situations remains a significant challenge. In this paper, we present GazeProjector, a system that combines (1) natural feature tracking on displays to determine the mobile eye tracker's position relative to a display with (2) accurate point-of-gaze estimation. GazeProjector allows for seamless gaze estimation and interaction on multiple displays of arbitrary sizes, independently of the user's position and orientation relative to the display. In a user study with 12 participants, we compare GazeProjector to established methods (here: visual on-screen markers and a state-of-the-art video-based motion capture system). We show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration for each variation. Our system represents an important step towards the vision of pervasive gaze-based interfaces.
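To make the approach concrete, the sketch below illustrates the underlying idea in Python/OpenCV: natural features are matched between the tracker's scene-camera frame and a display's current content, a homography is estimated from the correspondences, and the scene-camera gaze estimate is transformed through that homography to obtain an on-screen gaze point. This is a minimal illustration under our own assumptions (ORB features, a brute-force matcher, and the function name project_gaze are ours), not the authors' implementation.

```python
import cv2
import numpy as np

def project_gaze(scene_frame, screen_capture, gaze_px):
    """Map a gaze point from scene-camera pixels to display pixels.

    scene_frame:    frame from the head-mounted tracker's scene camera (BGR)
    screen_capture: content currently shown on the target display (BGR)
    gaze_px:        (x, y) gaze estimate in scene-camera coordinates
    Returns (x, y) in display coordinates, or None if matching fails.
    """
    gray_scene = cv2.cvtColor(scene_frame, cv2.COLOR_BGR2GRAY)
    gray_screen = cv2.cvtColor(screen_capture, cv2.COLOR_BGR2GRAY)

    # Detect and describe natural features in both images
    # (ORB is an illustrative choice, not necessarily the paper's).
    orb = cv2.ORB_create(nfeatures=2000)
    kp_s, des_s = orb.detectAndCompute(gray_scene, None)
    kp_d, des_d = orb.detectAndCompute(gray_screen, None)
    if des_s is None or des_d is None:
        return None

    # Match binary descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_s, des_d), key=lambda m: m.distance)[:200]
    if len(matches) < 8:
        return None  # too few correspondences for a reliable homography

    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly estimate the scene-camera-to-display homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Transform the gaze estimate through the same homography.
    projected = cv2.perspectiveTransform(np.float32([[gaze_px]]), H)
    return tuple(projected[0, 0])
```

In a multi-display setting, one could run the same matching against each display's content and treat the display with the most inliers as the one currently in view, yielding its gaze point without per-display re-calibration.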


Supplemental Material

p395.mp4 (MP4, 67.2 MB)



Published in

UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
November 2015, 686 pages
ISBN: 9781450337793
DOI: 10.1145/2807442

Copyright © 2015 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Qualifiers

research-article

Acceptance Rates

UIST '15 Paper Acceptance Rate: 70 of 297 submissions, 24%
Overall Acceptance Rate: 842 of 3,967 submissions, 21%
