Research Article · Best Student Paper
DOI: 10.1145/2818346.2820752

Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions

Published: 9 November 2015

ABSTRACT

Humans rely extensively on eye gaze and hand manipulation in their everyday activities. Most often, users gaze at an object to perceive it and then use their hands to manipulate it. We propose a multimodal approach, combining gaze with free-space gestures, to enable rapid, precise and expressive touch-free interactions. We show that the two input methods are highly complementary, mitigating the imprecision and limited expressivity of gaze-alone systems and the slow targeting of gesture-alone systems. We extend an existing interaction taxonomy that naturally divides the gaze+gesture interaction space, and populate it with a series of example interaction techniques that illustrate the character and utility of each method. We contextualize these techniques in three example scenarios. In our user study, we pit our approach against five contemporary input approaches; results show that gaze+gesture can outperform systems using gaze or gesture alone and, in general, approach the performance of "gold standard" input systems such as the mouse and trackpad.



Published in

ICMI '15: Proceedings of the 2015 ACM on International Conference on Multimodal Interaction
November 2015, 678 pages
ISBN: 9781450339124
DOI: 10.1145/2818346

Copyright © 2015 ACM. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ICMI '15 paper acceptance rate: 52 of 127 submissions (41%). Overall acceptance rate: 453 of 1,080 submissions (42%).
