ABSTRACT
Humans rely extensively on eye gaze and hand manipulation in their everyday activities: most often, a person gazes at an object to perceive it and then uses their hands to manipulate it. We propose a multimodal, gaze plus free-space gesture approach that enables rapid, precise, and expressive touch-free interactions. We show that the two input modalities are highly complementary, mitigating the imprecision and limited expressivity of gaze-alone systems and the slow targeting of gesture-alone systems. We extend an existing interaction taxonomy that naturally divides the gaze+gesture interaction space, and populate it with a series of example interaction techniques that illustrate the character and utility of each method. We contextualize these techniques in three example scenarios. In a user study, we pit our approach against five contemporary alternatives; results show that gaze+gesture can outperform systems using gaze or gesture alone and, in general, approaches the performance of "gold standard" input devices such as the mouse and trackpad.
Index Terms
- Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions