DOI: 10.1145/2370216.2370369

Enhanced gaze interaction using simple head gestures

Published: 05 September 2012

Abstract

We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which proved accurate enough to recognize nodding and the left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants identified nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, considering multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could provide a natural and more accurate interaction method.



Published In

UbiComp '12: Proceedings of the 2012 ACM Conference on Ubiquitous Computing
September 2012
1268 pages
ISBN:9781450312240
DOI:10.1145/2370216

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. dwell time
  2. eye tracking
  3. head gestures
  4. selection

Qualifiers

  • Research-article

Conference

UbiComp '12: The 2012 ACM Conference on Ubiquitous Computing
September 5–8, 2012
Pittsburgh, Pennsylvania

Acceptance Rates

UbiComp '12 paper acceptance rate: 58 of 301 submissions, 19%
Overall acceptance rate: 764 of 2,912 submissions, 26%



Article Metrics

  • Downloads (last 12 months): 62
  • Downloads (last 6 weeks): 8
Reflects downloads up to 05 Mar 2025


Citations

Cited By

  • (2024) Bi-Directional Gaze-Based Communication: A Review. Multimodal Technologies and Interaction, 8(12), 108. https://doi.org/10.3390/mti8120108
  • (2024) Hands-free Selection in Scroll Lists for AR Devices. Proceedings of Mensch und Computer 2024, 323–330. https://doi.org/10.1145/3670653.3670671
  • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 331–341. https://doi.org/10.1109/VR58804.2024.00055
  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys, 56(5), 1–55. https://doi.org/10.1145/3636458
  • (2023) Headar. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 7(3), 1–28. https://doi.org/10.1145/3610900
  • (2023) Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3544548.3581201
  • (2023) TicTacToes: Assessing Toe Movements as an Input Modality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3544548.3580954
  • (2023) Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–15. https://doi.org/10.1145/3544548.3580685
  • (2023) Long-Distance Gesture Recognition Using Dynamic Neural Networks. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1307–1312. https://doi.org/10.1109/IROS55552.2023.10342147
  • (2022) StretchAR. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(3), 1–26. https://doi.org/10.1145/3550305
