Research article
DOI: 10.1145/2578153.2578157

Look and lean: accurate head-assisted eye pointing

Published: 26 March 2014

Abstract

Compared to the mouse, eye pointing is inaccurate. As a consequence, small objects are difficult to point at by gaze alone. We suggest using a combination of eye pointing and subtle head movements to achieve accurate hands-free pointing in a conventional desktop computing environment. To track the head movements, we exploited information about the eye's position in the eye tracker's camera view. We conducted a series of three experiments to study the potential caveats and benefits of using head movements to adjust the gaze cursor position. The results showed that head-assisted eye pointing significantly improves pointing accuracy without a negative impact on pointing time. In some cases participants were able to point almost three times closer to the target's center than with eye pointing alone (7 vs. 19 pixels). We conclude that head-assisted eye pointing is a comfortable and potentially very efficient alternative to other assistive techniques for eye pointing, such as zooming.
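The mechanism the abstract describes maps naturally onto a two-phase control loop: the gaze estimate supplies a coarse cursor position, and the displacement of the eye's position in the tracker's camera image, which shifts when the head moves, drives a fine correction. The Python sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the class name, the gain that maps camera-space eye displacement to screen pixels, and the smoothing factor are all invented for illustration.

```python
class HeadAssistedPointer:
    """Minimal sketch of head-assisted eye pointing (hypothetical, not the
    paper's implementation). Assumes a tracker that reports (a) the on-screen
    gaze point and (b) the eye's pixel position in the tracker's camera image,
    which shifts when the head moves."""

    GAIN = 4.0    # screen pixels per camera pixel of eye displacement (assumed value)
    ALPHA = 0.3   # exponential-smoothing factor to suppress cursor jitter (assumed value)

    def __init__(self):
        self.anchor_cursor = None  # coarse gaze position at fixation onset
        self.anchor_eye = None     # eye-in-camera position at fixation onset
        self.cursor = None         # current (smoothed) cursor position

    def on_fixation_start(self, gaze_xy, eye_xy):
        # Coarse phase: park the cursor at the (possibly inaccurate) gaze
        # estimate and remember where the eye sits in the camera view.
        self.anchor_cursor = gaze_xy
        self.anchor_eye = eye_xy
        self.cursor = gaze_xy

    def on_frame(self, eye_xy):
        # Fine phase: while fixation is held, subtle head movements shift the
        # eye in the camera image; scale that shift into a cursor offset
        # relative to the coarse anchor instead of re-estimating from gaze.
        dx = eye_xy[0] - self.anchor_eye[0]
        dy = eye_xy[1] - self.anchor_eye[1]
        target = (self.anchor_cursor[0] + self.GAIN * dx,
                  self.anchor_cursor[1] + self.GAIN * dy)
        # Smooth toward the corrected position to keep the cursor stable.
        self.cursor = (self.cursor[0] + self.ALPHA * (target[0] - self.cursor[0]),
                       self.cursor[1] + self.ALPHA * (target[1] - self.cursor[1]))
        return self.cursor


# Example: gaze lands some pixels off a small target; a slight head lean
# nudges the cursor the remaining distance.
pointer = HeadAssistedPointer()
pointer.on_fixation_start(gaze_xy=(640.0, 400.0), eye_xy=(310.0, 242.0))
print(pointer.on_frame(eye_xy=(312.0, 242.5)))  # head moved slightly right and down
```

Keeping the coarse anchor fixed during the fine phase avoids drift from re-sampling the noisy gaze signal; in practice the gain and smoothing constants would need to be tuned per user and tracker.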





Published In

ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2014
394 pages
ISBN: 9781450327510
DOI: 10.1145/2578153


Publisher

Association for Computing Machinery

New York, NY, United States




Author Tags

  1. eye tracking
  2. gaze input
  3. head movements
  4. pointing

Qualifiers

  • Research-article

Conference

ETRA '14: Eye Tracking Research and Applications
March 26-28, 2014
Safety Harbor, Florida

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%



Article Metrics

  • Downloads (last 12 months): 32
  • Downloads (last 6 weeks): 1
Reflects downloads up to 1 March 2025.

Cited By

  • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8, ETRA, 1-20. DOI: 10.1145/3655601. Online: 28 May 2024.
  • (2024) Towards an Eye-Brain-Computer Interface: Combining Gaze with the Stimulus-Preceding Negativity for Target Selections in XR. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3641925. Online: 11 May 2024.
  • (2024) Exploring the Effectiveness of Assistive Technology: A Preliminary Case Study Using Makey Makey, Tobii Eye Tracker, and Leap Motion. Extended Reality, 32-42. DOI: 10.1007/978-3-031-71704-8_3. Online: 18 September 2024.
  • (2023) Vision-Based Interfaces for Character-Based Text Entry. Advances in Human-Computer Interaction 2023. DOI: 10.1155/2023/8855764. Online: 1 January 2023.
  • (2023) Predicting Gaze-based Target Selection in Augmented Reality Headsets based on Eye and Head Endpoint Distributions. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3544548.3581042. Online: 19 April 2023.
  • (2023) Usability and acceptability of virtual reality for chronic pain management among diverse patients in a safety-net setting: a qualitative analysis. JAMIA Open 6, 3. DOI: 10.1093/jamiaopen/ooad050. Online: 11 July 2023.
  • (2023) Eye Tracking in Virtual Reality: a Broad Review of Applications and Challenges. Virtual Reality 27, 2, 1481-1505. DOI: 10.1007/s10055-022-00738-z. Online: 18 January 2023.
  • (2022) RETRACTED ARTICLE: Eye tracking: empirical foundations for a minimal reporting guideline. Behavior Research Methods 55, 1, 364-416. DOI: 10.3758/s13428-021-01762-8. Online: 6 April 2022.
  • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6, 10, 94. DOI: 10.3390/mti6100094. Online: 19 October 2022.
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832-842. DOI: 10.1109/VR51125.2022.00105. Online: March 2022.
