DOI: 10.1145/3448696.3448710
Short paper

Towards the Use of Eye Gaze Tracking Technology: Human Computer Interaction (HCI) Research

Published: 08 July 2021

Abstract

With a growing number of digital devices around us, and the increasing amount of time we spend interacting with them, there is strong interest in finding new interaction methods that ease the use of digital devices or increase interaction efficiency. Eye tracking seems to be a promising technology for achieving this goal. This study follows two different approaches to utilizing eye tracking for computer input. The first approach investigates eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people's ability to perform gestures with the eyes for computer input, and the separation of such gaze gestures from natural eye movements. The findings from this study uncover the prospects of developing a usability tool for recording interaction and gaze activity. Both approaches present results based on user studies conducted with prototypes developed for the purpose. The methods and approach used in this study are mostly experimental, which is uncommon in computer science but appropriate in the field of Human-Computer Interaction (HCI). An iterative concept helps to get systematically closer to new gaze-based interaction methods: starting from a problem to solve, the first step is to generate ideas for solving it; the resulting task is then to realize the idea as a prototype.
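
As an illustration of the first approach, the minimal Python sketch below shows one way gaze pointing could be gated by a touch sensor: the cursor is moved to the (smoothed) gaze position only while the touch surface on the mouse is held, so the eyes do the coarse pointing and the hand signals intent. This is an assumed, simplified example rather than the paper's actual implementation; the GazeSample type and the gaze, touch, and cursor callbacks are hypothetical stand-ins for real device APIs.

    # Minimal sketch (not the paper's code): gaze pointing gated by a touch sensor.
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float      # horizontal gaze position in screen pixels
        y: float      # vertical gaze position in screen pixels
        valid: bool   # False when the tracker lost the eyes (e.g. during a blink)

    def gaze_touch_pointer(gaze_samples, touch_is_down, move_cursor, smoothing=0.3):
        """Warp the cursor to the smoothed gaze point while the touch sensor is held.

        gaze_samples: iterable of GazeSample, e.g. a ~60 Hz stream from an eye tracker
        touch_is_down: zero-argument callable, True while the mouse surface is touched
        move_cursor: callable (x, y) that positions the system cursor
        smoothing: exponential smoothing factor that damps fixation jitter
        """
        sx = sy = None
        for s in gaze_samples:
            if not s.valid:
                continue  # skip samples where tracking was lost
            # Exponentially smooth the raw gaze signal to reduce micro-saccade jitter.
            if sx is None:
                sx, sy = s.x, s.y
            else:
                sx = (1 - smoothing) * sx + smoothing * s.x
                sy = (1 - smoothing) * sy + smoothing * s.y
            # Act on gaze only while the touch sensor is pressed, which avoids the
            # "Midas touch" problem of pure gaze pointing.
            if touch_is_down():
                move_cursor(sx, sy)

Gating pointer movement on an explicit touch signal is one simple way to keep the eyes free for looking around without every glance moving the cursor; the prototypes in the study may combine the two signals differently.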

Cited By

  • (2024) ClearDepth: Addressing Depth Distortions Caused By Eyelashes For Accurate Geometric Gaze Estimation On Mobile Devices. 2024 IEEE International Conference on Image Processing (ICIP), 2135-2141. DOI: 10.1109/ICIP51287.2024.10647998
  • (2024) Beyond Average: Individualized Visual Scanpath Prediction. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 25420-25431. DOI: 10.1109/CVPR52733.2024.02402
  • (2024) Do users desire gestures for in-vehicle interaction? Towards the subjective assessment of gestures in a high-fidelity driving simulator. Computers in Human Behavior, 156:C. DOI: 10.1016/j.chb.2024.108189
  • (2023) Eye Tracking, Usability, and User Experience: A Systematic Review. International Journal of Human–Computer Interaction, 40(17), 4484-4500. DOI: 10.1080/10447318.2023.2221600

      Published In

      AfriCHI '21: Proceedings of the 3rd African Human-Computer Interaction Conference: Inclusiveness and Empowerment
      March 2021
      182 pages
      ISBN: 9781450388696
      DOI: 10.1145/3448696

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. Communication
      2. Eye Tracking
      3. Eye gaze
      4. Gaze Gesture
      5. Human-computer interaction

      Qualifiers

      • Short-paper
      • Research
      • Refereed limited

      Conference

      AfriCHI 2021
