DOI: 10.1145/3448018.3458015
Short paper · ETRA Conference Proceedings

EyeTell: Tablet-based Calibration-free Eye-typing using Smooth-pursuit movements

Published: 25 May 2021

Abstract

Gaze-tracking technology, with increasingly robust and lightweight equipment, has tremendous potential applications. To be usable during short interactions, such as on public displays or in hospitals for communicating non-verbally after surgery, a gaze application must be intuitive and must not require calibration. Gaze gestures such as smooth-pursuit eye movements can be detected without calibration. We report the performance of a calibration-free eye-typing application that uses only the front-facing camera of a tablet. In a user study with 29 participants, we obtained an average typing speed of 1.27 WPM after four trials and a maximum typing speed of 1.95 WPM.
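The calibration-free detection the abstract refers to is, in the smooth-pursuit ("Pursuits"-style) family of techniques, typically done by correlating the raw, uncalibrated gaze trajectory with the trajectory of each moving on-screen target: correlation is invariant to the offset and scale of the gaze signal, so no mapping to screen coordinates is needed. The sketch below is illustrative only and is not the authors' implementation; the function names, the per-axis Pearson correlation, and the min-combination of the two axes are assumptions.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def select_target(gaze_x, gaze_y, targets):
    """Pick the moving target whose path best correlates with the gaze path.

    gaze_x, gaze_y: raw (uncalibrated) gaze coordinates per frame.
    targets: dict mapping target name -> (xs, ys) target positions per frame.
    Returns (best_target_name, correlation score).
    """
    best, best_r = None, -2.0
    for name, (tx, ty) in targets.items():
        # Correlate each axis independently; require both to match.
        r = min(pearson(gaze_x, tx), pearson(gaze_y, ty))
        if r > best_r:
            best, best_r = name, r
    return best, best_r
```

In a real interface the correlation would be computed over a sliding window and compared against a threshold before committing a selection; the point of the sketch is only that an affine-transformed (i.e. uncalibrated) gaze signal still correlates perfectly with the target it follows.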


Cited By

  • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1–19. https://doi.org/10.1145/3655596
  • (2024) GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame. In Proceedings of the 30th Annual International Conference on Mobile Computing and Networking, 497–512. https://doi.org/10.1145/3636534.3649376
  • (2024) Leyenes. International Journal of Human-Computer Studies 184(C). https://doi.org/10.1016/j.ijhcs.2023.103204
  • (2023) Enhancing Hybrid Eye Typing Interfaces with Word and Letter Prediction: A Comprehensive Evaluation. International Journal of Human–Computer Interaction 41(1), 161–173. https://doi.org/10.1080/10447318.2023.2297113
  • (2023) Development of a real-time eye movement-based computer interface for communication with improved accuracy for disabled people under natural head movements. Journal of Real-Time Image Processing 20(4). https://doi.org/10.1007/s11554-023-01336-1
  • (2022) SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit. IEEE Transactions on Human-Machine Systems 52(2), 312–323. https://doi.org/10.1109/THMS.2021.3123202
  • (2022) A One-Point Calibration Design for Hybrid Eye Typing Interface. International Journal of Human–Computer Interaction 39(18), 3620–3633. https://doi.org/10.1080/10447318.2022.2101186

Published In

ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications, May 2021, 232 pages
ISBN: 9781450383455
DOI: 10.1145/3448018

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. calibration-free
      2. eye-tracking
      3. mobile
      4. smooth-pursuit
      5. tablet
      6. text entry

      Qualifiers

      • Short-paper
      • Research
      • Refereed limited

Conference

ETRA '21

Acceptance Rates

Overall Acceptance Rate: 69 of 137 submissions, 50%


