DOI: 10.1145/2838739.2838804

GazeTry: Swipe Text Typing Using Gaze

Published: 07 December 2015

Abstract

Over the last decades, eye gaze has become an alternative form of text entry for some physically challenged people. Recently, a dwell-free system was proposed that has been shown to be much faster than other existing dwell-free systems; however, it is vulnerable to some common text entry problems. In this paper, we propose GazeTry, a dwell-free gaze-based text entry system that allows people to type a word by gazing sequentially at its letters. Simulation and experimental results show that our proposed dwell-free system, GazeTry with the Moving Window String Matching (MoWing) algorithm, achieves better accuracy and greater resilience to text entry errors.
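
The abstract names the Moving Window String Matching (MoWing) algorithm but does not describe it, so the sketch below is only a hypothetical illustration of the general dwell-free principle stated above: the user gazes over the letters of a word in sequence, and a dictionary word is a candidate if its letters appear in order within the recorded gaze letter sequence, with candidates ranked by corpus word frequency. The function names, toy lexicon, and frequency values are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch, not the paper's MoWing algorithm: a dictionary word is a
    # candidate if its letters occur, in order, among the letters the gaze swept over;
    # candidates are then ranked by corpus frequency.

    def is_ordered_subsequence(word, gazed_letters):
        """True if every letter of `word` occurs in `gazed_letters` in order."""
        it = iter(gazed_letters)
        return all(ch in it for ch in word)

    def rank_candidates(gazed_letters, lexicon):
        """Return matching words, most frequent first (lexicon maps word -> frequency)."""
        matches = [w for w in lexicon if is_ordered_subsequence(w, gazed_letters)]
        return sorted(matches, key=lambda w: -lexicon[w])

    if __name__ == "__main__":
        lexicon = {"hello": 500, "hole": 300, "he": 900, "help": 400}  # toy frequencies
        # Letters the gaze passed over while "typing" hello, including letters
        # crossed in passing between intended keys (g, j, k).
        print(rank_candidates("hgejlklo", lexicon))  # -> ['he', 'hello']

In this toy example the short word "he" outranks the intended "hello"; resolving such ambiguities under noisy gaze input is the kind of problem the abstract's claims about accuracy and error resilience refer to, presumably what MoWing addresses.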



    Information

    Published In

    OzCHI '15: Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction
    December 2015
    691 pages
    ISBN:9781450336734
    DOI:10.1145/2838739


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. Assistive technologies
    2. Dwell-free
    3. Eye-typing

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    OzCHI '15

    Acceptance Rates

    OzCHI '15 paper acceptance rate: 47 of 97 submissions, 48%
    Overall acceptance rate: 362 of 729 submissions, 50%


    Article Metrics

    • Downloads (last 12 months): 30
    • Downloads (last 6 weeks): 3
    Reflects downloads up to 01 Mar 2025


    Cited By

    • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1-19. DOI: 10.1145/3655596. Online publication date: 28-May-2024.
    • (2024) Eye Strokes: An Eye-gaze Drawing System for Mandarin Characters. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 7(2), 1-15. DOI: 10.1145/3654702. Online publication date: 17-May-2024.
    • (2024) SkiMR: Dwell-free Eye Typing in Mixed Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 439-449. DOI: 10.1109/VR58804.2024.00065. Online publication date: 16-Mar-2024.
    • (2022) TapGazer: Text Entry with Finger Tapping and Gaze-directed Word Selection. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3491102.3501838. Online publication date: 29-Apr-2022.
    • (2021) Hummer: Text Entry by Gaze and Hum. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3411764.3445501. Online publication date: 6-May-2021.
    • (2020) Comparison of three dwell-time-based gaze text entry methods. ACM Symposium on Eye Tracking Research and Applications, 1-5. DOI: 10.1145/3379157.3388931. Online publication date: 2-Jun-2020.
    • (2020) Dynamic Bayesian Adjustment of Dwell Time for Faster Eye Typing. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 28(10), 2315-2324. DOI: 10.1109/TNSRE.2020.3016747. Online publication date: Oct-2020.
    • (2019) ReType. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3290605.3300433. Online publication date: 2-May-2019.
    • (2019) CamType. Machine Vision and Applications, 30(3), 407-421. DOI: 10.1007/s00138-018-00997-4. Online publication date: 1-Apr-2019.
    • (2018) Eye-Swipe Typing Using Integration of Dwell-Time and Dwell-Free Method. 2018 15th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), 205-208. DOI: 10.1109/ECTICon.2018.8619868. Online publication date: Jul-2018.
