Gaze'N'Touch: Enhancing Text Selection on Mobile Devices Using Gaze

Published: 25 April 2020

Abstract

Text selection is a frequent task we perform every day to edit, modify, or delete text. Selecting a word requires not only precision but also switching between selection and typing, which influences both speed and error rates. In this paper, we evaluate a novel concept that extends text editing with an additional modality: gaze. We present a user study (N=16) in which we explore how the novel concept, called GazeButton, can improve text selection by comparing it to touch-based selection. In addition, we tested the effect of text size on the selection techniques by comparing two different text sizes. Results show that gaze-based selection was faster with the bigger text size, although the difference was not statistically significant. Qualitative feedback shows a preference for gaze over touch, which motivates a new direction for gaze usage in text editors.




    Published In

    CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems
    April 2020
    4474 pages
    ISBN:9781450368193
    DOI:10.1145/3334480
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. gaze and touch
    2. gaze selection
    3. interaction
    4. text editing

    Qualifiers

    • Abstract

    Conference

    CHI '20

    Acceptance Rates

    Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

    Article Metrics

    • Downloads (Last 12 months)69
    • Downloads (Last 6 weeks)11
    Reflects downloads up to 06 Jan 2025

    Cited By

    • (2024) SwivelTouch: Boosting Touchscreen Input with 3D Finger Rotation Gesture. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1-30. DOI: 10.1145/3659584. Online publication date: 15-May-2024.
    • (2024) GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 331-341. DOI: 10.1109/VR58804.2024.00055. Online publication date: 16-Mar-2024.
    • (2024) Exploring Controller-based Techniques for Precise and Rapid Text Selection in Virtual Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 244-253. DOI: 10.1109/VR58804.2024.00047. Online publication date: 16-Mar-2024.
    • (2023) 1D-Touch: NLP-Assisted Coarse Text Selection via a Semi-Direct Gesture. Proceedings of the ACM on Human-Computer Interaction 7(ISS), 463-482. DOI: 10.1145/3626483. Online publication date: 1-Nov-2023.
    • (2023) Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3588409. Online publication date: 30-May-2023.
    • (2023) BIGaze: An eye-gaze action-guided Bayesian information gain framework for information exploration. Advanced Engineering Informatics 58, 102159. DOI: 10.1016/j.aei.2023.102159. Online publication date: Oct-2023.
    • (2023) Arrow2edit: A Technique for Editing Text on Smartphones. Human-Computer Interaction, 416-432. DOI: 10.1007/978-3-031-35596-7_27. Online publication date: 23-Jul-2023.
    • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6(10), 94. DOI: 10.3390/mti6100094. Online publication date: 19-Oct-2022.
    • (2022) A Practical Method to Eye-tracking on the Phone: Toolkit, Accuracy and Precision. Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia, 182-188. DOI: 10.1145/3568444.3568463. Online publication date: 27-Nov-2022.
    • (2022) One-handed Input for Mobile Devices via Motion Matching and Orbits Controls. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1-24. DOI: 10.1145/3534624. Online publication date: 7-Jul-2022.
