DOI: 10.1145/2556288.2557040

Gaze gestures and haptic feedback in mobile devices

Published: 26 April 2014

Abstract

Anticipating the emergence of mobile devices with gaze-tracking capability, we are investigating gaze as an input modality for handheld mobile devices. We conducted a study combining gaze gestures with vibrotactile feedback: gaze gestures served as the input method on a mobile device, and vibrotactile feedback as a new, alternative way to confirm interaction events. Our results show that vibrotactile feedback significantly improved the use of gaze gestures: tasks were completed faster, and were rated easier and more comfortable, when vibrotactile feedback was provided.
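The interaction the abstract describes, a stroke-based gaze gesture recognized from successive fixations and confirmed by a short vibration pulse, can be sketched roughly as follows. This is an illustrative assumption, not the paper's implementation: the names (`recognize_gesture`, `vibrate`), the 0.15 jitter threshold, and the 50 ms pulse length are all hypothetical.

```python
# Illustrative sketch (not the paper's code): classify saccade strokes
# between successive fixations, match them against gesture templates,
# and fire a vibrotactile confirmation pulse when a gesture completes.
from typing import Callable, Dict, List, Optional, Sequence, Tuple

Point = Tuple[float, float]  # gaze fixation in normalized screen coords (0..1)

def stroke_direction(a: Point, b: Point) -> str:
    """Dominant direction of the saccade from fixation a to fixation b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # screen y grows downward

def recognize_gesture(
    fixations: Sequence[Point],
    templates: Dict[str, Sequence[str]],
    min_stroke: float = 0.15,  # assumed threshold rejecting gaze jitter
) -> Optional[str]:
    """Return the name of the first template matching the stroke sequence."""
    strokes: List[str] = []
    for a, b in zip(fixations, fixations[1:]):
        if max(abs(b[0] - a[0]), abs(b[1] - a[1])) >= min_stroke:
            strokes.append(stroke_direction(a, b))
    for name, pattern in templates.items():
        if strokes == list(pattern):
            return name
    return None

def on_gaze_input(
    fixations: Sequence[Point],
    templates: Dict[str, Sequence[str]],
    vibrate: Callable[[int], None],  # hypothetical haptic API: pulse length in ms
) -> Optional[str]:
    """Recognize a gesture and confirm it with a vibrotactile pulse."""
    gesture = recognize_gesture(fixations, templates)
    if gesture is not None:
        vibrate(50)  # short confirmation pulse on gesture completion
    return gesture
```

Under this sketch, a "back" gesture defined as a left stroke followed by a down stroke would be recognized from three fixations and confirmed haptically; omitting the `vibrate` call corresponds to the no-feedback condition the study compares against.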

Supplementary Material

MP4 File (p435-sidebyside.mp4)



    Published In

    CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    April 2014
    4206 pages
    ISBN:9781450324731
    DOI:10.1145/2556288


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze interaction
    2. gaze tracking
    3. haptic feedback

    Qualifiers

    • Research-article

    Conference

    CHI '14
    Sponsor:
    CHI '14: CHI Conference on Human Factors in Computing Systems
    April 26 - May 1, 2014
    Toronto, Ontario, Canada

    Acceptance Rates

    CHI '14 paper acceptance rate: 465 of 2,043 submissions (23%)
    Overall acceptance rate: 6,199 of 26,314 submissions (24%)



    Article Metrics

    • Downloads (last 12 months): 21
    • Downloads (last 6 weeks): 1
    Reflects downloads up to 27 Feb 2025


    Cited By

    • (2024) Robust Object Selection in Spontaneous Gaze-Controlled Application Using Exponential Moving Average and Hidden Markov Model. IEEE Transactions on Human-Machine Systems 54(5), 485-498. DOI: 10.1109/THMS.2024.3413781
    • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871
    • (2023) Study on the brightness and graphical display object directions of the Single-Gaze-Gesture user interface. Displays 80, 102537. DOI: 10.1016/j.displa.2023.102537
    • (2022) A One-Point Calibration Design for Hybrid Eye Typing Interface. International Journal of Human-Computer Interaction 39(18), 3620-3633. DOI: 10.1080/10447318.2022.2101186
    • (2022) The traveling purchaser problem with fast service option. Computers and Operations Research 141(C). DOI: 10.1016/j.cor.2022.105700
    • (2022) The Heterogeneous Flexible Periodic Vehicle Routing Problem. Computers and Operations Research 141(C). DOI: 10.1016/j.cor.2021.105662
    • (2021) Exploring Social Acceptability and Users' Preferences of Head- and Eye-Based Interaction with Mobile Devices. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, 12-23. DOI: 10.1145/3490632.3490636
    • (2020) Gestatten: Estimation of User's Attention in Mobile MOOCs From Eye Gaze and Gaze Gesture Tracking. Proceedings of the ACM on Human-Computer Interaction 4(EICS), 1-32. DOI: 10.1145/3394974
    • (2020) Smartphone-Based Remote Monitoring Tool for e-Learning. IEEE Access 8, 121409-121423. DOI: 10.1109/ACCESS.2020.3005330
    • (2020) Effect of feedback and target size on eye gaze accuracy in an off-screen task. Disability and Rehabilitation: Assistive Technology 16(7), 769-779. DOI: 10.1080/17483107.2020.1729874
