DOI: 10.1145/3441852.3476569
poster

GazeMetro: A Gaze-Based Interactive System for Metro Map

Published: 17 October 2021

Abstract

In this paper, we propose GazeMetro, a gaze-based interactive system for metro maps that lets users explore and interact with a metro map using only eye movements, offering a new interaction experience. The objective of GazeMetro is to let viewers search a metro map through gaze-based interactions, without any manual operations. We implement GazeMetro with four gaze-based interaction techniques: Gaze Fisheye, Gaze Scaling and Panning, Gaze Selection, and Gaze Hint. We conducted an experiment to evaluate GazeMetro; the results showed positive ratings of pragmatic quality and especially of hedonic quality.
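The abstract does not describe how these techniques are implemented. As a rough illustration only, the sketch below shows one plausible way to prototype two of them: Gaze Selection, approximated here by dwell-time selection, and Gaze Fisheye, approximated here by a gaze-centred graphical fisheye distortion. Everything in this sketch (the Station class, the DwellSelector, fisheye_magnify, and thresholds such as hit_radius, dwell_s, radius, and strength) is hypothetical and is not taken from the paper.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    x: float  # map coordinates, arbitrary units
    y: float

def fisheye_magnify(px, py, gaze_x, gaze_y, radius=120.0, strength=3.0):
    """Gaze-centred graphical fisheye distortion (Sarkar/Brown-style
    transfer function). Points within `radius` of the gaze position are
    pushed outward so the area under the gaze appears magnified; points
    outside the radius are returned unchanged."""
    dx, dy = px - gaze_x, py - gaze_y
    d = math.hypot(dx, dy)
    if d == 0 or d >= radius:
        return px, py
    t = d / radius                                   # normalised distance in (0, 1)
    t_mag = (strength + 1) * t / (strength * t + 1)  # magnified distance
    scale = (t_mag * radius) / d
    return gaze_x + dx * scale, gaze_y + dy * scale

class DwellSelector:
    """Dwell-time gaze selection: a station counts as selected once the
    gaze stays within `hit_radius` of it for `dwell_s` seconds."""

    def __init__(self, stations, hit_radius=30.0, dwell_s=0.8):
        self.stations = stations
        self.hit_radius = hit_radius
        self.dwell_s = dwell_s
        self._candidate = None   # station currently being dwelled on
        self._since = None       # time the current dwell started

    def update(self, gaze_x, gaze_y, now=None):
        """Feed one gaze sample; return a Station when a dwell completes."""
        now = time.monotonic() if now is None else now
        nearest = min(self.stations,
                      key=lambda s: math.hypot(s.x - gaze_x, s.y - gaze_y))
        dist = math.hypot(nearest.x - gaze_x, nearest.y - gaze_y)
        if dist > self.hit_radius:             # gaze is not on any station
            self._candidate, self._since = None, None
            return None
        if nearest is not self._candidate:     # dwell starts on a new station
            self._candidate, self._since = nearest, now
            return None
        if now - self._since >= self.dwell_s:  # dwell completed
            self._since = now                  # reset to avoid re-triggering
            return nearest
        return None

if __name__ == "__main__":
    stations = [Station("Central", 100, 100), Station("Riverside", 180, 140)]
    selector = DwellSelector(stations)
    # Feed a short stream of synthetic gaze samples at ~60 Hz near "Central".
    for i in range(60):
        hit = selector.update(gaze_x=101, gaze_y=99, now=i / 60)
        if hit:
            print("Selected:", hit.name)
            break
```

A dwell threshold in the region of 0.5 to 1 second is a common way to distinguish deliberate selections from incidental fixations, and a fisheye distortion keeps the full map visible while enlarging the stations near the gaze point; whether GazeMetro uses these particular mechanisms is not stated in the abstract.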

Supplementary Material

MP4 File (1037-file2.mp4)
Supplemental materials


Cited By

  • (2024) Beyond Vision Impairments: Redefining the Scope of Accessible Data Representations. IEEE Transactions on Visualization and Computer Graphics 30(12), 7619–7636. DOI: 10.1109/TVCG.2024.3356566. Online publication date: 1 December 2024.


    Published In

    ASSETS '21: Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility
    October 2021
    730 pages
    ISBN: 9781450383066
    DOI: 10.1145/3441852
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 17 October 2021


    Author Tags

    1. eye tracking
    2. gaze-based interaction
    3. graph visualization
    4. metro map

    Qualifiers

    • Poster
    • Research
    • Refereed limited

    Conference

    ASSETS '21

    Acceptance Rates

    ASSETS '21 Paper Acceptance Rate: 36 of 134 submissions, 27%
    Overall Acceptance Rate: 436 of 1,556 submissions, 28%


