DOI: 10.1145/3379156.3391363
short-paper

Spontaneous Gaze Gesture Interaction in the Presence of Noises and Various Types of Eye Movements

Published: 02 June 2020

ABSTRACT

Gaze gesture is a desirable technique for spontaneous and pervasive gaze interaction because it is insensitive to spatial accuracy. Unfortunately, gaze gesture-based object selection that relies on the correlation coefficient suffers from low selection accuracy in the presence of noise. In addition, the effect of the various types of eye movements that occur during gaze gesture-based object selection has not been addressed properly. To overcome these problems, we propose a denoising method for gaze gesture-based object selection using a first-order IIR filter and an event detection method based on the Hidden Markov Model. The experimental results show that the proposed method yielded the best object selection accuracy of . The result suggests that spontaneous gaze gesture-based object selection is feasible in the presence of noise and various types of eye movements.
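As a rough illustration of the two signal-processing ingredients named in the abstract, the Python sketch below applies a first-order IIR (exponentially weighted) filter to the gaze samples and then ranks candidate objects by the Pearson correlation between the smoothed gaze and each object's trajectory. This is a minimal sketch, not the paper's implementation: the smoothing factor `alpha`, the choice of combining per-axis correlations with `min()`, and the `targets` data layout are assumptions, and the HMM-based event detection step described in the paper is omitted.

```python
import numpy as np

def iir_smooth(signal, alpha=0.3):
    """First-order IIR (exponential) smoothing: y[n] = a*x[n] + (1 - a)*y[n-1]."""
    x = np.asarray(signal, dtype=float)
    y = np.empty_like(x)
    y[0] = x[0]
    for n in range(1, len(x)):
        y[n] = alpha * x[n] + (1.0 - alpha) * y[n - 1]
    return y

def select_object(gaze_x, gaze_y, targets, alpha=0.3):
    """Return the target whose trajectory correlates best with the smoothed gaze.

    `targets` maps an object name to its (x, y) trajectory, sampled at the
    same rate and over the same window as the gaze signal (assumed layout).
    """
    gx, gy = iir_smooth(gaze_x, alpha), iir_smooth(gaze_y, alpha)
    scores = {}
    for name, (tx, ty) in targets.items():
        # Pearson correlation per axis; combining with min() is a common
        # choice in pursuit-style correlation selection (an assumption here).
        rx = np.corrcoef(gx, np.asarray(tx, dtype=float))[0, 1]
        ry = np.corrcoef(gy, np.asarray(ty, dtype=float))[0, 1]
        scores[name] = min(rx, ry)
    return max(scores, key=scores.get), scores
```

In a real interface, as in prior pursuit-based systems, selection would typically be triggered only when the winning object's correlation exceeds a threshold over a sliding window, rather than on every frame.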


  • Published in

    ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
    June 2020
    305 pages
    ISBN: 9781450371346
    DOI: 10.1145/3379156

    Copyright © 2020 ACM


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    • Published: 2 June 2020


    Qualifiers

    • short-paper
    • Research
    • Refereed limited

    Acceptance Rates

    Overall Acceptance Rate: 69 of 137 submissions, 50%

