ABSTRACT
Gaze gestures are a desirable technique for spontaneous and pervasive gaze interaction because they are insensitive to spatial accuracy. Unfortunately, gaze gesture-based object selection using the correlation coefficient is prone to low selection accuracy in the presence of noise. In addition, the effect of the various types of eye movements that occur during gaze gesture-based object selection has not been properly addressed. To overcome these problems, we propose a denoising method for gaze gesture-based object selection using a first-order IIR filter and an event detection method based on the Hidden Markov Model. The experimental results show that the proposed method yielded the best object selection accuracy. The results suggest that spontaneous gaze gesture-based object selection is feasible in the presence of noise and various types of eye movements.
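The two ingredients named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a standard first-order IIR (exponential) smoother, y[n] = a·x[n] + (1−a)·y[n−1], and Pearson-correlation matching between a (here one-dimensional) gaze trajectory and each candidate target trajectory, with an assumed acceptance threshold. In practice the x and y gaze coordinates would each be filtered and correlated separately, and the HMM-based event detection would gate which samples enter the correlation.

```python
import math

def iir_filter(samples, alpha=0.3):
    """First-order IIR smoothing: y[n] = alpha*x[n] + (1-alpha)*y[n-1]."""
    out = []
    y = samples[0]  # initialize the filter state with the first sample
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def select_object(gaze, targets, threshold=0.8):
    """Return the index of the target trajectory that best correlates
    with the (denoised) gaze trajectory, or None if no correlation
    exceeds the threshold."""
    scores = [pearson(gaze, t) for t in targets]
    best = max(range(len(scores)), key=lambda i: scores[i])
    return best if scores[best] >= threshold else None

# Hypothetical usage: noisy gaze following an upward-moving target.
gaze = [i + 0.5 * ((-1) ** i) for i in range(50)]   # trend + jitter
smoothed = iir_filter(gaze, alpha=0.3)
target_up = list(range(50))
target_down = list(range(50, 0, -1))
chosen = select_object(smoothed, [target_up, target_down])
```

With these synthetic trajectories, the smoothed gaze correlates strongly with the upward-moving target and negatively with the downward one, so index 0 is selected; the threshold guards against selecting any object when the user is not following a target.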