ABSTRACT
Sophisticated gaze-tracking devices are reaching the consumer market. This gives an established human-computer interaction technique the chance to be widely applied in software applications. Programmers can benefit from this development: they tend to work on multiple or large screens to interact with diverse tools in parallel. When programmers switch between reading and typing, the keyboard focus might not be where they expect it, and such distractions leave them dissatisfied. Gaze information can help to determine which tool a programmer focuses on.
Our goal is to explore the use of gaze information in programming environments. Specifically, we investigate a case in which a programmer's visual focus and the intended keyboard focus correlate. For specific programming tasks, our work shows that it is beneficial to set the keyboard focus to the programmer's visual focus.
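The core idea of setting the keyboard focus to the tool under the programmer's gaze can be sketched as a simple hit test over tool bounds. The following is a minimal illustrative sketch, not the authors' implementation; all names (`Tool`, `Workspace`, `on_gaze`) are hypothetical, and a real system would additionally smooth noisy gaze samples and debounce focus changes:

```python
# Hypothetical sketch: map a gaze point to the on-screen tool whose
# bounds contain it, then move the keyboard focus to that tool.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Tool:
    """A focusable tool (editor pane, console, ...) with screen bounds."""
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: int, gy: int) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h


@dataclass
class Workspace:
    tools: List[Tool]
    focused: Optional[Tool] = None

    def on_gaze(self, gx: int, gy: int) -> Optional[str]:
        """Set keyboard focus to the tool under the gaze point, if any.

        If the gaze falls between tools or off-screen, the current
        focus is kept unchanged.
        """
        for tool in self.tools:
            if tool.contains(gx, gy):
                self.focused = tool
                return tool.name
        return None


ws = Workspace([Tool("editor", 0, 0, 800, 600),
                Tool("console", 800, 0, 400, 600)])
ws.on_gaze(900, 100)        # gaze lands inside the console pane
print(ws.focused.name)      # prints "console"
```

In practice the gaze samples would come from an eye-tracker SDK and the focus change would call the windowing toolkit's focus API; the hit test and the "keep focus when gaze is ambiguous" fallback are the parts the abstract's correlation argument relies on.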