ABSTRACT
Gaze typing, a gaze-assisted text entry method, allows individuals with motor impairments (of the arms or spine) to enter text on a computer using a virtual keyboard and their gaze. Though gaze typing is widely accepted, the method is limited by low typing speed, a high error rate, and the resulting visual fatigue, because key selection relies on dwell time. In this research, we present a gaze-assisted, wearable-supplemented, foot interaction framework for dwell-free gaze typing. The framework consists of a custom-built virtual keyboard, an eye tracker, and a wearable device attached to the user's foot. To enter a character, the user looks at the character and selects it by pressing a pressure pad on the wearable device with the foot. Results from a preliminary user study involving two participants with motor impairments show that the participants achieved a mean gaze typing speed of 6.23 Words Per Minute (WPM). In addition, the mean value of Key Strokes Per Character (KPSC) was 1.07 (ideal 1.0), and the mean value of Rate of Backspace Activation (RBA) was 0.07 (ideal 0.0). Furthermore, we present our findings from multiple usability studies and design iterations, through which we created appropriate affordances and experience design for our gaze typing system.
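The dwell-free interaction described above — gaze to point at a key, a foot-operated pressure pad to commit it — can be sketched in a few lines. The keyboard layout, hit-testing, and metric definitions below are illustrative assumptions (the standard MacKenzie-style formulas are used for WPM and keystrokes per character), not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class Key:
    char: str
    x: float  # top-left corner of the key, in pixels
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def key_under_gaze(layout, gx, gy):
    """Return the key the current gaze point falls on, or None."""
    for key in layout:
        if key.contains(gx, gy):
            return key
    return None

def on_foot_press(layout, gaze_point, typed):
    """Dwell-free selection: the gazed-at key is committed only when the
    foot-operated pressure pad fires -- no dwell timer is involved."""
    key = key_under_gaze(layout, *gaze_point)
    if key is not None:
        typed.append(key.char)
    return typed

# Text-entry metrics, assuming the conventional definitions:
def wpm(transcribed: str, seconds: float) -> float:
    # Words per minute: (|T| - 1) / seconds * 60 / 5 (a "word" is 5 chars).
    return (len(transcribed) - 1) / seconds * 60.0 / 5.0

def kpsc(keystrokes: int, transcribed: str) -> float:
    # Keystrokes per character; ideal value 1.0 (no corrections).
    return keystrokes / len(transcribed)

def rba(backspaces: int, keystrokes: int) -> float:
    # Rate of backspace activation; ideal value 0.0. One plausible
    # definition: fraction of keystrokes that were backspaces.
    return backspaces / keystrokes
```

For example, with a two-key layout, a foot press while gazing inside the "a" key appends `'a'`; transcribing an 11-character phrase in 60 seconds gives a WPM of 2.0 under the formula above.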
Index Terms
- Gaze Typing Through Foot-Operated Wearable Device