Abstract
In this paper, we propose a new intention communication assisting system that uses eye movement. The proposed method addresses the problems associated with conventional eye-gaze input methods. Hands-free input methods that use the behavior of the eye, including blinking and line of sight, have been used to assist the intention communication of people with severe physical disabilities. In particular, line-of-sight input devices based on eye gaze have been used extensively because of their intuitive operation, and such devices can be used by almost any patient, except those with weak eyesight. However, the eye-gaze method has disadvantages: a certain input time is required to confirm each gaze input, and information for fixation must be presented during input. To solve these problems, we propose a new line-of-sight input method, the eye glance input method. Eye glance input can be performed in four directions by detecting reciprocating eye movements (eye glances) in the oblique directions. The proposed method enables rapid environmental control with simple measurements. In addition, we developed an evaluation system based on the proposed method using electrooculography (EOG), and experimentally evaluated its input accuracy with 10 subjects. An average accuracy of approximately 84.82% was obtained, which confirms the effectiveness of the proposed method. We also examined the application of the proposed method to practical intention communication assisting systems.
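The core idea of the abstract, classifying a reciprocating oblique eye movement into one of four directions from two-channel EOG signals, can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the threshold value, and the assumption that a glance appears as a brief deflection whose peak sign pair on the horizontal and vertical channels identifies the oblique direction are all hypothetical simplifications for illustration.

```python
# Hypothetical sketch (not the paper's implementation): classify a single
# "eye glance" -- a quick reciprocating eye movement along an oblique axis --
# into one of four directions from two-channel EOG traces.
# Assumptions: `h` and `v` are baseline-corrected horizontal/vertical EOG
# samples (microvolts); a glance shows a short deflection followed by a
# return, so the sign pair at the peak identifies the oblique direction.

def classify_glance(h, v, threshold=50.0):
    """Return 'upper-right', 'upper-left', 'lower-right', 'lower-left',
    or None if the deflection does not exceed the threshold on both axes."""
    # Locate the sample with the largest combined deflection.
    peak_idx = max(range(len(h)), key=lambda i: abs(h[i]) + abs(v[i]))
    if abs(h[peak_idx]) < threshold or abs(v[peak_idx]) < threshold:
        return None  # movement too small, or not oblique
    horiz = 'right' if h[peak_idx] > 0 else 'left'
    vert = 'upper' if v[peak_idx] > 0 else 'lower'
    return f'{vert}-{horiz}'

# Example: a brief up-and-right deflection that returns to baseline.
h = [0, 30, 80, 90, 40, 5, 0]
v = [0, 25, 70, 85, 35, 0, 0]
print(classify_glance(h, v))  # -> 'upper-right'
```

A real system would also need to verify the return phase of the reciprocating movement and reject drift and blink artifacts, which this sketch omits.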
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Matsuno, S., Ito, Y., Itakura, N., Mizuno, T., Mito, K. (2016). A Study of an Intention Communication Assisting System Using Eye Movement. In: Miesenberger, K., Bühler, C., Penaz, P. (eds) Computers Helping People with Special Needs. ICCHP 2016. Lecture Notes in Computer Science(), vol 9759. Springer, Cham. https://doi.org/10.1007/978-3-319-41267-2_69
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-41266-5
Online ISBN: 978-3-319-41267-2
eBook Packages: Computer Science (R0)