
A Study of an Intention Communication Assisting System Using Eye Movement

  • Conference paper
  • First Online:
Computers Helping People with Special Needs (ICCHP 2016)

Part of the book series: Lecture Notes in Computer Science (volume 9759)

Abstract

In this paper, we propose a new intention communication assisting system that uses eye movement. The proposed method solves the problems associated with the conventional eye-gaze input method. Hands-free input methods that use eye behavior, including blinking and line of sight, have been used to assist the intention communication of people with severe physical disabilities. In particular, line-of-sight input devices based on eye gaze have been used extensively because of their intuitive operation, and they can be used by almost any patient, except those with weak eye movement. However, the eye-gaze method has disadvantages: a certain amount of input time is required to determine an eye-gaze input, and information for fixation must be presented when performing input. To solve these problems, we propose a new line-of-sight input method, the eye glance input method. Input in four directions can be performed by detecting reciprocating eye movements (eye glances) in oblique directions. The proposed method makes it possible to perform rapid environmental control with simple measurements. In addition, we developed an evaluation system based on the proposed method using electrooculography (EOG) and experimentally evaluated the input accuracy with 10 subjects. An average accuracy of approximately 84.82 % was obtained, which confirms the effectiveness of the proposed method. We also examined the application of the proposed method to actual intention communication assisting systems.
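To make the idea of eye glance detection concrete, the following is a minimal sketch (not the authors' implementation) of how a reciprocating oblique eye movement might be classified into one of four directions from two-channel EOG. The channel names, threshold values, and the function classify_eye_glance are illustrative assumptions only; the paper's actual signal processing is not described in the abstract.

```python
"""Minimal sketch of eye-glance classification from two-channel EOG.

Assumptions (not from the paper): the horizontal (eog_h) and vertical
(eog_v) channels are baseline-corrected and cover one candidate input
window; a glance appears as a short excursion away from center followed
by a return. Thresholds and the synthetic test signal are illustrative.
"""

import numpy as np

AMPLITUDE_THRESHOLD = 0.5   # hypothetical normalized EOG amplitude
RETURN_TOLERANCE = 0.2      # signal must come back near baseline


def classify_eye_glance(eog_h: np.ndarray, eog_v: np.ndarray) -> str | None:
    """Classify a windowed two-channel EOG segment into one of four
    oblique glance directions, or return None if no glance is detected."""
    # Sample of largest combined excursion from the center position.
    magnitude = np.hypot(eog_h, eog_v)
    peak = int(np.argmax(magnitude))

    # Require a clear excursion and a return toward baseline afterwards
    # (the "reciprocating" movement described in the abstract).
    if magnitude[peak] < AMPLITUDE_THRESHOLD:
        return None
    if magnitude[-1] > RETURN_TOLERANCE * magnitude[peak]:
        return None

    # The sign of each channel at the peak selects the oblique quadrant.
    horizontal = "right" if eog_h[peak] > 0 else "left"
    vertical = "up" if eog_v[peak] > 0 else "down"
    return f"{vertical}-{horizontal}"   # e.g. "up-right"


if __name__ == "__main__":
    t = np.linspace(0, 1, 200)
    pulse = np.exp(-((t - 0.5) ** 2) / 0.005)  # synthetic out-and-back glance
    print(classify_eye_glance(0.8 * pulse, 0.7 * pulse))  # -> "up-right"
```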



Author information

Correspondence to Shogo Matsuno.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Matsuno, S., Ito, Y., Itakura, N., Mizuno, T., Mito, K. (2016). A Study of an Intention Communication Assisting System Using Eye Movement. In: Miesenberger, K., Bühler, C., Penaz, P. (eds) Computers Helping People with Special Needs. ICCHP 2016. Lecture Notes in Computer Science, vol. 9759. Springer, Cham. https://doi.org/10.1007/978-3-319-41267-2_69


  • DOI: https://doi.org/10.1007/978-3-319-41267-2_69

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-41266-5

  • Online ISBN: 978-3-319-41267-2

  • eBook Packages: Computer Science, Computer Science (R0)
