3-Steps Keyboard: Reduced Interaction Interface for Touchless Typing with Head Movements

  • Conference paper
  • First Online:
Proceedings of the 10th International Conference on Computer Recognition Systems CORES 2017 (CORES 2017)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 578)

Abstract

This paper introduces a novel technique for touchless typing with head movements that allows any alphabet character to be reached in only three steps. Head movements are frequently used for human-computer interaction by users with motor impairments who are unable to operate standard computer input devices. In such interfaces, typing is particularly difficult: many directional head movements are required to reach successive characters on an on-screen keyboard, and an additional mechanism (such as an eye blink or mouth opening) supplements the selection process. In this paper, a reduced interaction keyboard for touchless typing with head movements is proposed. The solution is based on the recognition of head movements in four main directions.
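
The abstract does not give the actual character layout, but the arithmetic behind the three-step claim is easy to illustrate: with four distinguishable head-movement directions per step, three steps address 4^3 = 64 positions, more than enough for the 26 Latin letters plus space and basic punctuation. The following minimal Python sketch builds such a mapping; the direction names and character grouping are entirely hypothetical and are not taken from the paper.

# Illustrative sketch only: the paper's actual keyboard layout is not
# described in this abstract, so the grouping below is a hypothetical
# example of why three steps over four directions suffice.
DIRECTIONS = ("up", "down", "left", "right")

# Hypothetical character set to be made reachable in three steps:
# 26 letters, space and three punctuation marks (30 <= 4**3 = 64).
CHARACTERS = list("abcdefghijklmnopqrstuvwxyz") + [" ", ".", ",", "?"]

def build_selection_tree(chars, directions=DIRECTIONS, depth=3):
    """Map every sequence of `depth` directions to at most one character."""
    capacity = len(directions) ** depth
    if len(chars) > capacity:
        raise ValueError(f"{len(chars)} characters exceed capacity {capacity}")
    tree = {}
    for index, char in enumerate(chars):
        # Encode the character's index in base len(directions),
        # most significant movement first.
        path, remainder = [], index
        for _ in range(depth):
            path.append(directions[remainder % len(directions)])
            remainder //= len(directions)
        tree[tuple(reversed(path))] = char
    return tree

if __name__ == "__main__":
    tree = build_selection_tree(CHARACTERS)
    # Three recognized head movements select exactly one character.
    print(tree[("up", "up", "up")])    # 'a' in this hypothetical layout
    print(tree[("up", "up", "down")])  # 'b'

In this hypothetical layout every character is reached by exactly three movements; the real keyboard described in the full paper may group and order characters differently.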

Author information

Correspondence to Adam Nowosielski.

Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Nowosielski, A. (2018). 3-Steps Keyboard: Reduced Interaction Interface for Touchless Typing with Head Movements. In: Kurzynski, M., Wozniak, M., Burduk, R. (eds) Proceedings of the 10th International Conference on Computer Recognition Systems CORES 2017. CORES 2017. Advances in Intelligent Systems and Computing, vol 578. Springer, Cham. https://doi.org/10.1007/978-3-319-59162-9_24

  • DOI: https://doi.org/10.1007/978-3-319-59162-9_24

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59161-2

  • Online ISBN: 978-3-319-59162-9

  • eBook Packages: Engineering, Engineering (R0)
