
Hands-free interaction with a computer and other technologies

Long Paper
Universal Access in the Information Society

Abstract

Hands-free interaction with technology is a dream for anyone with limited use of their arms and hands. This paper describes two new original low-cost hands-free computer peripherals, I4Control® and Magic Key, which use movements of the user's eye or nose to drive the computer cursor. Both systems emulate the PC mouse and thereby provide direct access to any mouse-controlled computer application. The functionality of the presented systems is compared to that of the PC mouse using one of the usability tests recommended by the ISO 9241 methodology. The data obtained from testing a group of ten unimpaired novice users indicate that performance improves as users gain experience with the systems, although the improvement is rather slow. The paper also describes several easy-to-use toy applications intended to build the user's confidence in working with these devices; one of them demonstrates that I4Control® can also be employed to control home appliances or a wheelchair.
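Neither the abstract nor this page specifies how the tracked eye or nose displacement is turned into cursor commands, so the following is only a minimal illustrative sketch in Python, not the actual I4Control® or Magic Key implementation. It assumes a joystick-style relative mapping: offsets from a calibrated rest position move the cursor proportionally, a dead zone suppresses fixation jitter, and a dwell timer stands in for the mouse click. All class names, thresholds, and the dwell-click mechanism are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class HandsFreeCursorEmulator:
    """Illustrative joystick-style mouse emulation driven by a tracked
    feature (eye or nose) offset; not the actual I4Control/Magic Key code."""
    gain: float = 8.0        # cursor pixels per unit of normalized offset
    dead_zone: float = 0.15  # offsets below this are treated as "still"
    dwell_time: float = 1.0  # seconds of stillness that trigger a click
    _still_since: Optional[float] = field(default=None, init=False, repr=False)

    def step(self, dx: float, dy: float, now: float):
        """dx, dy: normalized feature offset in [-1, 1]; now: timestamp in s.
        Returns (cursor_dx, cursor_dy, click)."""
        moving = abs(dx) > self.dead_zone or abs(dy) > self.dead_zone
        click = False
        if moving:
            self._still_since = None            # movement cancels any dwell
            cursor_dx, cursor_dy = self.gain * dx, self.gain * dy
        else:
            cursor_dx = cursor_dy = 0.0
            if self._still_since is None:
                self._still_since = now         # dwell starts
            elif now - self._still_since >= self.dwell_time:
                click = True                    # dwell completed: emit a click
                self._still_since = None        # re-arm for the next dwell
        return cursor_dx, cursor_dy, click


if __name__ == "__main__":
    emu = HandsFreeCursorEmulator()
    # Synthetic trace: look to the right for 0.5 s, then hold still.
    trace = [(0.6, 0.0, 0.1 * t) for t in range(5)] + \
            [(0.0, 0.0, 0.5 + 0.1 * t) for t in range(13)]
    for dx, dy, now in trace:
        mx, my, click = emu.step(dx, dy, now)
        if click:
            print(f"t={now:.1f}s: dwell click")
        elif mx or my:
            print(f"t={now:.1f}s: move cursor by ({mx:+.1f}, {my:+.1f}) px")
```

In a real deployment the per-frame offsets would come from the device's tracker and the returned deltas would be injected through the operating system's mouse interface; the synthetic trace above merely keeps the sketch self-contained and runnable.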




References

  1. Bates, R., Istance, H., Oosthuizen, L., Majaranta, P.: Survey of De-Facto standards in eye tracking, project COGAIN. http://www.cogain.org/results/reports/COGAIN-D2.1.pdf (2005)

  2. Betke, M., et al.: The Camera Mouse: visual tracking of body features to provide computer access for people with severe disabilities. IEEE Trans. Neural Syst. Rehabil. Eng. 10(1), 1–10 (2002)


  3. Cavendish Laboratory, Cambridge: the Dasher Project. http://www.inference.phy.cam.ac.uk/dasher/. Accessed 25 Jan 2007

  4. COGAIN brochure. http://www.cogain.org/downloads/. Accessed 11 April 2006

  5. Cook, D., Das, S.: Smart Environments: Technology, Protocols and Applications. Wiley-Interscience, New York (2004)


  6. Donegan, M., Oosthuizen, L.: The ‘KEE’ concept for eye-control and complex disabilities: knowledge-based, end-user-focused and evolutionary. In: Proceedings of COGAIN 2006—Gazing into the Future, pp. 83–89. Politecnico di Torino, Torino (2006)

  7. Fejtová, M., Fejt, J.: System I4Control: the eye as a new computer periphery. In: The 3rd European Medical and Biological Engineering Conference—EMBEC′05 (CD-ROM), vol. 11. Společnost biomedicínského inženýrství a lékařské informatiky ČLS JEP, Praha (2005). ISSN 1727-1983

  8. Figueiredo, L.F., Gomes, A.I., Raimundo, J.B.: Magic Key. In: Proceedings of COGAIN 2006—Gazing into the Future, pp. 104–107. Politecnico di Torino, Torino (2006)

  9. Gonzalez, R., Woods, R.: Digital Image Processing. Prentice Hall, New Jersey (2001)


  10. Gorodnichy, D.O.: Towards automatic retrieval of blink-based lexicon for persons suffered from brain–stem injury using video cameras. In: CD-ROM Proceedings of the First IEEE CVPR Workshop on Face Processing in Video (FPIV’04), Washington, DC (2004)

  11. Gorodnichy, D.O., Roth, G.: Nouse ‘Use your nose as a mouse’ perceptual vision technology for hands-free games and interfaces. Image Vis. Comput. 22(12), 931–942 (2004)


  12. Hansen, J.P., Tørning, K., Johansen, S., Itoh, K.: Gaze typing compared with input by head and hand. In: Proceedings of the 2004 Symposium on Eye Tracking Research and Applications, San Antonio, Texas (2004)

  13. Haro, A., Flickner, M., Essa, I.: Detecting and tracking eyes by using their physiological properties, dynamics, and appearance. In: IEEE CVPR 2000, pp. 163–168 (2000)

  14. Hodač, J.: SUDOKU—implementation of an interactive environment. Bc. Thesis, FEE, Czech Technical University (2006)

  15. http://www.cogain.org/eyetrackers/. Accessed 30 July 2008

  16. http://www.EyeCan.ca

  17. http://www.i4control.eu/

  18. http://www.magickey.ipg.pt

  19. http://www.nrc-cnrc.gc.ca/eng/education/innovations/spotlight/nouse.html

  20. http://www.spectronicsinoz.com/product.asp?product=18455

  21. Hussain, Z.: Digital Image Processing: Practical Applications of Parallel Processing Techniques. Ellis Horwood, West Sussex (1991)


  22. Intel Corporation: IA-32 Intel Architecture Software Developer’s Manual. http://developer.intel.ru/design/pentium4/manuals/245470.htm. Accessed 2006

  23. ISO 9241-11: Ergonomic requirements for office work with visual display terminals (VDTs)—Part 11. Guidance on usability. The European Standard (1998)

  24. ISO 9241-9: Ergonomic requirements for office work with visual display terminals (VDTs)—Part 9. Requirements for non-keyboard input devices. The European Standard (2000)

  25. Istance, H., Hyrskykari, A., Koskinen, D., Bates, R.: Gaze-based attentive user interfaces (AUIs) to support disabled users: towards a research agenda. In: Proceedings of COGAIN 2006—Gazing into the Future, pp. 56–62. Politecnico di Torino, Torino (2006)

  26. Kim, D.H., Kim, J.H., Yoo, D.H., Lee, Y.J., Chung, M.J.: A human–robot interface using eye-gaze tracking system for people with motor disabilities. Trans. Control Autom. Syst. Eng. 3(4), 229–235 (2001)


  27. Kšára, M.: SW tool for writing messages using I4Control. Bc. Thesis, FEE, Czech Technical University (2006)

  28. MacKenzie, I.S., Kauppinen, T., Silfverberg, M.: Accuracy measures for evaluating computer pointing devices. In: Proceedings of ACM CHI 2001, pp. 9–16. ACM, New York (2001)

  29. Majaranta, P., Räihä, K.-J.: Twenty years of eye typing: systems and design issues. In: Proceedings of ETRA ‘02, pp. 15–22. ACM Press, Cambridge (2002). doi:10.1145/507072.507076

  30. Muller, G.R., Pfurtscheller, J., Gerner, H.J., Rupp, R.: Thought—control of functional electrical stimulation to restore hand grasp in a patient with tetraplegia. Neurosci. Lett. 351, 33–36 (2003)


  31. Nonaka, H.: Communication interface with eye-gaze and head gesture using successive DP matching and fuzzy inference. J. Intell. Inf. Syst. 21(2), 105–112 (2003)


  32. Nussbaum, G., Miesenberger, K.: The Assistive Home—More than Just another Approach to Independent Living? Lecture Notes in Computer Science, vol. 3118, pp. 891–897. Springer, Heidelberg (2004)

  33. Randy, A.: Crafting the New Conversational Speech Systems. Elsevier-Morgan Kaufmann Publishing, Amsterdam (2005). ISBN 1-55860-768-4

  34. Tanaka, K., Matsunaga, K., Wang, H.O.: Electroencephalogram-based control of an electric wheelchair. IEEE Trans. Robot. 21(4), 762–766 (2005)



Acknowledgments

The presented research and development has been partially supported by grant MSM 6840770012 of the Czech Ministry of Education, “Transdisciplinary Research in the Area of Biomedical Engineering II”, and by the IST Network of Excellence IST-2003-511598 COGAIN (Communication by Gaze Interaction).


Corresponding author

Correspondence to Olga Štěpánková.



Cite this article

Fejtová, M., Figueiredo, L., Novák, P. et al. Hands-free interaction with a computer and other technologies. Univ Access Inf Soc 8, 277–295 (2009). https://doi.org/10.1007/s10209-009-0147-2

