
Design recommendations for camera-based head-controlled interfaces that replace the mouse for motion-impaired users

  • Long paper
  • Published in: Universal Access in the Information Society

Abstract

This work focuses on camera-based systems designed to replace the mouse. These interfaces typically rely on computer vision techniques that capture the user’s face or head movements, and they are specifically designed for users with disabilities. Drawing on the lessons learnt from the authors’ experience and on a comprehensive analysis of the literature, the work identifies and reviews the key factors to consider in the design of such interfaces: the user features to track, initial user detection (calibration), position mapping, feedback, error recovery, event execution, profiles and the ergonomics of the system. It compiles the solutions offered by different systems to help new designers avoid problems already discussed by others.
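Two of the factors listed above, position mapping and event execution, can be illustrated with a minimal sketch. The class below shows one common combination surveyed in this line of work: relative position mapping (tracked-feature deltas scaled by a gain and smoothed to damp tracker jitter) together with dwell-based clicking (the cursor resting inside a small radius for a set time fires a click). The class name, parameter names and default values are illustrative assumptions, not the implementation of any particular system reviewed here.

```python
import math


class HeadPointer:
    """Illustrative sketch: relative head-to-cursor mapping with dwell clicks.

    All names and defaults are hypothetical; real systems tune gain,
    smoothing and dwell parameters per user (the 'profiles' factor).
    """

    def __init__(self, gain=8.0, smoothing=0.5, dwell_time=1.0, dwell_radius=10.0):
        self.gain = gain                # pixels of cursor motion per unit of head motion
        self.smoothing = smoothing      # 0 = raw deltas, values near 1 heavily damped
        self.dwell_time = dwell_time    # seconds the cursor must rest to fire a click
        self.dwell_radius = dwell_radius  # pixels of tolerated drift while dwelling
        self.cursor = (0.0, 0.0)
        self.prev_feature = None
        self.dwell_anchor = None
        self.dwell_elapsed = 0.0

    def update(self, feature_xy, dt):
        """Feed one tracked-feature position (e.g. nose tip) per frame.

        Returns (cursor_xy, clicked).
        """
        if self.prev_feature is None:
            self.prev_feature = feature_xy
        # Relative mapping: only the frame-to-frame delta moves the cursor.
        dx = (feature_xy[0] - self.prev_feature[0]) * self.gain
        dy = (feature_xy[1] - self.prev_feature[1]) * self.gain
        self.prev_feature = feature_xy
        # Attenuate the delta to damp tracker jitter (simple smoothing).
        sx = dx * (1.0 - self.smoothing)
        sy = dy * (1.0 - self.smoothing)
        self.cursor = (self.cursor[0] + sx, self.cursor[1] + sy)
        return self.cursor, self._dwell(dt)

    def _dwell(self, dt):
        """Dwell-based event execution: rest inside the radius to click."""
        if self.dwell_anchor is None:
            self.dwell_anchor = self.cursor
            self.dwell_elapsed = 0.0
            return False
        drift = math.hypot(self.cursor[0] - self.dwell_anchor[0],
                           self.cursor[1] - self.dwell_anchor[1])
        if drift > self.dwell_radius:
            # Cursor moved away: restart the dwell timer at the new position.
            self.dwell_anchor = self.cursor
            self.dwell_elapsed = 0.0
            return False
        self.dwell_elapsed += dt
        if self.dwell_elapsed >= self.dwell_time:
            self.dwell_anchor = None  # require a fresh dwell before the next click
            return True
        return False
```

Absolute mapping (cursor position proportional to head position) is the main alternative; relative mapping trades precision for a smaller required range of head motion, which matters for users with restricted movement.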



Acknowledgments

The authors would like to thank all the anonymous reviewers for their constructive comments. This work was supported in part by FRIVIG, A1/037910/11 Formación de Recursos Humanos e Investigación en el Área de Visión por Computador e Informática Gráfica, granted by MAEC-AECID (Programa de Cooperación Interuniversitaria e Investigación Científica de España e Iberoamérica), 28/2011 (Ajudes grup competitiu UGIVIA) granted by the Govern de les Illes Balears and TIN12-35427 granted by the Gobierno de España.

Author information

Corresponding author

Correspondence to Cristina Manresa-Yee.


About this article

Cite this article

Manresa-Yee, C., Varona, J., Perales, F.J. et al. Design recommendations for camera-based head-controlled interfaces that replace the mouse for motion-impaired users. Univ Access Inf Soc 13, 471–482 (2014). https://doi.org/10.1007/s10209-013-0326-z

