
A camera-joystick for sound-augmented non-visual navigation and target acquisition: a case study

  • Long Paper
  • Published in: Universal Access in the Information Society

Abstract

This paper presents the results of a comparative study of user input with a camera-joystick and a manual joystick in a target acquisition task in which neither the targets nor the pointer could be perceived visually. The camera-joystick is an input technique in which each on-screen item is accessible from the center via a predefined vector of head motion. Absolute pointing was implemented with an acceleration factor of 1.7 and a moving average over the 5 most recently detected head positions. The underlying assumption was that, in order to provide robust input for blind users, the interaction technique has to be based on perceptually well-discriminated human movements, which compose a basic framework of an accessible virtual workspace demanding a minimum of external auxiliary cues. The target spots, 35 mm in diameter and with 60 mm between the centers of adjacent spots, were arranged in a rectangular grid of 5 rows by 5 columns. The targets were captured from a distance of 600 mm. The results show that camera input is a promising technique for non-visual human–computer interaction. The subjects demonstrated performance in the target acquisition task that was more than twice as good with the camera-joystick as with the manual joystick. All the participants reported that the camera-joystick was a robust and preferable input technique when visual information was not available. Blind interaction techniques could be further improved significantly by allowing user-dependent activation of the navigational cues to better coordinate feedback with exploratory behavior.
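The abstract describes the pointer mapping concretely enough to illustrate it in code. The following is a minimal Python sketch, not the authors' implementation: the class name CameraJoystickPointer, the update and hit_target functions, the millimetre workspace units, and the centred grid origin are assumptions introduced for illustration; only the 5-sample moving average, the 1.7 acceleration factor, and the 5 x 5 grid of 35 mm targets spaced 60 mm apart are taken from the abstract.

    from collections import deque
    import math

    # Parameters quoted in the abstract.
    SMOOTHING_WINDOW = 5        # moving average over the 5 most recent head positions
    ACCELERATION = 1.7          # acceleration factor applied to head displacement
    TARGET_DIAMETER_MM = 35.0   # diameter of each target spot
    TARGET_SPACING_MM = 60.0    # distance between centers of adjacent spots
    GRID_ROWS = GRID_COLS = 5   # 5 x 5 rectangular grid

    class CameraJoystickPointer:
        """Hypothetical sketch of absolute head pointing: the head displacement
        from a calibrated rest pose is smoothed and scaled onto the workspace."""

        def __init__(self):
            self.history = deque(maxlen=SMOOTHING_WINDOW)

        def update(self, dx_mm, dy_mm):
            """Map a detected head displacement (relative to the rest pose, in
            workspace millimetres; the units are an assumption) to a pointer
            position in the same frame, with the origin at the grid center."""
            self.history.append((dx_mm, dy_mm))
            avg_x = sum(p[0] for p in self.history) / len(self.history)
            avg_y = sum(p[1] for p in self.history) / len(self.history)
            return ACCELERATION * avg_x, ACCELERATION * avg_y

    def hit_target(x_mm, y_mm):
        """Return (row, col) of the target spot containing the point, or None.
        The grid is assumed to be centered on the workspace origin."""
        for row in range(GRID_ROWS):
            for col in range(GRID_COLS):
                cx = (col - GRID_COLS // 2) * TARGET_SPACING_MM
                cy = (row - GRID_ROWS // 2) * TARGET_SPACING_MM
                if math.hypot(x_mm - cx, y_mm - cy) <= TARGET_DIAMETER_MM / 2:
                    return row, col
        return None

    # Example: feed successive head displacements and test for target capture.
    pointer = CameraJoystickPointer()
    for sample in [(10.0, 0.0), (20.0, 2.0), (33.0, 1.0), (35.0, 0.0), (36.0, -1.0)]:
        x, y = pointer.update(*sample)
    print(hit_target(x, y))   # -> (2, 3): the spot one column right of the grid center

Measures such as acquisition time or error rate would be computed on top of this mapping; they are not modelled in the sketch.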




Acknowledgments

This work was supported by the Academy of Finland (grant 107278) and by the MICOLE project (IST-2003-511592 STP), funded by the European Commission.

Author information

Corresponding author

Correspondence to Grigori Evreinov.

Cite this article

Evreinova, T.V., Evreinov, G. & Raisamo, R. A camera-joystick for sound-augmented non-visual navigation and target acquisition: a case study. Univ Access Inf Soc 7, 129–144 (2008). https://doi.org/10.1007/s10209-007-0109-5


Keywords

Navigation