Non-visual interaction with graphs assisted with directional-predictive sounds and vibrations: a comparative study

  • Long Paper
  • Published in Universal Access in the Information Society

Abstract

Blind and visually impaired students need special educational and developmental tools that allow them to interact with graphic entities on PDA and desktop platforms. In previous research, stylus movements relative to a hidden graph were sonified with three directional-predictive sound (DPS) signals, taking into account the user's exploration behavior and the concept of the capture radius. The results indicated that scanpaths were 24–40% shorter in length and that task completion times decreased by 20–25%. The goal of the study presented in this paper was to measure the subjective performance recorded with directional-predictive vibrations (DPV) and compare it with the subjective performance achieved when the hidden graphic images were explored with DPS. The study also aimed to find out which kind of feedback cue requires less cognitive effort to interpret. A prototype vibro-tactile pen with an embedded vibration motor was used to produce DPV instead of sounds. The performance of eight blindfolded subjects was investigated in terms of the number of feedback cues used and the time spent to complete non-visual inspection of the hidden graphs. There was a statistically significant difference in both the average number of DPS and vibration cues used and the task completion time taken by the participants to discover the features of hidden graphs explored with different capture radii. The experimental findings confirmed that DPS signals, compared with DPV patterns, are beneficial in tasks where cross-modal coordination must assist the user in the absence of visual information.
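The capture-radius idea described above can be illustrated with a minimal sketch. This is hypothetical code, not the authors' implementation: it assumes the hidden graph is sampled as (x, y) points and that a feedback cue is selected from the stylus's distance to the nearest sample and whether the stylus is moving closer or farther. The function and cue names (`select_cue`, "capture", "approach", "retreat") are illustrative assumptions.

```python
import math

def nearest_distance(stylus, graph_points):
    """Euclidean distance from the stylus tip to the closest graph sample."""
    return min(math.hypot(stylus[0] - x, stylus[1] - y) for x, y in graph_points)

def select_cue(stylus, prev_stylus, graph_points, capture_radius):
    """Map stylus movement onto one of three illustrative feedback cues:
    'capture' when the stylus is inside the capture radius of the graph,
    'approach' when it has moved closer since the last sample,
    'retreat' when it has moved away."""
    d_now = nearest_distance(stylus, graph_points)
    if d_now <= capture_radius:
        return "capture"
    d_prev = nearest_distance(prev_stylus, graph_points)
    return "approach" if d_now < d_prev else "retreat"

# Example: the hidden graph is the parabola y = x^2, sampled over [-10, 10].
graph = [(x, x * x) for x in range(-10, 11)]
print(select_cue((3.0, 9.5), (3.0, 12.0), graph, capture_radius=1.0))  # "capture"
```

In the studies compared here, each cue would be rendered either as a DPS signal or as a DPV pattern from the pen's vibration motor; the selection logic itself is modality-independent, which is what makes a direct sound-versus-vibration comparison meaningful.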



Acknowledgment

This work was financially supported by the Academy of Finland (grant 107278).

Corresponding author

Correspondence to Tatiana G. Evreinova.


Cite this article

Evreinova, T.G., Evreinov, G., Raisamo, R. et al. Non-visual interaction with graphs assisted with directional-predictive sounds and vibrations: a comparative study. Univ Access Inf Soc 7, 93–102 (2008). https://doi.org/10.1007/s10209-007-0105-9
