Exploration of directional-predictive sounds for nonvisual interaction with graphs

  • Regular Paper
  • Published in Knowledge and Information Systems

Abstract

Sonification of stylus movements accompanied by kinesthetic feedback is one possible technique for developing cross-modal coordination in the absence of visual information. We investigate two problems: how to minimize the number of sounds while increasing the information they convey, and how to choose a natural sonification grammar that does not demand extra cognitive effort. We present two case studies employing directional-predictive sounds (DPS). Stylus movements were sonified through three sound signals, taking into account the exploration behavior and the concept of the capture radius. The performance of eight subjects was evaluated in terms of the stylus deviation relative to the points of the virtual graph, the length of the scanpaths, and the task completion time. When stylus movements were accompanied by DPS signals within four capture radii, the deviation of the stylus from the inspected graph was always less than one capture radius. The scanpaths were 24–40% shorter, and task completion times decreased by 20–25%. We also present a game application designed to optimize exploration behavior enhanced by the DPS. The results of the proposed sonification technique, based on the model of exploration behavior, are discussed.
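The mapping the abstract describes can be sketched in code. This is a minimal illustration, not the authors' implementation: the function names, the pixel units, and the exact one-radius/four-radii thresholds for the three sound signals are assumptions inferred from the abstract's description of the capture radius.

```python
import math

# Assumed capture radius in pixels (hypothetical value for illustration).
CAPTURE_RADIUS = 10.0

def nearest_graph_point(stylus, graph_points):
    """Return the point of the virtual graph closest to the stylus."""
    return min(graph_points, key=lambda p: math.dist(stylus, p))

def select_dps_signal(stylus, graph_points, r=CAPTURE_RADIUS):
    """Map stylus deviation from the graph to one of three sound signals.

    Within one capture radius the stylus is treated as on the graph;
    up to four radii (the sonified zone mentioned in the abstract) a
    directional-predictive cue plays; beyond that, an out-of-range cue.
    """
    target = nearest_graph_point(stylus, graph_points)
    deviation = math.dist(stylus, target)
    if deviation <= r:
        return "on_track"       # confirmation sound
    elif deviation <= 4 * r:
        return "approaching"    # directional-predictive cue
    else:
        return "off_track"      # out-of-range cue
```

For example, with a graph point at the origin and a 10-pixel radius, a stylus at distance 5 would trigger the confirmation sound, at distance 30 the predictive cue, and at distance 50 the out-of-range cue.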



Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Tatiana G. Evreinova.

Additional information

Categories and subject descriptors: H5.2. User interfaces. Input devices and strategies · I.3.6 Methodology and techniques. Interaction techniques

General terms: Performance · Design · Experimentation

Tatiana G. Evreinova received the M.Sc. degree in radio physics and electronics from the Department of Radio Physics of Rostov State University, Russia, in 2000. She received the Ph.D. degree in computer science from the University of Tampere, Finland, in 2005. She has conducted research on human–computer interaction and multimodal interaction since 1998. In 2000–2001, she was a Research Assistant in the Laboratory for Designing of Information Imaging Systems at the Specvuzautomatika Design Bureau, Rostov-on-Don, Russia. Her current research interests include the design of advanced assistive input–output interaction techniques and software for people with sensory impairments.

Leena K. Vesterinen received the B.Sc. degree in computer science from the University of Reading, UK, in 2003. She furthered her education at the University of Tampere and received the M.Sc. degree in interactive technology from the Department of Computer Sciences in 2006. Since 2005, she has worked as a Consulting Usability Professional in UK-based companies. Her current research interests include mobile terminal applications, Web application usability, and information visualization.

Grigori Evreinov received the Ph.D. degree in technical sciences (computer science) from Rostov State University (RSU), Russia, in 1998. He has been conducting research on human–computer interaction since 1982. He specializes in sensor technology and transducer construction for alternative input–output techniques. Until 2001, he worked as a Senior Research Scientist at the Computer Centre in RSU and as the Head of the Laboratory for Designing of Information Imaging Systems in Specvuzautomatika Design Bureau. Since 2001, he has worked as a Researcher and Assistant Professor at the Department of Computer Sciences in the University of Tampere. His current research interests include advanced assistive user interfaces based on nonspeech audio, tactile signals, and adaptive algorithms for cross-modal transformation of textual and graphical information.

Roope Raisamo received his Ph.D. degree in computer science from the University of Tampere, Finland, in 1999. He has been conducting research on human–computer interaction since 1995, specializing in multimodal interaction and constructive user interface research. The current research projects in his group concern haptic interaction, multimodal information presentation, proactive and intelligent environments, and multimodal interfaces for visually impaired children. The main sources of research funding have been the Academy of Finland, the National Technology Agency of Finland, the Nordic Development Centre for Rehabilitation Technology, and the European Commission.

About this article

Cite this article

Evreinova, T.G., Vesterinen, L.K., Evreinov, G. et al. Exploration of directional-predictive sounds for nonvisual interaction with graphs. Knowl Inf Syst 13, 221–241 (2007). https://doi.org/10.1007/s10115-006-0059-x


Keywords

Navigation