ABSTRACT
iSonic is an interactive sonification tool that enables vision-impaired users to explore geo-referenced statistical data, such as population or crime rates by geographical region. Users interact with coordinated map and table views of the data through a keyboard or a smooth-surface touchpad. The integrated use of musical sounds and speech lets users grasp overall data trends and drill down for details. Scenarios of use are described.