
Non-visual navigation of spreadsheets

Enhancing accessibility of Microsoft Excel™


Abstract

With current assistive technologies, a user navigating an Excel™ spreadsheet non-visually gets no overview of the different components the spreadsheet contains. Several attributes of spreadsheets make them difficult for individuals who are blind to navigate: the large amount of information they store, the multi-dimensional nature of their contents, and features such as charts and tables cannot be readily linearized by a screen reader or Braille display. A user-centered design paradigm was followed to build an accessible system for non-visual navigation of Microsoft Excel™ spreadsheets. The proposed system gives the user a hierarchical overview of the components in a spreadsheet. The system is multi-modal and offers different navigation and reading modes, helping users adapt their non-visual navigation to the task they need to accomplish.
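To make the idea of a hierarchical overview concrete, the sketch below is a minimal, hypothetical model rather than the system described in the paper: a workbook is represented as a tree of sheets and components such as tables and charts, and a depth-first walk produces the short summaries a speech-based navigator might read at each level. All class and function names (Workbook, Sheet, Component, announce) are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical hierarchy for a non-visual spreadsheet overview:
# workbook -> sheets -> components (tables, charts, data regions).
@dataclass
class Component:
    kind: str      # e.g. "table", "chart", "data region"
    name: str
    summary: str   # short description spoken when the node is visited

@dataclass
class Sheet:
    name: str
    components: List[Component] = field(default_factory=list)

@dataclass
class Workbook:
    name: str
    sheets: List[Sheet] = field(default_factory=list)

def announce(book: Workbook) -> List[str]:
    """Return the utterances a speech-based navigator might queue,
    one per node, ordered by a depth-first walk of the hierarchy."""
    lines = [f"Workbook {book.name}: {len(book.sheets)} sheet(s)"]
    for sheet in book.sheets:
        lines.append(f"  Sheet {sheet.name}: {len(sheet.components)} component(s)")
        for comp in sheet.components:
            lines.append(f"    {comp.kind} {comp.name}: {comp.summary}")
    return lines

if __name__ == "__main__":
    book = Workbook("budget.xlsx", [
        Sheet("2012", [
            Component("table", "Expenses", "12 rows by 4 columns"),
            Component("chart", "Spending by month", "bar chart, 12 categories"),
        ]),
    ])
    for line in announce(book):
        print(line)
```

Running the example prints one line per node, from the workbook down to individual components, mirroring the top-down, drill-down style of navigation the abstract describes.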





Author information

Corresponding author

Correspondence to Iyad Abu Doush.


About this article

Cite this article

Doush, I.A., Pontelli, E.: Non-visual navigation of spreadsheets. Univ Access Inf Soc 12, 143–159 (2013). https://doi.org/10.1007/s10209-012-0272-1


Keywords

Navigation