
Accessibility of Brainstorming Sessions for Blind People

  • Conference paper
Computers Helping People with Special Needs (ICCHP 2014)

Abstract

Today, research focuses on the accessibility of explicit information for blind users. This provides only partial access to the information flow in brainstorming sessions, since non-verbal communication is not supported. Advances in ICT, however, allow capturing implicit information such as hand gestures, an important part of non-verbal communication. Thus, we describe a system that allows integrating blind people into a brainstorming session using a mind map.
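To illustrate the idea described in the abstract, the sketch below shows one hypothetical way non-verbal pointing gestures could be made explicit for a blind participant: a detected gesture aimed at a mind-map node is converted into a textual announcement that a screen reader could speak. The class and function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: translating a detected pointing gesture on a
# shared mind map into a textual announcement for a screen reader.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MindMapNode:
    """A node of the shared mind map used in the brainstorming session."""
    node_id: str
    label: str
    parent_label: Optional[str] = None


def announce_pointing(speaker: str, node: MindMapNode) -> str:
    """Turn implicit non-verbal information (a pointing gesture)
    into explicit verbal information for a blind participant."""
    location = (f"under '{node.parent_label}'"
                if node.parent_label else "at the root")
    return f"{speaker} is pointing at '{node.label}' {location}."


# Example: a sighted participant points at a child node of the mind map.
node = MindMapNode("n7", "touch feedback", parent_label="output devices")
print(announce_pointing("Alice", node))
# → Alice is pointing at 'touch feedback' under 'output devices'.
```

The key design point is that gesture capture (e.g. via a depth camera, as in the related work on Kinect-based pointing detection) only becomes accessible once the spatial target is resolved to a named mind-map element and verbalized.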




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Kunz, A. et al. (2014). Accessibility of Brainstorming Sessions for Blind People. In: Miesenberger, K., Fels, D., Archambault, D., Peňáz, P., Zagler, W. (eds) Computers Helping People with Special Needs. ICCHP 2014. Lecture Notes in Computer Science, vol 8547. Springer, Cham. https://doi.org/10.1007/978-3-319-08596-8_38


  • DOI: https://doi.org/10.1007/978-3-319-08596-8_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-08595-1

  • Online ISBN: 978-3-319-08596-8

  • eBook Packages: Computer Science (R0)
