
LucentMaps: 3D Printed Audiovisual Tactile Maps for Blind and Visually Impaired People

Published: 23 October 2016
DOI: 10.1145/2982142.2982163

ABSTRACT

Tactile maps support blind and visually impaired people in orienting themselves and in familiarizing themselves with unknown environments. Interactive approaches complement these maps with auditory feedback; however, they commonly focus on blind people. We present an approach that also addresses visually impaired people by visually augmenting relevant parts of tactile maps. These audiovisual tactile maps can be used in conjunction with common tablet computers and smartphones. Conductive elements integrated into the 3D printed tactile maps allow them to be recognized through a single touch on the mobile device's display, which eases handling for blind and visually impaired people. To allow multiple elevation levels in our transparent tactile maps, we conducted a study to reconcile the technical and physiological requirements of off-the-shelf 3D printers, capacitive touch input and the human tactile sense. We propose an interaction concept for 3D printed audiovisual tactile maps, verify its feasibility and evaluate it in a user study. Our discussion includes economic considerations crucial for a broad dissemination of tactile maps for both blind and visually impaired people.
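The interaction described in the abstract can be summarized as a lookup: a single touch, registered through a conductive element of the transparent 3D print, is resolved to the map feature beneath it and answered with auditory (and, for visually impaired users, visual) feedback. The following is a minimal illustrative sketch of that lookup step in Python; the MapFeature structure, coordinates and labels are hypothetical placeholders and do not reflect the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): a touch registered through a
# conductive element on the transparent tactile overlay is resolved to the map
# feature underneath and answered with an audio label. All names, coordinates and
# the MapFeature structure are hypothetical.

from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class MapFeature:
    """A hypothetical map element with its footprint on the touchscreen (pixels)."""
    label: str                          # text handed to a TTS engine / screen reader
    bounds: Tuple[int, int, int, int]   # (x_min, y_min, x_max, y_max)

    def contains(self, x: int, y: int) -> bool:
        x_min, y_min, x_max, y_max = self.bounds
        return x_min <= x <= x_max and y_min <= y <= y_max


def resolve_touch(features: List[MapFeature], x: int, y: int) -> Optional[MapFeature]:
    """Return the feature under a single touch point, or None if the touch
    landed on an area without a conductive element."""
    for feature in features:
        if feature.contains(x, y):
            return feature
    return None


if __name__ == "__main__":
    # Hypothetical example map with two features.
    features = [
        MapFeature("Main entrance", (100, 200, 180, 260)),
        MapFeature("Bus stop, line 4", (400, 520, 460, 600)),
    ]

    hit = resolve_touch(features, 120, 230)
    if hit is not None:
        # In a real application this string would be spoken by a TTS engine and the
        # feature could additionally be highlighted visually for visually impaired users.
        print(f"Announce: {hit.label}")
    else:
        print("No map element at this position.")
```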


Published in

            ASSETS '16: Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility
            October 2016
            362 pages
ISBN: 9781450341240
DOI: 10.1145/2982142

            Copyright © 2016 ACM

            Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

            Publisher

            Association for Computing Machinery

            New York, NY, United States

            Publication History

            • Published: 23 October 2016

            Qualifiers

            • research-article

            Acceptance Rates

ASSETS '16 Paper Acceptance Rate: 24 of 95 submissions, 25%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%
