ABSTRACT
Tactile maps support blind and visually impaired people in orienting themselves and in familiarizing themselves with unfamiliar environments. Interactive approaches complement these maps with auditory feedback, but they commonly focus on blind people alone. We present an approach that also addresses visually impaired people by visually augmenting relevant parts of tactile maps. These audiovisual tactile maps can be used in conjunction with common tablet computers and smartphones. By integrating conductive elements into 3D printed tactile maps, map elements can be recognized by a single touch on the mobile device's display, which eases handling for blind and visually impaired people. To allow multiple elevation levels in our transparent tactile maps, we conducted a study to reconcile the technical and physiological requirements of off-the-shelf 3D printers, capacitive touch input, and the human tactile sense. We propose an interaction concept for 3D printed audiovisual tactile maps, verify its feasibility, and evaluate it in a user study. Our discussion includes economic considerations crucial for a broad dissemination of tactile maps among both blind and visually impaired people.
Index Terms
- LucentMaps: 3D Printed Audiovisual Tactile Maps for Blind and Visually Impaired People