Abstract
In this paper, a 3D map solution combined with a mobile phone application is presented. This solution enables blind users to perceive their spatial location through tactile stimulation, complemented by contextual information that a mobile application delivers via audio. In the proposed model, 3D map sections embedding NFC technology support the application scenario described in this work.
1 Introduction
Since 1990, the Americans with Disabilities Act (ADA) has required that environments be accessible to people with disabilities (Bentzen 2007; ADA 1990). In 1993, the United Nations (UN) published The Standard Rules on the Equalization of Opportunities for Persons with Disabilities, focusing on accessibility and valuing it as a key area for equal participation (UN 1993).
However, blind people are among the most vulnerable groups with regard to physical accessibility. Navigation systems for them are considerably difficult to define and specify, not only because of the risk of collision due to insufficient information, but also because of the complexity and time spent planning alternative routes around obstacles (Loomis et al. 2007).
Visual disability has a major impact on individuals’ quality of life, not only on their ability to work and develop personal relationships but, most importantly, on their mobility. There are 285 million people with visual impairments worldwide (WHO 2014): 39 million are blind and 246 million have low vision. According to the World Health Organization (WHO), in its International Classification of Diseases, version 10, there are four levels of visual function: normal vision, moderate visual impairment, severe visual impairment and blindness (WHO 2010).
Due to the specific characteristics of their disability, blind people cannot interpret visual stimuli, which are very important for perceiving one’s spatial location. At the same time, navigation systems are generally designed for people without sensory impairments. The blind are not the only group with difficulties in navigation and wayfinding: so are elderly people (Hess 2005; Kirasic 2002), the visually impaired (Helal et al. 2001; Golledge et al. 1996), and people with dementia or Alzheimer’s disease (Rosenbaum et al. 2005; Pai 2006). Therefore, it is commonly accepted that there is a need to study solutions that can overcome these spatial barriers by creating forms of tactile learning with the same goal. Blind people can very easily recognize three-dimensional shapes through tactile sensations (Teshima 2010).
In this context, we present a navigation solution for spatial location recognition based on 3D maps and NFC technology, under development in project CE4BLIND (UTAP-EXPL/EEI-SII/0043/2014). This solution aims to allow blind users to perceive their spatial location through tactile stimulation, complemented by an application that provides spatial information via mobile phone. The paper is structured as follows: a background section where the main concepts about navigation for the blind are presented; the proposed model and its description (3D map features and NFC technology); and finally some conclusions and future work.
2 Navigation for the Blind
Broadly, a navigation system consists of an artefact that enables a user to follow a predetermined path between an explicit origin and destination. Thus, the system needs to know the person’s position and orientation continuously, from the starting environment through to the final destination (Aslan and Krüger 2004; Rieser 2007). Furthermore, effective navigation aims to ensure the best path according to a specific criterion, such as shortest distance, time, minimum cost or type of road (Teshima 2010).
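The idea of selecting the best path according to a chosen criterion can be made concrete with a small sketch (not part of the cited works; the graph representation, node names and weights are illustrative). Once each edge weight encodes the chosen variable (distance, time, cost, etc.), Dijkstra’s algorithm returns the optimal route:

```python
import heapq

def best_path(graph, start, goal):
    """Dijkstra's shortest-path search over a weighted graph.

    graph: dict mapping node -> list of (neighbour, weight) pairs, where
    the weight encodes the chosen criterion (distance, time, cost, ...).
    Returns (total_weight, path) or (inf, []) if goal is unreachable."""
    queue = [(0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []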
In the literature we can find several research contributions using Geographic Information Systems (GIS) and Global Positioning System (GPS)-based navigation systems for the visually impaired (Golledge et al. 2004; Helal et al. 2001; Ponchillia et al. 2007; Blake 2011), as analyzed in the work of Teshima (2010). Some present non-visual spatial displays, for example auditory (Kim et al. 2000; Marston et al. 2007), haptic (Loomis et al. 2007; Marston et al. 2007) and/or virtual acoustic displays (Kim and Song 2007).
Another technology used is Radio Frequency Identification (RFID) tagging, a solution presented in different studies (Willis and Helal 2005). However, some authors, such as Liao (2012), highlight the disadvantages of this technology: an RFID information grid requires short-range communication (7 to 15 cm) and a high density of tags (30 cm, or 12 in, apart) (Liao 2012).
Other studies proposed several assistive technologies to provide navigation assistance to the blind. For example, Wilson et al. (2007) presented a wearable audio navigation system that uses GPS technology, a digital compass, cameras and a light sensor to transmit 3D audio cues that could help not only the blind but also the visually impaired (Wilson et al. 2007). Another example is a wearable tactile belt equipped with GPS, a compass, an inertial sensor, a battery and small motors (Zelek and Holbein 2008); this belt provided effective navigational help for people with or without disabilities. Kim et al. (2010) presented an electronic cane with an integrated camera, a ZigBee wireless radio and an RFID tag reader (Kim et al. 2010).
In their work, Voženílek et al. (2009) analysed characteristics of the interpretation and perception of geospace using tactile maps based on 3D printing. In this context, they present three types of tactile maps (all maps were coloured):

- “Map of type A is a tactile map printed by 3D printing technology (Contex 3D printers) as traditional relief tactile map with 5 mm thick background using both positive and negative relief with labelling by Braille letters.
- Map of type B is an inverse form of tactile map printed by 3D printing technology which will be used for casting type A tactile maps.
- Map of type C is a sound tactile map derived from map of type A posed onto box with digital voice records of geoinformation (attributes, navigations etc.) activated by touch on maps surface” (Voženílek et al. 2009).
Smartphone applications to assist blind people are not a new technology. Liao (2012) presented a work that uses such equipment to help the blind at signalized intersections (Liao 2012).
Tactile Graphics, Touch Graphics and the University of Buffalo presented a project that uses a technology similar to the one proposed here: a 3D mapping system that works with the tactile sense. In this project, 3D building models were developed on a horizontal map with sensing wires connected to a computer, helping users to localize places. Users place their fingers on the buildings and the system reads the pressure, sending signals to a computer that responds with an auditory stimulus announcing the building’s name and its particular paths (Fig. 1) (Tactile Graphics et al., ONLINE).
3 The Proposed Model
In the model proposed in this work, two technologies are combined: 3D mapping, which provides tactile stimulus, and Near Field Communication (NFC), which stimulates the auditory sense. The objective is to enhance the spatial perception of the blind. A more detailed description is presented in the following sections.
The overall setup assumes that the location of the user is estimated from a combination of several inputs, namely Global Navigation Satellite System (GNSS) and Radio Frequency Identification (RFID). The user carries an electronic white cane, such as the one developed in the Blavigator prototype (Faria et al. 2010; Fernandes et al. 2013a, b), which senses tags in a specific area of interest (such as touristic locations). The tags are placed on a topology consisting of connected lines and clusters, which globally compose a network of safe paths and points of interest. Locally, a QR code physically placed at each point of interest also provides information regarding that specific spot.
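The tag topology described above can be sketched as a simple data structure (a hypothetical illustration, not the project’s actual implementation; the class and field names are assumptions): tags are nodes of either the “path” or “poi” kind, and bidirectional links between them form the network of safe paths.

```python
from dataclasses import dataclass, field

@dataclass
class Tag:
    tag_id: str
    kind: str              # "path" segment marker or "poi" (point of interest)
    description: str = ""  # audio description, used for POIs

@dataclass
class TagNetwork:
    tags: dict = field(default_factory=dict)   # tag_id -> Tag
    links: dict = field(default_factory=dict)  # tag_id -> set of connected tag_ids

    def add_tag(self, tag):
        self.tags[tag.tag_id] = tag
        self.links.setdefault(tag.tag_id, set())

    def connect(self, a, b):
        # Safe-path segments can be walked in both directions.
        self.links[a].add(b)
        self.links[b].add(a)

    def neighbours(self, tag_id):
        return sorted(self.links.get(tag_id, set()))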
However, the current model focuses on the contribution that 3D mapping can make to creating a mental map of the environment before the navigation itself occurs.
Concerning the 3D map, it can be created from existing blueprints or floor plans and modelled with a 3D software tool. Figure 2 shows the first version of the proposed model, with several elements modelled.
The second version was created with the intention of simplifying the model (Fig. 3).
The different heights of the various points of interest enable easy and fast tactile interpretation by a blind user. Combining different textures for the various elements improves memorization and the usability of the attached labels, which identify the areas of special interest (Fig. 4).
The implementation of this map (or sculpture) also creates visual value for sighted users, as it provides an excellent catalogue, grouped by different colours, of the available points of interest in the infrastructure that can be visited. This accessible 3D map was designed with a focus on universal design.
Specifically, the main objective was to create a 3D model to help blind people in their spatial perception. In this context, a floor plan in Computer Aided Design (CAD) format was used. However, this blueprint needed to be adapted to our modelling needs.
First, a graphic tool was used to erase unnecessary elements (such as adjacent buildings and urban furniture). This step was very important to highlight important areas and hide less important ones. Defining the level of detail of the model was also important, because too much detail could lead to an enormous number of elements to perceive, confusing the blind user. With this software, other elements were added to delimit the model (background, map limits and sections), and other complex elements had to be redesigned following the original layout. Areas with similar elements were joined into groups to facilitate their future recognition. All this redesign was made to simplify the model and make it ‘lighter’, providing a model that is as “low poly” as possible (Fig. 5).
After this process, the file was exported to 3D modelling software, where the model was resized (width: 84 cm, length: 118 cm, height: 4 cm). After the solid faces were created, they were extruded with a different value for each previously defined group, so that the different types of elements can be properly recognized. The maximum height was set to 4 cm.
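The per-group extrusion step can be sketched as follows (an illustrative calculation only; the paper does not specify the actual heights, so the group names, 0.5 cm floor and even spacing are assumptions, with only the 4 cm ceiling taken from the text):

```python
def extrusion_heights(groups, max_height_cm=4.0, min_height_cm=0.5):
    """Assign each element group a distinct extrusion height, evenly
    spaced up to the model's 4 cm ceiling, so that groups remain
    tactilely distinguishable from one another."""
    n = len(groups)
    if n == 1:
        return {groups[0]: max_height_cm}
    step = (max_height_cm - min_height_cm) / (n - 1)
    return {g: round(min_height_cm + i * step, 2) for i, g in enumerate(groups)}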
As the model follows universal design principles and is intended as a guide for all tourists (not only the blind), different colours and textures were added to the different elements.
In terms of audio information, the model proposes the use of NFC technology embedded in the physical map to provide audio feedback. Near Field Communication (NFC) is a set of communication protocols that enable devices to establish communication by bringing them close to each other (about 4 cm). Pervasive computing research has explored the potential benefits of creating a connection between information stored in the virtual world and elements present in the physical world (Want 2011). Typical applications include two devices communicating for file sharing, or a mobile device reading an NFC-enabled credit card or tag. The latter is the feature used in the proposed model.
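Textual information on an NFC tag is commonly stored as an NDEF Text record (NFC Forum Text RTD); the paper does not state which record format the project uses, so the following decoder is a sketch under that assumption. The payload starts with a status byte (bit 7 selects UTF-16 versus UTF-8; the low 6 bits give the length of the IANA language code), followed by the language code and the text:

```python
def decode_ndef_text(payload: bytes):
    """Decode the payload of an NDEF Text record (NFC Forum Text RTD).

    payload[0] is a status byte: bit 7 set means UTF-16 (clear means
    UTF-8), and the low 6 bits give the language-code length. The
    language code follows, then the text itself."""
    status = payload[0]
    encoding = "utf-16" if status & 0x80 else "utf-8"
    lang_len = status & 0x3F
    language = payload[1:1 + lang_len].decode("ascii")
    text = payload[1 + lang_len:].decode(encoding)
    return language, text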
Using an NFC reader embedded in the electronic white cane, a blind user can obtain contextual information about each point of interest marked on the map. The audio information is delivered by a mobile application, using information stored on the NFC tags, which are physically placed on the 3D map at strategic places corresponding to special points of interest, as Fig. 6 suggests. This helps the user create a mental map of the features to expect in the environment in which he or she intends to navigate.
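The read-tag-then-speak flow can be sketched as a small dispatch routine (purely illustrative: the tag UIDs, descriptions and `speak` callback are hypothetical, and the real system would hand the text to the phone’s text-to-speech engine):

```python
# Hypothetical lookup table: tag UID -> point-of-interest description.
POI_DESCRIPTIONS = {
    "04:a2:5f:1e": "Main entrance, ground floor.",
    "04:b7:90:3c": "Information desk, ten metres ahead.",
}

def on_tag_read(tag_uid, speak=print):
    """Called when the cane's NFC reader detects a tag on the 3D map.
    `speak` stands in for the mobile phone's text-to-speech engine."""
    description = POI_DESCRIPTIONS.get(tag_uid)
    if description is None:
        speak("Unknown point of interest.")
    else:
        speak(description)
    return description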
4 Conclusions and Future Work
This paper presented a model to enhance the perception of spatial location for the blind, combining NFC technology and 3D mapping. After analysing related work and the technologies presented there, an innovative solution is proposed that connects new and established technologies, which we believe form an interesting and usable solution for blind people. As future work, the design of the textures and usability testing with blind users will be carried out.
References
ADA - Americans with Disabilities Act. To establish a clear and comprehensive prohibition of discrimination on the basis of disability (1990). Retrieved from https://www.gpo.gov/fdsys/pkg/STATUTE-104/pdf/STATUTE-104-Pg327.pdf. Accessed 10 Feb 2016
Aslan, I., Krüger, A.: The Bum Bag Navigator (BBN): an advanced pedestrian navigation system. In: Proceedings of AIMS 2004, Nottingham, U.K. (2004). Retrieved from: http://w5.cs.uni-sb.de/~baus/aims04/cameraready/P3.pdf. Accessed 10 Feb 2016
Bentzen, B.L.: Making the environment accessible to pedestrians who are visually impaired: policy research. In: Rieser, J.J., Ashmead, D.H., Ebner, F., Corn, A.L. (eds.) Blindness and Brain Plasticity in Navigation and Object Perception, pp. 313–333. Psychology Press, New York (2007)
Blake, L.: Proving his point. Star Tribune article (2011). Retrieved from: http://www.startribune.com/local/west/114846184.html?elr=KArksUUUoDEy3LGDiO7aiU. Accessed 11 Jan 2016
Faria, J., Lopes, S., Fernandes, H., Martins, P., Barroso, J.: Electronic white cane for blind people navigation assistance. In: World Automation Congress (WAC), pp. 1–7. IEEE, September 2010
Fernandes, H., Faria, J., Lopes, S. Martins, P., Barroso, J.: Electronic white cane for blind people navigation assistance. In: Proceedings of the World Automation Congress 2010, Kobe (2010)
Fernandes, H., Faria, J., Martins, P., Paredes, H., Barroso, J.: RFID mesh network as an infrastructure for location based services for the blind. In: Kurosu, M. (ed.) HCII/HCI 2013, Part V. LNCS, vol. 8008, pp. 39−45. Springer, Heidelberg (2013a)
Fernandes, H., Filipe, V., Costa, P., Barroso, J.: Location based services for the blind supported by RFID technology. In: 5th International Conference on DSAI2013 - Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion, Vigo, Spain (2013b). Computer Science Journal, Elsevier
Golledge, R.G., Marston, J.R., Loomis, J.M., Klatzky, R.L.: Stated preferences for components of a personal guidance system for non-visual navigation. J. Vis. Impair. Blindness 98(3), 135–147 (2004)
Golledge, R.G., Gärling, T.: Cognitive maps and urban travel. In: Hensher, D.A., Button, K.J., Haynes, K.E., Stopher, P.R. (eds.) Handbook of Transport Geography and Spatial Systems. Elsevier, Amsterdam (2004). chap. 28
Helal, A., Moore, S., Ramachandran, B.: Drishti: an integrated navigation system for visually impaired and disabled. In: Proceedings of the 5th International Symposium on Wearable Computer, Zurich, Switzerland (2001). Retrieved from: http://www.icta.ufl.edu/projects/publications/wearableConf.pdf. Accessed 20 Dec 2015
Hess, T.M.: Memory and aging in context. Psychol. Bull. 131(3), 383–406 (2005)
Kim, Y., Kim, C.-H., Kim, B.: Design of an auditory guidance system for the blind with signal transformation from stereo ultrasonic to binaural audio. Artif. Life Robot. 4(4), 220–226 (2000)
Kim, C.-G., Song, B.-S.: Design of a wearable walking-guide system for the blind. In: Proceedings of the 1st International Convention on Rehabilitation Engineering; Assistive Technology: in Conjunction with 1st Tan Tock Seng Hospital Neurorehabilitation Meeting, Singapore (2007)
Kirasic, K.C.: Age differences in adults’ spatial abilities, learning environmental layout, and wayfinding behavior. Spat. Cogn. Comput. 2(2), 117–134 (2002)
Klatzky, R.L., Beall, A.C., Loomis, J.M., Golledge, R.G., Philbeck, J.W.: Human navigation ability: tests of the encoding-error model of path integration. Spat. Cogn. Comput. 1, 31–65 (1999)
Liao, C.-F.: Using a Smartphone App to Assist the Visually Impaired at Signalized Intersections. Final Report. Minnesota Traffic Observatory Laboratory. Department of Civil Engineering. University of Minnesota (2012)
Loomis, J.M., Golledge, R.G., Klatzky, R.L., Marston, J.R.: Assisting way finding in visually impaired travelers. In: Allen, G.L. (ed.) Applied Spatial Cognition: From Research to Cognitive Technology, pp. 179–203. Lawrence Erlbaum Associates, Mahwah, N.J. (2007)
Marston, J.R., Loomis, J.M., Klatzky, R.L., Golledge, R.G.: Nonvisual route following with guidance from a simple haptic or auditory display. J. Vis. Impair. Blindness 101(4), 203–211 (2007)
Pai, M.C.: The neuropsychological studies of dementias in Taiwan: focus on wayfinding problems in Alzheimer’s patients. Acta Neurol. Taiwan 15(1), 58–60 (2006)
Rieser, J.J., Ashmead, D.H., Ebner, F., Corn, A.L.: Blindness and Brain Plasticity in Navigation and Object Perception. Psychology Press, New York (2007)
Rosenbaum, R.S., Gao, F., Richards, B., Black, S.E., Moscovitch, M.: Where to? Remote memory for spatial relations and landmark identity in former taxi drivers with Alzheimer’s disease and encephalitis. J. Cogn. Neurosci. 17(3), 446–462 (2005). The MIT Press
Ponchillia, P., Rak, E., Freeland, A., LaGrow, S.J.: Accessible GPS: reorientation and target location among users with visual impairments. J. Vis. Impair. Blindness 101(7), 389–401 (2007)
Tactile Graphics, Touch Graphics & University of Buffalo (ONLINE). My engineering website (2015). Retrieved from: http://www.myengineering.net/technology/3d-map-talks-to-help-blind-people-find-their-way/. Accessed 16 Dec 2015
Teshima, Y.: Three-dimensional tactile models for blind people and recognition of 3D objects by touch: introduction to the special thematic session. In: Miesenberger, K., Klaus, J., Zagler, W., Karshmer, A. (eds.) ICCHP 2010, Part II. LNCS, vol. 6180, pp. 513–514. Springer, Heidelberg (2010)
Voženílek, V., Kozáková, M., Šťávová, Z., Ludíková, L., Růžičková, V., Finková, D.: 3D printing technology in tactile maps compiling (2009). Retrieved from: http://icaci.org/files/documents/ICC_proceedings/ICC2009/html/refer/8_4.pdf. Accessed 16 Dec 2015
Want, R.: Near field communication. IEEE Pervasive Comput. 10(3), 4–7 (2011)
WHO-World Health Organization. Visual impairment and blindness. Fact Sheet n 282 (2014). Retrieved from: http://www.who.int/mediacentre/factsheets/fs282/en/. Accessed 16 Jan 2016
WHO-World Health Organization. International Statistical Classification of Diseases and Related Health Problems (ICD-10 -Vol. 2) (2010)
Willis, S., Helal, S.: RFID information grid for blind navigation and wayfinding. In: Ninth IEEE International Symposium on Wearable Computers, pp.34–37, Galway, Ireland (2005)
Wilson, J., Walker, B.N., Lindsay, J., Cambias, C., Dellaert, F.: SWAN: system for wearable audio navigation. In: Proceedings of the 11th International Symposium on Wearable Computers, Boston, MA (2007). Retrieved from: http://sonify.psych.gatech.edu/publications/pdfs/2007ISWC-Wilson-et-al-submitted.pdf. Accessed 10 Jan 2016
UN – United Nations. Declaration on the Rights of Disabled Persons (1975). Retrieved from: http://adg.org.pt/DECLAR. Accessed 10 Jan 2016
Zelek, J.S., Holbein, M.: Wearable tactile navigation system. US Patent Application number: 11/707,031. Publication number: US 2008/0120029 A1 (2008)
Acknowledgments
This paper is supported by the project CE4Blind (Context extraction for the blind using computer vision), with project reference UTAP-EXPL/EEI-SII/0043/2014, and by a research grant with reference SFRH/BD/89759/2012.
© 2016 Springer International Publishing Switzerland
Rocha, T., Fernandes, H., Paredes, H., Barroso, J. (2016). Combining NFC and 3D Mapping to Enhance the Perception of Spatial Location for the Blind. In: Antona, M., Stephanidis, C. (eds.) Universal Access in Human-Computer Interaction. Users and Context Diversity. UAHCI 2016. Lecture Notes in Computer Science, vol. 9739. Springer, Cham. https://doi.org/10.1007/978-3-319-40238-3_58
Print ISBN: 978-3-319-40237-6
Online ISBN: 978-3-319-40238-3