1 Introduction

Since 1990, the Americans with Disabilities Act (ADA) has mandated that environments be accessible to people with disabilities (Bentzen 2007; ADA 1990). In 1993, the United Nations (UN) published The Standard Rules on the Equalization of Opportunities for Persons with Disabilities, identifying accessibility as a key area for equal participation (UN 1993).

However, with regard to physical accessibility, blind people are one of the most vulnerable groups. Defining and specifying navigation systems for them is considerably difficult, not only because of the risk of collision due to insufficient information, but also because of the complexity and time involved in planning alternative routes around obstacles (Loomis et al. 2007).

Visual disability has a major impact on individuals’ quality of life, affecting not only their ability to work and develop personal relationships but, most importantly, their mobility. There are 285 million people with visual impairments worldwide (WHO 2014): 39 million are blind and 246 million have low vision. According to the World Health Organization (WHO), in its International Classification of Diseases, version 10, there are four levels of visual function: normal vision, moderate visual impairment, severe visual impairment and blindness (WHO 2010).

Due to the specific characteristics of their disability, blind people cannot interpret visual stimuli, which are very important for interpreting information about one’s spatial location. At the same time, navigation systems are generally designed for people without disabilities, i.e., with no sensory impairment. The blind are not the only group with difficulties in navigation and wayfinding: so are elderly people (Hess 2005; Kirasic 2002), the visually impaired (Helal et al. 2001; Golledge et al. 1996), and people with dementia or Alzheimer’s disease (Rosenbaum et al. 2005; Pai 2006). Therefore, it is commonly accepted that there is a need to study solutions that can overcome these spatial barriers by creating forms of tactile learning with the same goal. Blind people can very easily recognize three-dimensional shapes through tactile sensation (Teshima 2010).

In this context, we present a navigation solution for spatial location recognition based on 3D maps and NFC technology, under development in project CE4BLIND (UTAP-EXPL/EEI-SII/0043/2014). This solution aims to allow blind users to perceive their spatial location through tactile stimulation, complemented by a mobile application that provides spatial information. The paper is structured as follows: a background section, where the main concepts of navigation for the blind are presented; the proposed model and its description (3D map features and NFC technology); and, finally, conclusions and future work.

2 Navigation for the Blind

Broadly, a navigation system is an artefact that enables a user to follow a predetermined path between an explicit origin and destination. Thus, the system needs to know the person’s position and orientation continuously, with respect to the environment, all the way to the final destination (Aslan and Krüger 2004; Rieser 2007). Furthermore, an effective navigation system aims to ensure the best path according to a specific variable, such as shortest distance, time, minimum cost or type of road (Teshima 2010).

In the literature we can find several research contributions using Geographic Information Systems (GIS) and Global Positioning System (GPS)-based navigation systems for the visually impaired (Golledge et al. 2004; Helal et al. 2001; Ponchillia et al. 2007; Blake 2011), as analyzed in the work of Teshima (2010). Some present non-visual spatial displays, for example: auditory (Kim et al. 2000; Marston et al. 2007); haptic (Loomis et al. 2007; Marston et al. 2007); and/or virtual acoustic displays (Kim and Song 2007).

Another technology used is Radio Frequency Identification (RFID) tagging, a solution presented in different studies (Willis and Helal 2005). However, some authors, such as Liao (2012), highlight the disadvantages of this technology: an RFID information grid requires short-range communication (7 to 15 cm) and a high density of tags (30 cm, or 12 in, apart).
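To illustrate the scale of such an installation, a back-of-the-envelope sketch, assuming a uniform grid at the 30 cm spacing Liao reports (one tag per grid cell), shows how quickly the tag count grows even for a small space:

```python
import math

def rfid_tags_needed(area_m2: float, spacing_m: float = 0.30) -> int:
    """Approximate number of RFID tags needed to cover an area with a
    uniform grid at the given spacing (one tag per spacing x spacing cell)."""
    return math.ceil(area_m2 / (spacing_m ** 2))

# A modest 10 m x 2 m corridor already needs on the order of:
print(rfid_tags_needed(10 * 2))  # 223 tags
```

This density, combined with the short read range, is what makes dense RFID grids costly to deploy and maintain.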

Other studies proposed several assistive technologies to provide navigation assistance to the blind. For example, Wilson et al. (2007) presented a wearable audio navigation system that uses GPS technology, a digital compass, cameras and a light sensor to transmit 3D audio cues that could help not only the blind but also the visually impaired. Another example is a wearable tactile belt with GPS, a compass, an inertial sensor, a battery and small motors (Zelek and Holbein 2008); this belt provided effective navigational help for people with or without disabilities. Kim et al. (2010) presented an electronic cane with an integrated camera, a ZigBee wireless radio and an RFID tag reader.

In their work, Voženílek et al. (2009) analysed the characteristics of interpretation and perception of geospace using tactile maps based on 3D printing. In this context, they present three types of tactile maps (all coloured):

  • “Map of type A is a tactile map printed by 3D printing technology (Contex 3D printers) as traditional relief tactile map with 5 mm thick background using both positive and negative relief with labelling by Braille letters.

  • Map of type B is an inverse form of tactile map printed by 3D printing technology which will be used for casting type A tactile maps.

  • Map of type C is a sound tactile map derived from map of type A posed onto box with digital voice records of geoinformation (attributes, navigations etc.) activated by touch on maps surface” (Voženílek et al. 2009).

Smartphone applications to assist the blind are not a new technology. Liao (2012) presented a work that uses such equipment to help the blind at signalized intersections.

Tactile Graphics, Touch Graphics and the University of Buffalo presented a project that uses a technology similar to the one proposed here: a 3D mapping system that works with the tactile sense. In this project, 3D building models were developed on a horizontal map, with sensing wires connected to a computer that helps users localize places. Users put their fingers on the buildings; the system reads the pressure and sends signals to a computer, which responds with an auditory stimulus announcing the building’s name and its particular paths (Fig. 1) (Tactile Graphics et al., ONLINE).

Fig. 1. 3D map from Tactile Graphics, Touch Graphics and the University of Buffalo (Tactile Graphics et al., ONLINE).

3 The Proposed Model

In the model proposed in this work, two technologies are combined: 3D mapping, which provides a tactile stimulus, and Near Field Communication (NFC), which stimulates the auditory sense. The objective is to enhance the spatial perception of the blind. A more detailed description is presented in the following sections.

The overall setup assumes that the location of the user is estimated from a combination of several inputs, namely a Global Navigation Satellite System (GNSS) and Radio Frequency Identification (RFID). The user carries an electronic white cane, such as the one developed in the Blavigator prototype (Faria et al. 2010; Fernandes et al. 2013a, b), which senses tags in a specific area of interest (such as tourist locations). The tags are arranged in a topology of connected lines and clusters, which together compose a network of safe paths and points of interest. Locally, a QR code physically placed at each point of interest also provides information regarding that specific spot.
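The topology described above can be thought of as a graph whose nodes are tagged points of interest and whose edges are the connecting lines of safe paths. A minimal sketch, with purely hypothetical tag identifiers and connections, shows how a route with the fewest tag-to-tag hops could be derived from such a network using breadth-first search:

```python
from collections import deque

# Hypothetical network of safe paths: each tag identifier maps to its
# directly connected neighbours (the "lines" of the topology).
SAFE_PATHS = {
    "entrance": ["hall"],
    "hall": ["entrance", "chapel", "garden"],
    "chapel": ["hall", "monument"],
    "garden": ["hall", "monument"],
    "monument": ["chapel", "garden"],
}

def shortest_safe_route(start: str, goal: str) -> list:
    """Breadth-first search over the tag network: returns the route
    with the fewest tag-to-tag hops, or [] if no route exists."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        route = queue.popleft()
        if route[-1] == goal:
            return route
        for nxt in SAFE_PATHS[route[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(route + [nxt])
    return []

print(shortest_safe_route("entrance", "monument"))
```

In the real system the edge weights would reflect distances or accessibility costs rather than hop counts, but the graph abstraction is the same.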

However, the current model focuses on the contribution that 3D mapping can make to creating a mental map of the environment before navigation itself occurs.

Concerning the 3D mapping, it can be created from existing blueprints or floor plans and modelled with a 3D software tool. In Fig. 2, we can see the first version of the proposed model, with several elements modelled.

Fig. 2. First version of the 3D map model.

The second version was created with the intention of simplifying the model (Fig. 3).

Fig. 3. Second version of the 3D map model.

The different heights of the various points of interest enable easy and fast tactile interpretation by a blind user. Combining different textures on the various elements improves memorization and the usability of the attached labels, which identify the areas of special interest (Fig. 4).

Fig. 4. Detail view of the distinctive cemetery elements (before textures).

The implementation of this map (or sculpture) also creates visual value for sighted users, as it provides an excellent catalogue, grouped by different colours, of the points of interest available in the infrastructure that can be visited. This accessible 3D map was designed with a focus on universal design.

Specifically, the main objective was to create a 3D model to help blind people in their spatial perception. In this context, a floor plan in Computer-Aided Design (CAD) format was used. However, this blueprint needed to be adapted to our modelling needs.

First, a graphics tool was used to erase unnecessary elements (such as adjacent buildings and urban furniture). This step was very important to highlight important areas and hide less important ones. Defining the level of detail of the model was also important, because too much detail could lead to an enormous number of elements to be perceived, confusing the blind user. With this software, other elements were added to delimit the model (such as the background, map limits and sections), and complex elements were redesigned following the original layout. Areas with similar elements were joined into groups to facilitate their future recognition. All this redesign was made to simplify the model and make it ‘lighter’, providing a model that is as “low poly” as possible (Fig. 5).

Fig. 5. Top view details.

After this process, the file was exported to 3D modelling software, where the model was resized (width: 84 cm, length: 118 cm, height: 4 cm). After the solid faces were created, they were extruded with a different value for each previously defined group, so that the different types of elements could be properly recognized. The maximum height was set to 4 cm.
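The per-group extrusion step reduces to a mapping from element groups to heights, clamped to the 4 cm maximum. The sketch below illustrates the idea; the group names and height values are assumptions for illustration, not the values used in the actual model:

```python
# Illustrative extrusion heights (cm) per element group. The values are
# assumptions, chosen so that each group is tactilely distinguishable.
GROUP_HEIGHTS_CM = {
    "background": 0.5,
    "paths": 1.0,
    "vegetation": 2.0,
    "buildings": 3.0,
    "points_of_interest": 4.0,
}

MAX_HEIGHT_CM = 4.0  # maximum height defined for the model

def extrusion_height(group: str) -> float:
    """Return the extrusion height for a group, clamped to the maximum;
    unknown groups stay at the base level (0.0 cm)."""
    return min(GROUP_HEIGHTS_CM.get(group, 0.0), MAX_HEIGHT_CM)

print(extrusion_height("buildings"))  # 3.0
```

Keeping the heights in a single table like this makes it easy to tune the tactile contrast between groups without re-editing individual faces.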

As the model follows universal design, it is intended to be used as a guide for all tourists (not only the blind), so different colours and textures were added to the different elements.

In terms of audio information, the model proposes the use of NFC technology embedded in the physical map to provide audio feedback. Near Field Communication (NFC) is a set of communication protocols that enable devices to establish communication when brought close to each other (about 4 cm or less). Pervasive computing research has explored the potential benefits of connecting information stored in the virtual world with elements present in the physical world (Want 2011). A typical application is communication between two devices for file sharing; another is using a mobile device to read an NFC-enabled credit card or tag. The latter is the feature used in the proposed model.

Using an NFC reader embedded in the electronic white cane, the blind user can obtain contextual information about each specific point of interest marked on the map. The audio information is delivered by a mobile application, using information stored on the NFC tags. The tags are placed physically on the 3D map, at strategic places corresponding to special points of interest, as Fig. 6 suggests. This helps the user create a mental map of the features to visit, or to expect, in the environment in which he or she intends to navigate.
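On the application side, this behaviour reduces to resolving the identifier read from an NFC tag to the stored description and speaking it. A minimal sketch, in which the tag identifiers and description texts are hypothetical:

```python
# Hypothetical mapping from NFC tag identifiers to the audio text the
# mobile application should speak for each point of interest.
TAG_DESCRIPTIONS = {
    "tag-01": "Main entrance. The information desk is to your right.",
    "tag-02": "Central garden. Benches are located along the path.",
}

def on_tag_read(tag_id: str) -> str:
    """Resolve a scanned tag to its audio message. In the real application
    this text would be passed to a text-to-speech engine."""
    return TAG_DESCRIPTIONS.get(tag_id, "Unknown point of interest.")

print(on_tag_read("tag-01"))
```

Storing only an identifier on the tag and keeping the descriptions in the application keeps the tags cheap and allows the texts (or languages) to be updated without rewriting the physical map.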

Fig. 6. Global model of the use of the technologies (3D mapping, NFC, RFID, iBeacon).

4 Conclusions and Future Work

This paper presented a model to enhance the perception of spatial location for the blind, combining NFC technology and 3D mapping. After analysing related work and the technologies presented there as options, an innovative solution is proposed that connects new and old technologies, which we believe constitutes an interesting and usable solution for blind people. As future work, the design of the textures and usability testing with blind users will be carried out.