1 Introduction

Mobile phone application development has come a long way from its first days on monochrome screens. Today, sophisticated features are available and there is a large number of platforms on which to develop new software. Yet, even with smartphones having become a commodity tool and household appliances coming online through the Internet of Things (IoT) movement, progress is still slow in places. Common issues include the lack of complete application implementations, as highlighted by Paganelli et al. [1].

IoT technologies and applications are still in their infancy [2], and the academic community still has a number of topics to investigate in depth before the area is thoroughly covered. Although IoT was initially meant to describe a network of radio-frequency-enabled devices, its definition has since grown to include interoperable devices and covers global network infrastructures from traditional cloud computing [3] to small embedded ad hoc networks in fog computing [4]. Interaction is first and foremost based on touch-screen input, and feedback mechanisms rely heavily on audio and visual cues [5]. The IoT movement has removed the barrier for interpreting sensor data, and innovative solutions may exploit such feeds [6].

According to Perera et al. [7], most IoT research during the last two decades has focused on prototypes, systems and solutions with a limited number of data sources. As the technology matures, however, there is a growing need for solutions that are context-aware and able to utilise a large number of sensors, and with this, the need to develop solutions that implement a strong core architecture while remaining flexible and modular enough to be combined with other IoT solutions. Office buildings in particular are currently being enriched with sensor data, and mobile ad hoc networks emerge in which such sensors and actuators can be added and removed in an ad hoc manner [8]. Context-aware solutions and applications have been around for years and are often the target of research. The ability to successfully enrich applications with contextual dimensions can markedly increase their chances of success. Building on previous achievements and utilising context as a source of information is vital for user adaptation, interpretation and interface tailoring [9].

Building on Zhang et al. [8], the motivation in the field of information visualisation is to convey information efficiently to the user, so that the user can perform a task as efficiently and with as little cognitive effort as possible [10]. Users are often overwhelmed by the amount of information they must perceive. Designing a user-friendly interface therefore involves prioritising the information that is most crucial and that the user must be able to perceive. Especially in stressful situations, it is important to give clear, purposeful instructions. Current established solutions fail in one or more of these respects.

Against this backdrop, our research investigates a novel exploitation of sensor-based feedback on smartphones and the use of haptic feedback as the primary means of navigation. Building upon general principles [11], now widely accepted through cloud computing and service-oriented architecture, we scale down from large-scale distributed systems and focus on indoor mobile and ad hoc distributed systems. Here, smartphones, wearables and IoT-based sensors are widespread and actively engage in communication on their own and their users' behalf. The motivation for the project emerged from the case of developing an evacuation application for fire emergencies, a situation in which visual and audio-based feedback may be difficult to perceive. However, the approach can be extended to a general navigation aid for people who are visually impaired.

The rest of the paper is organised as follows. First, we introduce the scenario of using smartphones for navigation. Following this, we present the prototype design and architecture, before presenting and discussing the results. Towards the end we round off with our concluding remarks and point out possibilities for future work.

2 Scenario of Using Smartphones for Navigation

Smartphones have become the predominant personal assistant and are carried by the vast majority of the population at any given time. Accordingly, they are actively used as a vital source of information, and notifications are the most used feature for invoking the user's attention. This relates to new information, signs of danger and social interaction alike. A study by Aranda et al. [12] reports that when a user's context can be determined correctly and a notification helps them accomplish a task, it is welcomed. Furthermore, notifications need to be actionable in the moment, and hence this is an important design consideration to take forward in this study.

The use case for this mobile application is a situation where the user is located in a larger building (e.g. a hotel or office building) in which a fire has erupted. The building typically contains several floors and potentially several possible emergency exits. The building also has a number of sensors for various purposes, such as smoke detectors, motion detectors and temperature sensors. Data from the smoke detectors represent the level of danger associated with moving around in the area near each detector, while the temperature sensor data indicate the distance to the actual fire. These data are useful for monitoring how the fire develops, but are also important for determining which areas of the building the users should avoid when evacuating. Data from the motion detectors can be used to determine the number of people present in the building and where potential bottlenecks are, for example due to people gathering in stairways.

The idea is that the application calculates a safe route out of the building based on the sensor data. The route should be calculated from the user's current position, following the shortest safe route out of the building. However, the route calculation itself is out of scope for this paper and will be discussed in later work. Here, it is assumed that such a route has already been calculated, and that the remaining issue is how to convey it to the user in an efficient way. Furthermore, the application must be able to convey this information to a user who is potentially blinded by darkness, smoke from the fire or other similar hazards. Navigation must therefore be facilitated through the smartphone without relying solely on visual aids.

3 Prototype Design and Architecture

In the aforementioned scenario, the mobile application is the primary guidance aid for the user in the emergency situation. The core functionality of the application focuses on guiding the user out of the building or towards the next given waypoint. To achieve this, the design principles, UI guidance and system architecture include designated features to support multimodal interaction and communication with the user.

The prototype application was implemented with the Android Framework [13], targeting Android SDK version 7.0 Nougat and following the design principles of the Google code style guidelines and material design. The architecture is based on the standard Android model-view-controller (MVC) framework implementation [14] with a rigid separation of concerns between the view, its related logic and the algorithmic functions. The architecture of these classes and components is visualised together with the sensor communication and manipulation packages in Fig. 1. This component-based approach makes the prototype suitable for being embedded into other applications, and for being further expanded into a framework component for future sensor integration.
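As a minimal illustration of this separation of concerns (with illustrative names, not the prototype's actual classes), the view layer can depend only on a small callback interface, keeping the algorithmic logic reusable as a component in other applications:

interface NavigationFeedback {
    // Controller-to-view notification carrying the interpreted heading offset.
    fun onHeadingOffset(offsetDegrees: Float)
}

class NavigationController(private val feedback: NavigationFeedback) {
    // Algorithmic layer: receives already filtered sensor values and notifies the view
    // layer, which decides whether to render an arrow, vibrate, or speak an instruction.
    fun onFilteredBearing(bearingDegrees: Float, waypointDegrees: Float) {
        feedback.onHeadingOffset(waypointDegrees - bearingDegrees)
    }
}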

Fig. 1. Prototype class diagram architecture

The current prototype, Fig. 1, implements functionality for navigation through haptic feedback. Once a waypoint is set and tracking of direction is enabled, the application helps the user to navigate. Readings from the gyroscope sensor are interpreted in the GyroscopeManager, where the bearing of the user is calculated and compared to the waypoint. If the user is off track, haptic feedback is initiated to indicate that s/he is facing the wrong direction. The HapticNavigatorActivity controls the haptic feedback, which is intensified the further away from the correct direction the user is. When the user faces the correct direction, the haptic feedback stops, indicating success. To calculate this information, the gyroscope is used as the primary sensor. Due to the very high sampling rate of the gyroscope sensor (on average one sample every 0.06 s), a smoothing filter and a mean filter are applied to even out the results and map them to human hand movement. In addition, three complementary filters using orientation, rotation matrices and quaternions, respectively, are applied for sensor fusion measurements. A Kalman filter based on quaternions and an allowed offset of two degrees are applied to adjust sensitivity and tolerate slight hand movements while the user is walking around. All logic related to these filtering, smoothing and calculation operations is contained in a package collection of filtering and rotation classes. A Datalogger outputs log data to file for later analysis, and individual adjustments of sensor sensitivity can be made in the ConfigurationActivity.
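A minimal Kotlin sketch of this feedback logic is given below; it is illustrative rather than the prototype's exact code, with a simple exponential smoothing step standing in for the smoothing and mean filters, and with the smoothing factor and pulse cap chosen as example values. Only the two-degree tolerance is taken from the description above.

import kotlin.math.abs
import kotlin.math.min

const val TOLERANCE_DEG = 2.0   // allowed offset taken from the description above
const val ALPHA = 0.1           // illustrative smoothing factor for the ~0.06 s sample stream

// Exponential smoothing standing in for the prototype's smoothing/mean filters.
fun smooth(previous: Double, sample: Double): Double =
    previous + ALPHA * (sample - previous)

// Smallest angular difference between the user's bearing and the waypoint bearing, in degrees.
fun angularOffset(bearingDeg: Double, waypointDeg: Double): Double {
    val diff = abs(bearingDeg - waypointDeg) % 360.0
    return if (diff > 180.0) 360.0 - diff else diff
}

// Maps the offset to a vibration pulse length in milliseconds: no feedback inside the
// two-degree tolerance, longer pulses the further the user turns away (capped at 500 ms).
fun pulseMillis(bearingDeg: Double, waypointDeg: Double): Long {
    val offset = angularOffset(bearingDeg, waypointDeg)
    if (offset <= TOLERANCE_DEG) return 0L
    return min(500.0, offset / 180.0 * 500.0).toLong()
}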

In addition, we explored the possibilities for indoor navigation using Bluetooth beacons as a means of way-finding in this mobile ad hoc communication scenario. Figure 2 shows the architecture for this implementation. The core of this test is the RangingActivity, which calculates the bearing and distance to the next announced beacon. All beacon information and the associated reading of sensor information are maintained in the BeaconListAdapter, Beacon and Calibration classes. The CompassActivity class combines this with readings from the gyroscope. Visual feedback to the user is controlled by the GaugeRotation implementation class.
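The bearing and distance calculation itself is not detailed here; as an illustration of how a ranging component can estimate the distance to a beacon, the sketch below applies the standard log-distance path-loss model to the received signal strength. The function name and parameter values are our own assumptions, not the prototype's implementation.

import kotlin.math.pow

// Log-distance path-loss model: txPowerDbm is the calibrated RSSI at 1 m, and the
// path-loss exponent is typically between 2 and 4 indoors. Values are illustrative.
fun estimateDistanceMetres(rssiDbm: Int, txPowerDbm: Int, pathLossExponent: Double = 2.5): Double =
    10.0.pow((txPowerDbm - rssiDbm) / (10.0 * pathLossExponent))

// Example: a beacon calibrated to -59 dBm at 1 m and received at -75 dBm is estimated
// to be roughly 4-5 m away.
fun main() {
    println(estimateDistanceMetres(rssiDbm = -75, txPowerDbm = -59))
}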

Fig. 2. Beacon application architecture

4 Results and Discussion

The project is anchored in core Android development architecture and specifications as well as in universal design principles from computer science and mobile development research [14]. Following Perera et al. [7], a loosely coupled, modular and standardised architecture strengthened the built solution and exposes no single point of failure. Building on Android architecture components and patterns facilitates a robust design with testable and maintainable applications. Accessibility features are incorporated into the application design following the API conventions of the model-view-view-model (MVVM) pattern, and these Android features support a backward-compatible approach out of the box, which allows the focus to be put on application features.
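A minimal sketch of what the MVVM convention implies in this setting is shown below, assuming the Android Architecture Components ViewModel and LiveData classes; the class and property names are illustrative and not taken from the prototype.

import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel

// Illustrative ViewModel exposing the interpreted bearing as observable state, so the
// view only renders what it observes and the sensor logic stays testable in isolation.
class NavigationViewModel : ViewModel() {
    private val mutableBearing = MutableLiveData<Float>()
    val bearing: LiveData<Float> get() = mutableBearing

    // Called from the sensor layer with an already filtered bearing in degrees.
    fun onBearingUpdated(degrees: Float) {
        mutableBearing.postValue(degrees)
    }
}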

4.1 Application Prototype

The evacuation route is presented to the user through several modalities: visual cues in the form of an arrow showing the direction to take, Fig. 3, possibly supplemented with explicit verbal information, and audio-based feedback giving instructions. In a fire, the potentially stressful sound of fire alarms and the difficulty of seeing anything due to smoke make such information even more challenging to perceive; hence the haptic feedback provided should aid the user in navigating even without the visual and audio-based feedback.

Fig. 3. Navigation application screenshot

The initial prototype has a simple visual design, showing arrows pointing in the direction the user should move in order to follow the correct path out of the building, Fig. 3. The main issue, then, is how to use haptic feedback to convey similar information, or how to give the user a sense of direction using haptic feedback alone.
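One possible encoding, given here as an assumption rather than the scheme chosen in the prototype, is to map left and right turns to distinct vibration patterns using the pattern-based Vibrator call available on Nougat-era Android:

import android.content.Context
import android.os.Vibrator

// Illustrative direction cues: one long pulse for "turn left", two short pulses for
// "turn right", using the pre-API-26 call vibrate(pattern, repeat), where repeat = -1
// means the pattern is played once.
class DirectionBuzzer(context: Context) {
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    fun signalLeft() {
        vibrator.vibrate(longArrayOf(0, 400), -1)           // wait 0 ms, vibrate 400 ms
    }

    fun signalRight() {
        vibrator.vibrate(longArrayOf(0, 120, 120, 120), -1) // two 120 ms pulses, 120 ms apart
    }
}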

To ensure correct information and to assess the bearing given by the application against the received and interpreted gyroscope information, the application was also tested following a white-box testing approach, Fig. 4. In this round of verification, the received information was presented on screen and interpreted by the beta-testers. Measurement readings from the sensor were displayed, giving the bearing and the relative position in the X, Y and Z dimensions. Following, and further elaborating on, the approach by Aranda et al. [12], this interpreted information was compared with the actual sensor positioning. The combination of a simple interface and multimodal interaction in the final prototype (Fig. 3) underpins the conclusions by Furukawa and Yang [10] about keeping cognitive load as small as possible.
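For this kind of verification it is useful to log raw and interpreted values side by side; a simple sketch of such a logger is given below (the file name and record format are illustrative, not necessarily those used by the prototype's Datalogger):

import java.io.File

// Illustrative logger pairing raw gyroscope readings with the interpreted bearing, so
// the two can be compared offline during white-box verification.
class BearingLogger(private val logFile: File = File("bearing_log.csv")) {
    init {
        if (!logFile.exists()) logFile.writeText("timestamp_ms,raw_x,raw_y,raw_z,bearing_deg\n")
    }

    fun log(x: Float, y: Float, z: Float, bearingDeg: Float) {
        logFile.appendText("${System.currentTimeMillis()},$x,$y,$z,$bearingDeg\n")
    }
}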

Fig. 4. Application screenshot

Thirdly, it was important to verify the placement and manipulation of beacon information in such an ad hoc scenario. To confirm the approach and address the correctness of interpreted sensor data, sensor implementations and location-aware APIs were investigated, including AltBeacon, the Estimote SDK and the Google Nearby API. Following the Internet of Things properties laid out by Gubbi et al. [6], the ability to extract associated data from the beacon/API, offline/online connectivity and user direction mapping were given most weight in the evaluation. Figure 5 shows the manual control interface for the beacons.

Put together, the application showcases a navigation approach with beacon information as the primary driver for finding waypoints. Information is conveyed to the user through a simple visual interface, haptic feedback is given when in use, and the potential for spoken instructions is facilitated by the component-based architecture. Interesting opportunities, all in line with Furukawa and Yang [10], were explored through testing and showed that haptic feedback can be used not only for confirming the correct path, but also for signalling when the user diverges from the correct path, and even for indicating back/forth direction using a push/pull stream of haptic feedback.
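As an illustration of the push/pull idea, and purely as our own sketch rather than the tested design, pulse trains with shrinking or growing gaps can be generated in the off/on millisecond format expected by Android's pattern-based vibration call:

// Builds an off/on vibration pattern from a list of gaps between fixed-length pulses.
fun cadencePattern(gapsMs: List<Long>, pulseMs: Long = 80L): LongArray {
    val pattern = mutableListOf(0L)        // initial delay before the first pulse
    for (gap in gapsMs) {
        pattern += pulseMs                 // vibrate
        pattern += gap                     // pause
    }
    pattern += pulseMs                     // final pulse after the last gap
    return pattern.toLongArray()
}

val pullForward = cadencePattern(listOf(300L, 200L, 100L))  // accelerating cadence: move ahead
val pushBack = cadencePattern(listOf(100L, 200L, 300L))     // decelerating cadence: turn back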

Fig. 5. Adding beacon address

5 Conclusion

This paper investigates the exploitation of sensor-based feedback on smartphones and the use of haptic feedback as a means of navigation, which is particularly useful for people who have difficulties perceiving visual information. Building on the work of Aranda et al. [12], Furukawa and Yang [10] and Zhang et al. [8], we built haptic feedback into an Android prototype application. Through beta-testing of our prototype, the technological suitability has been verified. The testing reveals that haptic feedback can be used on commodity smartphones and gives further reason to implement it as a full-scale navigation solution building upon the achieved results.

6 Future Work

The approach shows promising prospects for being extended to general outdoor navigation and to existing navigation applications such as Google Maps and Waze. We will continue to extend the prototype and its testing towards these outlooks, all of which we consider worthy of future pursuit. In future work we will create a full-scale application and aim to have data representing waypoints supplied to the system through IoT devices and indoor mobile ad hoc networks. This will help calculate the safest route out of a building or navigate a given path. Using sensor data and information about possible exits, stairways and obstacles reported by users, the system with its smartphone frontend will be able to act as a full-blown navigation solution.

One of the main challenges is how to convey a sense of direction to the user solely on the basis of haptic feedback. Given that a correctly calculated path from the user's current position exists, the navigation application should be able to convey the next direction in which the user should move. Based on the current prototype, it remains to be investigated which types of vibration patterns are most intuitive to the user. For example, we are interested in whether the vibrations should be more or less intense depending on how close the user is to the correct direction. Different options need to be implemented, and a large user study is being planned to evaluate the applicability and usability of the system at full scale.