Abstract
This article presents a method for interacting with a handheld navigation application in which the mobile device itself is used for pointing. When the user points the device in a direction, feedback indicates whether the device is aimed at the next point in the track or beside it. The study presented here was performed to gain a better understanding of how the basic parameters of this type of interaction, such as the pointing angle and the size of the target, influence navigation performance. We applied a dual approach: a computer simulation in which additional parameters such as GPS accuracy and user behavior were varied, and an in-context study with 15 participants in a realistic outdoor setting using real location-based GPS data. The study results in general recommendations for angle intervals and for the radius of the circles surrounding the track points.
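To make the interaction model concrete, the sketch below shows one way such a pointing check could be implemented: the bearing from the user's GPS position to the next track point is compared with the device's compass heading, and the user is considered to be on target when the difference falls within a given angle interval, or to have reached the point when the distance is within the circle radius. This is a minimal Python sketch; the function names are hypothetical and the angle and radius values are illustrative placeholders, not the values recommended by the study.

import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees, 0..360) from the user's position to the target."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pointing_feedback(user_lat, user_lon, heading_deg, target_lat, target_lon,
                      half_angle_deg=30.0, target_radius_m=20.0):
    """Classify the current pointing state as 'reached', 'in_angle', or 'off_angle'.

    half_angle_deg and target_radius_m are placeholder values for illustration only.
    """
    # Inside the circle surrounding the track point: the point counts as reached.
    if haversine_distance(user_lat, user_lon, target_lat, target_lon) <= target_radius_m:
        return 'reached'
    bearing = bearing_to(user_lat, user_lon, target_lat, target_lon)
    # Smallest signed difference between the device heading and the target bearing.
    off = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    return 'in_angle' if abs(off) <= half_angle_deg else 'off_angle'

In an application loop, pointing_feedback would be called on every GPS/compass update, and the returned state mapped to the audio or vibration feedback given to the user.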
Acknowledgments
We thank the EC, which co-funds the IP HaptiMap (FP7-ICT-224675). We also thank VINNOVA for additional support.
Cite this article
Magnusson, C., Rassmus-Gröhn, K. & Szymczak, D. Navigation by pointing to GPS locations. Pers Ubiquit Comput 16, 959–971 (2012). https://doi.org/10.1007/s00779-011-0456-3