
Wayfinding

Web Accessibility

Part of the book series: Human–Computer Interaction Series ((HCIS))

Abstract

Wayfinding is a fundamental ability in the daily lives of people with disabilities. People with visual impairments have difficulty finding and following an appropriate route, while wheelchair users need to find accessible routes without gaps or stairs. Wayfinding systems allow them to navigate indoor and outdoor environments seamlessly, supporting their daily mobility to schools, offices, and other places of interest. This chapter introduces the technologies that enable such wayfinding systems to assist people with disabilities.



Author information

Correspondence to Daisuke Sato.


Copyright information

© 2019 Springer-Verlag London Ltd., part of Springer Nature

About this chapter

Cite this chapter

Sato, D., Takagi, H., Asakawa, C. (2019). Wayfinding. In: Yesilada, Y., Harper, S. (eds) Web Accessibility. Human–Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-4471-7440-0_34

  • DOI: https://doi.org/10.1007/978-1-4471-7440-0_34

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-7439-4

  • Online ISBN: 978-1-4471-7440-0

  • eBook Packages: Computer Science (R0)
