Abstract
We present a wayfinding method that assists blind people in determining the correct direction to a destination by taking a one-shot image. Signage is standard in public buildings and used to help visitors, but it offers little benefit to blind people. Our one-shot wayfinding method recognizes surrounding signage in all directions from an equirectangular image captured with a 360-degree smartphone camera. The method analyzes the relationship between detected text and arrows on signage and estimates the correct direction toward the user's destination. In other words, the method enables wayfinding for blind people without requiring either environmental modifications (e.g., Bluetooth beacons) or the preparation of map data. In a user study, we compared our method with a baseline method: a signage reader using a smartphone camera with a standard field of view. We found that our method enabled participants to determine directions more efficiently than the baseline method.
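To illustrate the core geometric idea, the sketch below shows how a detection in an equirectangular panorama can be mapped to a bearing, and how a recognized destination label might be paired with its nearest arrow. This is a minimal, hypothetical sketch of the general technique, not the authors' implementation: the `Detection` class, the nearest-neighbor pairing rule, and the example coordinates are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str   # e.g., recognized text "Gate 12" or the class "arrow"
    x: float     # horizontal pixel center in the equirectangular image
    y: float     # vertical pixel center

def pixel_to_bearing(x: float, image_width: int) -> float:
    """Map a horizontal pixel position in an equirectangular panorama to a
    bearing in degrees: 0 at the image center (the camera's facing
    direction), range [-180, 180). Each pixel column corresponds to a
    fixed angular slice because the projection is linear in longitude."""
    return (x / image_width) * 360.0 - 180.0

def nearest_arrow_bearing(text: Detection, arrows: list[Detection],
                          image_width: int) -> float:
    """Pair a recognized destination label with the closest arrow detection
    (squared pixel distance) and return that arrow's bearing."""
    arrow = min(arrows,
                key=lambda a: (a.x - text.x) ** 2 + (a.y - text.y) ** 2)
    return pixel_to_bearing(arrow.x, image_width)

# Example: a 4000-px-wide panorama with one label and two arrows.
label = Detection("Gate 12", x=3000, y=400)
arrows = [Detection("arrow", x=1000, y=420), Detection("arrow", x=2950, y=410)]
print(nearest_arrow_bearing(label, arrows, image_width=4000))  # 85.5
```

In practice the pairing step would also need the arrow's pointing direction (e.g., from an orientation classifier) to turn the sign's bearing into a walking direction, but the bearing computation above is the part that depends on the 360-degree camera.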
Notes
All communication with the participants was in their native language. In this paper, we describe any translated content in the form of “translated content”.
Acknowledgments
We would like to thank all participants who took part in our user study. We would also like to thank Japan Airport Terminal Co., Ltd. and East Japan Railway Company. This work was supported by AMED (JP20dk0310108, JP21dk0310108h0002), JSPS KAKENHI (JP20J23018), and a Grant-in-Aid for Young Scientists (Early Bird, Waseda Research Institute for Science and Engineering, BD070Z003100).
Copyright information
© 2022 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
About this paper
Cite this paper
Yamanaka, Y., Kayukawa, S., Takagi, H., Nagaoka, Y., Hiratsuka, Y., Kurihara, S. (2022). One-Shot Wayfinding Method for Blind People via OCR and Arrow Analysis with a 360-Degree Smartphone Camera. In: Hara, T., Yamaguchi, H. (eds) Mobile and Ubiquitous Systems: Computing, Networking and Services. MobiQuitous 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 419. Springer, Cham. https://doi.org/10.1007/978-3-030-94822-1_9
DOI: https://doi.org/10.1007/978-3-030-94822-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-94821-4
Online ISBN: 978-3-030-94822-1
eBook Packages: Computer Science (R0)