Abstract
This study investigates a voice navigation system designed for visually impaired users, improving the clarity and accuracy of voice guidance by means of a virtual navigation simulator adapted from autonomous driving systems. It examines how well visually impaired individuals comprehend and operate voice-guided systems, introducing a novel voice guide model that incorporates auditory beacons for more effective guidance. Seven visually impaired participants evaluated the system, and their feedback was central to assessing its practicality and efficiency. All participants successfully reached their destinations using the model, demonstrating a substantial improvement in navigation aid for visually impaired users. This research contributes to assistive technology by offering insight into the development of more effective and user-friendly navigation systems, and it underscores the importance of user-centric design in meeting the specific needs of visually impaired individuals.
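The paper's own beacon model is not reproduced here; as an illustration of the general technique behind auditory beacons in a virtual simulator, one can map a beacon's bearing relative to the listener's heading onto stereo panning, with distance-based attenuation. The function below is a minimal sketch under those assumptions (the function name, coordinate convention, and rolloff curve are illustrative, not the authors' implementation):

```python
import math

def beacon_pan_and_gain(user_x, user_y, heading_deg, beacon_x, beacon_y):
    """Return (pan, gain) for a virtual auditory beacon.

    pan:  -1.0 (full left) .. +1.0 (full right), derived from the
          beacon's bearing relative to the listener's heading.
    gain: 0..1, attenuated with distance (simple inverse-distance rolloff).
    """
    dx, dy = beacon_x - user_x, beacon_y - user_y
    # Bearing measured clockwise from +y ("straight ahead" at heading 0).
    bearing = math.degrees(math.atan2(dx, dy))
    # Wrap the relative angle into [-180, 180) so left/right is unambiguous.
    rel = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    pan = math.sin(math.radians(rel))   # only the lateral component pans
    dist = math.hypot(dx, dy)
    gain = 1.0 / (1.0 + dist)           # fades with distance, never zero
    return pan, gain
```

In a Pygame-style loop, `pan` and `gain` would be recomputed each frame and applied as per-ear channel volumes, so the beacon appears to swing left or right as the user turns and to fade as they walk away.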
Acknowledgments
This study was supported by KAKENHI (grant numbers 21K18483, 21H00885, 23K16919, 23K17582, and 23H00996).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Erdenesambuu, D., Matsuo, M., Miura, T., Sakajiri, M., Onishi, J. (2024). Advancing Mobility for the Visually Impaired: A Virtual Sound-Based Navigation Simulator Interface. In: Miesenberger, K., Peňáz, P., Kobayashi, M. (eds) Computers Helping People with Special Needs. ICCHP 2024. Lecture Notes in Computer Science, vol 14750. Springer, Cham. https://doi.org/10.1007/978-3-031-62846-7_50
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-62845-0
Online ISBN: 978-3-031-62846-7
eBook Packages: Computer Science, Computer Science (R0)