DOI: 10.1145/3313831.3376516
Research Article

The Effectiveness of Visual and Audio Wayfinding Guidance on Smartglasses for People with Low Vision

Published: 23 April 2020

Abstract

Wayfinding is a critical but challenging task for people who have low vision, a visual impairment that falls short of blindness. Prior wayfinding systems for people with visual impairments focused on blind people, providing only audio and tactile feedback. Since people with low vision use their remaining vision, we sought to determine how audio feedback compares to visual feedback in a wayfinding task. We developed visual and audio wayfinding guidance on smartglasses based on de facto standard approaches for blind and sighted people and conducted a study with 16 low vision participants. We found that participants made fewer mistakes and experienced lower cognitive load with visual feedback. Moreover, participants with a full field of view completed the wayfinding tasks faster when using visual feedback. However, many participants preferred audio feedback because of its shorter learning curve. We propose design guidelines for wayfinding systems for people with low vision.
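The abstract does not spell out how the guidance itself is computed; the sketch below is a minimal illustration, assuming a typical turn-by-turn design in which both modalities derive from the signed bearing between the walker's current heading and the next route waypoint, rendered either as an on-screen arrow or a spoken instruction. All names (Pose, relative_bearing, visual_cue, audio_cue) and the 15-degree straight-ahead threshold are hypothetical, not the authors' implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres, floor-plan coordinates
    y: float
    heading: float  # radians, 0 = +x axis, counter-clockwise

def relative_bearing(pose: Pose, wx: float, wy: float) -> float:
    """Signed angle (radians) from the user's heading to the waypoint.
    Positive = waypoint is to the left, negative = to the right."""
    absolute = math.atan2(wy - pose.y, wx - pose.x)
    delta = absolute - pose.heading
    # Wrap the difference into (-pi, pi].
    return math.atan2(math.sin(delta), math.cos(delta))

def visual_cue(bearing: float) -> str:
    """Map the bearing to an arrow label, standing in for an AR overlay."""
    deg = math.degrees(bearing)
    if abs(deg) < 15:  # arbitrary straight-ahead tolerance
        return "arrow: straight ahead"
    return f"arrow: turn {'left' if deg > 0 else 'right'} {abs(deg):.0f} deg"

def audio_cue(bearing: float) -> str:
    """Map the same bearing to a spoken instruction."""
    deg = math.degrees(bearing)
    if abs(deg) < 15:
        return "speech: continue straight"
    return f"speech: turn {'left' if deg > 0 else 'right'}"

if __name__ == "__main__":
    user = Pose(x=0.0, y=0.0, heading=math.radians(90))  # facing +y
    b = relative_bearing(user, wx=-3.0, wy=4.0)          # waypoint ahead-left
    print(visual_cue(b))
    print(audio_cue(b))
```

In this framing, the two conditions compared in the study differ only in how the same bearing is presented to the user, which is what makes a within-subjects comparison of error rate and cognitive load meaningful.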

Supplemental Material

MP4 File



      Published In

      CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
      April 2020
10,688 pages
ISBN: 978-1-4503-6708-0
DOI: 10.1145/3313831

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. accessibility
      2. audio feedback
      3. augmented reality
      4. low vision
      5. visual feedback
      6. wayfinding

