DOI: 10.1145/3373625.3417028
Research Article · Public Access

AIGuide: An Augmented Reality Hand Guidance Application for People with Visual Impairments

Published: 29 October 2020

Abstract

Locating and grasping objects is a critical task in people’s daily lives. For people with visual impairments, this task can be a daily struggle. The support of augmented reality frameworks in smartphones has the potential to overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-contained offline smartphone application that leverages augmented reality technology to help users locate and pick up objects around them. We conducted a user study to validate its effectiveness at providing guidance, compare it to other assistive technology form factors, evaluate the use of multimodal feedback, and gather feedback about the overall experience. Our results show that AIGuide is a promising technology to help people with visual impairments locate and acquire objects in their daily routine.

Supplementary Material

MP4 File (a2-aldas-demo.mp4)




Published In

      ASSETS '20: Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility
      October 2020
      764 pages
      ISBN:9781450371032
      DOI:10.1145/3373625
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. assistive technology
      2. augmented reality
      3. mixed reality
      4. mobile computing

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

Conference

ASSETS '20

      Acceptance Rates

      ASSETS '20 Paper Acceptance Rate 46 of 167 submissions, 28%;
      Overall Acceptance Rate 436 of 1,556 submissions, 28%


Article Metrics

      • Downloads (Last 12 months)483
      • Downloads (Last 6 weeks)63
      Reflects downloads up to 01 Mar 2025

Cited By
      • (2024) SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–19. DOI: 10.1145/3654777.3676384
      • (2024) Work in Progress: Expanding Learning Opportunities in STEM Courses: The Potential of Haptic VR Laboratories for Students with and Without Visual Impairment. Towards a Hybrid, Flexible and Socially Engaged Higher Education, 149–154. DOI: 10.1007/978-3-031-52667-1_16
      • (2023) Grid Map Correction for Fall Risk Alert System Using Smartphone. Journal of Robotics and Mechatronics, 35(3), 867–878. DOI: 10.20965/jrm.2023.p0867
      • (2023) In-Place Virtual Exploration Using a Virtual Cane: An Initial Study. Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces, 45–49. DOI: 10.1145/3626485.3626539
      • (2023) Accessibility Research and Users with Multiple Disabilities or Complex Needs. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1–6. DOI: 10.1145/3597638.3615651
      • (2023) Leveraging Sensorimotor Realities for Assistive Technology Design: Bridging Smart Environments and Virtual Worlds. Proceedings of the 16th International Conference on PErvasive Technologies Related to Assistive Environments, 247–253. DOI: 10.1145/3594806.3594834
      • (2023) Take My Hand: Automated Hand-Based Spatial Guidance for the Visually Impaired. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–16. DOI: 10.1145/3544548.3581415
      • (2023) Moving Towards and Reaching a 3-D Target by Embodied Guidance: Parsimonious Vs Explicit Sound Metaphors. Universal Access in Human-Computer Interaction, 229–243. DOI: 10.1007/978-3-031-35681-0_15
      • (2022) Corridor-Walker: Mobile Indoor Walking Assistance for Blind People to Avoid Obstacles and Recognize Intersections. Proceedings of the ACM on Human-Computer Interaction, 6(MHCI), 1–22. DOI: 10.1145/3546714
      • (2022) Helping Helpers: Supporting Volunteers in Remote Sighted Assistance with Augmented Reality Maps. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 881–897. DOI: 10.1145/3532106.3533560
