
AIGuide: Augmented Reality Hand Guidance in a Visual Prosthetic

Published: 19 May 2022

Abstract

Locating and grasping objects is a critical task in people’s daily lives. For people with visual impairments, it can be a daily struggle. Augmented reality frameworks built into smartphones can overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-contained smartphone application that leverages augmented reality technology to help users locate and pick up objects around them. We conducted a user study to investigate the effectiveness of AIGuide as a visual prosthetic for providing guidance, compare it to other assistive technology form factors, examine the use of multimodal feedback, and gather feedback about the overall experience. We collected performance data and participants’ reactions, and analyzed videos to understand how users interacted with the nonvisual smartphone user interface. Our results show that AIGuide is a promising technology for helping people with visual impairments locate and acquire objects in their daily routine. The benefits of AIGuide may be enhanced further by appropriate interaction design.
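
The anchoring capability the abstract credits to smartphone AR frameworks is what ARKit exposes through reference-object detection: the framework matches a previously scanned object in the camera feed and pins a persistent anchor to it in world coordinates. As a rough illustration only (a minimal sketch, not the authors’ implementation; the ObjectLocator class and the "GuideObjects" asset group are hypothetical names), the following Swift code detects a pre-scanned object and reads off its position, which a guidance loop could compare against the camera pose each frame to drive audio or haptic cues:

    import ARKit

    // Illustrative sketch only: detect a pre-scanned reference object
    // with ARKit and report where it sits in world coordinates. The
    // class name and the "GuideObjects" asset group are assumptions,
    // not taken from the AIGuide paper.
    final class ObjectLocator: NSObject, ARSessionDelegate {
        private let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            // Load .arobject scans from an asset catalog group;
            // detection fires once ARKit matches one in the camera feed.
            config.detectionObjects = ARReferenceObject.referenceObjects(
                inGroupNamed: "GuideObjects", bundle: nil) ?? []
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let objectAnchor as ARObjectAnchor in anchors {
                // The anchor's transform gives the object's pose; a
                // guidance loop would compare it with the current camera
                // transform each frame to compute direction cues.
                let position = objectAnchor.transform.columns.3
                print("Detected \(objectAnchor.referenceObject.name ?? "object")",
                      "at", position.x, position.y, position.z)
            }
        }
    }

Because the anchor lives in session-stable world coordinates, an app built this way can keep issuing direction cues even when the object momentarily leaves the camera frame.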


Published In

ACM Transactions on Accessible Computing, Volume 15, Issue 2
June 2022
288 pages
ISSN:1936-7228
EISSN:1936-7236
DOI:10.1145/3530301

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 19 May 2022
Online AM: 02 March 2022
Accepted: 01 December 2021
Revised: 01 October 2021
Received: 01 June 2021
Published in TACCESS Volume 15, Issue 2


Author Tags

  1. Mobile assistive technology
  2. Augmented reality
  3. Nonvisual guidance interaction
  4. People with visual impairments

Qualifiers

  • Research-article
  • Refereed

Funding Sources

  • National Science Foundation (NSF)


Cited By

  • (2024) Improving Usability of Data Charts in Multimodal Documents for Low Vision Users. Proceedings of the 26th International Conference on Multimodal Interaction, 498-507. https://doi.org/10.1145/3678957.3685714. Online publication date: 4-Nov-2024.
  • (2024) Viiat-Hand: A Reach-and-Grasp Restoration System Integrating Voice Interaction, Computer Vision, Auditory and Tactile Feedback for Non-Sighted Amputees. IEEE Robotics and Automation Letters 9, 10, 8674-8681. https://doi.org/10.1109/LRA.2024.3448218. Online publication date: Oct-2024.
  • (2024) A review of sonification solutions in assistive systems for visually impaired people. Disability and Rehabilitation: Assistive Technology 19, 8, 2818-2833. https://doi.org/10.1080/17483107.2024.2326590. Online publication date: 12-Mar-2024.
  • (2024) Exploring the role of computer vision in product design and development: a comprehensive review. International Journal on Interactive Design and Manufacturing (IJIDeM). https://doi.org/10.1007/s12008-024-01765-7. Online publication date: 14-Mar-2024.
  • (2023) SpaceX Mag. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 2, 1-36. https://doi.org/10.1145/3596253. Online publication date: 12-Jun-2023.
  • (2023) Enabling Customization of Discussion Forums for Blind Users. Proceedings of the ACM on Human-Computer Interaction 7, EICS, 1-20. https://doi.org/10.1145/3593228. Online publication date: 19-Jun-2023.
  • (2023) Enabling Efficient Web Data-Record Interaction for People with Visual Impairments via Proxy Interfaces. ACM Transactions on Interactive Intelligent Systems 13, 3, 1-27. https://doi.org/10.1145/3579364. Online publication date: 11-Sep-2023.
  • (2023) Inclusive Augmented and Virtual Reality: A Research Agenda. International Journal of Human–Computer Interaction 40, 20, 6200-6219. https://doi.org/10.1080/10447318.2023.2247614. Online publication date: 27-Aug-2023.
  • (2023) Inclusive AR/VR: accessibility barriers for immersive technologies. Universal Access in the Information Society 23, 1, 59-73. https://doi.org/10.1007/s10209-023-00969-0. Online publication date: 2-Feb-2023.
  • (2022) Grid-Coding: An Accessible, Efficient, and Structured Coding Paradigm for Blind and Low-Vision Programmers. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1-21. https://doi.org/10.1145/3526113.3545620. Online publication date: 29-Oct-2022.
