DOI: 10.1145/3313831.3376591
CHI Conference Proceedings · Research Article

The Emerging Professional Practice of Remote Sighted Assistance for People with Visual Impairments

Published: 23 April 2020

Abstract

People with visual impairments (PVI) must interact with a world they cannot see. Remote sighted assistance (RSA) has emerged as a conversational assistive technology to support this interaction. To understand the professional practice of RSA, we interviewed assistants ("agents") who provide assistance to PVI via a conversational prosthetic called Aira (https://aira.io/). We identified four types of support these agents provide: scene description, navigation, task performance, and social engagement. We found that RSA also gives PVI an opportunity to appropriate the system as a richer conversational and social support tool. We identified patterns in how agents provide assistance and interact with PVI, along with the challenges and strategies associated with each context, and found that conversational interaction is highly context-dependent. We conclude by discussing implications for design.




Published In

CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
April 2020, 10688 pages
ISBN: 9781450367080
DOI: 10.1145/3313831
Publisher: Association for Computing Machinery, New York, NY, United States


        Author Tags

        1. assistive technology
        2. human powered accessibility
        3. remote sighted assistance
        4. visual impairment

        Qualifiers

        • Research-article

Conference

CHI '20
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%

Cited By

• (2025) 4D+ City Sidewalk: Integrating Pedestrian View into Sidewalk Spaces to Support User-Centric Urban Spatial Perception. Sensors 25(5), 1375. https://doi.org/10.3390/s25051375. Online publication date: 24-Feb-2025.
• (2024) A Simulation and Training Platform for Remote-Sighted Assistance. Sensors 24(23), 7773. https://doi.org/10.3390/s24237773. Online publication date: 4-Dec-2024.
• (2024) Human–AI Collaboration for Remote Sighted Assistance: Perspectives from the LLM Era. Future Internet 16(7), 254. https://doi.org/10.3390/fi16070254. Online publication date: 18-Jul-2024.
• (2024) Wheeler: A Three-Wheeled Input Device for Usable, Efficient, and Versatile Non-Visual Interaction. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-20. https://doi.org/10.1145/3654777.3676396. Online publication date: 13-Oct-2024.
• (2024) WorldScribe: Towards Context-Aware Live Visual Descriptions. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-18. https://doi.org/10.1145/3654777.3676375. Online publication date: 13-Oct-2024.
• (2024) "Because Some Sighted People, They Don't Know What the Heck You're Talking About:" A Study of Blind Tokers' Infrastructuring Work to Build Independence. Proceedings of the ACM on Human-Computer Interaction 8(CSCW1), 1-30. https://doi.org/10.1145/3637297. Online publication date: 26-Apr-2024.
• (2024) "I Don't Really Get Involved In That Way": Investigating Blind and Visually Impaired Individuals' Experiences of Joint Attention with Sighted People. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. https://doi.org/10.1145/3613904.3642940. Online publication date: 11-May-2024.
• (2024) Help Supporters: Exploring the Design Space of Assistive Technologies to Support Face-to-Face Help Between Blind and Sighted Strangers. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-24. https://doi.org/10.1145/3613904.3642816. Online publication date: 11-May-2024.
• (2024) Investigating Use Cases of AI-Powered Scene Description Applications for Blind and Low Vision People. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. https://doi.org/10.1145/3613904.3642211. Online publication date: 11-May-2024.
• (2024) BubbleCam: Engaging Privacy in Remote Sighted Assistance. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. https://doi.org/10.1145/3613904.3642030. Online publication date: 11-May-2024.
