DOI: 10.1145/3234695.3236344

"Siri Talks at You": An Empirical Investigation of Voice-Activated Personal Assistant (VAPA) Usage by Individuals Who Are Blind

Published: 08 October 2018

Abstract

Voice-activated personal assistants (VAPAs), such as Amazon Echo or Apple Siri, offer considerable promise to individuals who are blind, as these widely adopted platforms support non-visual interaction. However, studies have yet to focus on how these technologies are used by individuals who are blind, or on the barriers encountered during interaction. To address this gap, we interviewed fourteen legally blind adults with experience of home-based and/or mobile VAPAs. While participants appreciated the access VAPAs provided to otherwise inaccessible applications and services, they faced challenges relating to input, responses from VAPAs, and control over the information presented. User behavior also varied with the situation or context of the interaction. Implications for design are suggested to support inclusivity when interacting with VAPAs, including accounting for privacy and situational factors in design, examining ways to address concerns over trust, and synchronizing the presentation of visual and non-visual cues.




Published In

ASSETS '18: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility
October 2018
508 pages
ISBN: 9781450356503
DOI: 10.1145/3234695
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. blind individuals
    2. non-visual interaction
    3. usability challenges
    4. vapa
    5. voice activated personal assistant

    Qualifiers

    • Research-article

    Conference

    ASSETS '18

    Acceptance Rates

ASSETS '18 Paper Acceptance Rate: 28 of 108 submissions, 26%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%



    Article Metrics

• Downloads (Last 12 months): 199
• Downloads (Last 6 weeks): 20
    Reflects downloads up to 17 Feb 2025


    Cited By

• (2025) Investigating the Acceptance of Voice User Interfaces for Users With Disabilities. IEEE Access 13, 1055-1069. DOI: 10.1109/ACCESS.2024.3520149. Online publication date: 2025.
• (2025) Seven HCI Grand Challenges Revisited: Five-Year Progress. International Journal of Human–Computer Interaction, 1-49. DOI: 10.1080/10447318.2025.2450411. Online publication date: 4-Feb-2025.
• (2025) Breaking down barriers. International Journal of Human-Computer Studies 194:C. DOI: 10.1016/j.ijhcs.2024.103403. Online publication date: 1-Feb-2025.
• (2025) Limitations in speech recognition for young adults with down syndrome. Universal Access in the Information Society. DOI: 10.1007/s10209-025-01197-4. Online publication date: 15-Feb-2025.
• (2025) Voice User Interface for Designing Inclusive Games for Children with Visual Impairment and Sighted Pupils. Human-Computer Interaction. Design and Research, 89-98. DOI: 10.1007/978-3-031-80832-6_7. Online publication date: 14-Feb-2025.
• (2024) Real-Time View Assistance for the Blind using Image Processing. Journal of Innovative Image Processing 6:2, 96-109. DOI: 10.36548/jiip.2024.2.002. Online publication date: Jun-2024.
• (2024) Testing 3 Modalities (Voice Assistant, Chatbot, and Mobile App) to Assist Older African American and Black Adults in Seeking Information on Alzheimer Disease and Related Dementias: Wizard of Oz Usability Study. JMIR Formative Research 8, e60650. DOI: 10.2196/60650. Online publication date: 9-Dec-2024.
• (2024) Investigating the Integration and the Long-Term Use of Smart Speakers in Older Adults' Daily Practices: Qualitative Study. JMIR mHealth and uHealth 12, e47472. DOI: 10.2196/47472. Online publication date: 12-Feb-2024.
• (2024) Aligning Visual Prosthetic Development With Implantee Needs. Translational Vision Science & Technology 13:11, 28. DOI: 10.1167/tvst.13.11.28. Online publication date: 21-Nov-2024.
• (2024) EasyAsk: An In-App Contextual Tutorial Search Assistant for Older Adults with Voice and Touch Inputs. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8:3, 1-27. DOI: 10.1145/3678516. Online publication date: 9-Sep-2024.
