DOI: 10.1145/3517428.3550362
Poster

Improving Image Accessibility by Combining Haptic and Auditory Feedback

Published: 22 October 2022

Abstract

Advancements in accessibility have led to mobile applications that help blind or low vision (BLV) people access surrounding information independently. Unfortunately, access to visual information such as images remains limited. Previous research demonstrated that spatial interaction can help BLV users build a mental model of the relative locations of objects in an image. As haptics has recently become a core component of modern smartphones, we extend this prior research by designing three prototypes that use haptic feedback to reveal object locations in images. We evaluate these techniques in terms of user experience and the ability of BLV users to locate multiple objects in images. Results of a preliminary study with seven BLV users suggest that the proposed haptic feedback prototype, paired with auditory notifications that identify people and an auditory caption, can provide a more accessible and engaging image experience.
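For readers wondering what the haptic-plus-auditory interaction described above might look like in code, the sketch below pairs a haptic pulse with a spoken label as a finger moves over annotated regions of an image, using Apple's UIFeedbackGenerator and AVSpeechSynthesizer APIs. This is an illustrative sketch only, not the authors' implementation: the ImageObject type, its bounding boxes, and the ExplorableImageView subclass are assumptions introduced for the example.

    import UIKit
    import AVFoundation

    // Hypothetical annotation: a label plus a bounding box in view coordinates.
    struct ImageObject {
        let label: String
        let isPerson: Bool
        let bounds: CGRect
    }

    // Illustrative view that vibrates and speaks when a finger enters an
    // annotated region, echoing the haptic + auditory design in the abstract.
    final class ExplorableImageView: UIImageView {
        var objects: [ImageObject] = []

        private let impact = UIImpactFeedbackGenerator(style: .medium)
        private let notify = UINotificationFeedbackGenerator()
        private let speech = AVSpeechSynthesizer()
        private var currentLabel: String?

        override init(image: UIImage?) {
            super.init(image: image)
            isUserInteractionEnabled = true   // UIImageView ignores touches by default
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isUserInteractionEnabled = true
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let point = touches.first?.location(in: self) else { return }
            let hit = objects.first { $0.bounds.contains(point) }

            if let object = hit, object.label != currentLabel {
                currentLabel = object.label
                if object.isPerson {
                    // A distinct "notification" haptic for people, mirroring the
                    // prototype that identifies people with auditory notifications.
                    notify.notificationOccurred(.success)
                } else {
                    impact.impactOccurred()
                }
                speech.speak(AVSpeechUtterance(string: object.label))
            } else if hit == nil {
                currentLabel = nil   // finger left all objects; allow re-announcing on re-entry
            }
        }
    }

In practice such a view would also need to coexist with VoiceOver (for example, by exposing the regions as accessibility elements), and the poster's prototypes additionally layer an auditory caption of the whole image; the sketch is only meant to make the haptic/auditory pairing concrete.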

Cited By

  • (2025) Technology in Digital Escape Rooms. In Digital Escape Room Designs in Education, 107–156. https://doi.org/10.4018/979-8-3693-4219-0.ch003. Online publication date: 28 February 2025.


      Published In

      ASSETS '22: Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility
      October 2022
      902 pages
      ISBN: 9781450392587
      DOI: 10.1145/3517428
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 22 October 2022

      Author Tags

      1. Accessibility
      2. Haptics
      3. Screen readers
      4. Smartphones
      5. Touchscreens
      6. Visual impairment

      Qualifiers

      • Poster
      • Research
      • Refereed limited

      Conference

      ASSETS '22

      Acceptance Rates

      ASSETS '22 Paper Acceptance Rate 35 of 132 submissions, 27%;
      Overall Acceptance Rate 436 of 1,556 submissions, 28%

      Bibliometrics & Citations

      Article Metrics

      • Downloads (last 12 months): 52
      • Downloads (last 6 weeks): 2
      Reflects downloads up to 15 Feb 2025
