
Assisting Visually Impaired People to Acquire Targets on a Large Wall-Mounted Display

  • Regular Paper
  • Published in: Journal of Computer Science and Technology

Abstract

Large displays have become ubiquitous in our everyday lives, but most are designed for sighted people. This paper addresses the need for visually impaired people to acquire targets on large wall-mounted displays. We developed an assistive interface that exploits mid-air gesture input and haptic feedback, and examined its potential for pointing and steering tasks in human-computer interaction (HCI). In two experiments, blind and blindfolded users performed target acquisition tasks using mid-air gestures and two kinds of feedback (haptic and audio). Our results show that participants performed faster in Fitts' law pointing tasks with the haptic feedback interface than with the audio feedback interface. Furthermore, a regression analysis between movement time (MT) and the index of difficulty (ID) demonstrates that both the Fitts' law model and the steering law model are effective for evaluating assistive interfaces for the blind. Our work and findings serve as an initial step toward helping visually impaired people easily access information on large public displays through haptic interfaces.
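The regression the abstract describes fits movement time against the index of difficulty, MT = a + b · ID with ID = log2(A/W + 1) in the Shannon formulation. The sketch below illustrates that analysis; the trial distances, widths, and times are invented for illustration, not data from the paper's experiments:

```python
import math

# Hypothetical pointing trials (illustrative only, not the paper's data):
# each tuple is (target distance A, target width W, measured movement time MT in s).
trials = [
    (256, 64, 0.62),
    (512, 64, 0.81),
    (512, 32, 0.98),
    (1024, 32, 1.22),
    (1024, 16, 1.41),
]

# Shannon formulation of the index of difficulty: ID = log2(A/W + 1), in bits.
ids = [math.log2(a / w + 1) for a, w, _ in trials]
mts = [mt for _, _, mt in trials]

# Ordinary least-squares fit of MT = a + b * ID.
n = len(ids)
mean_id = sum(ids) / n
mean_mt = sum(mts) / n
b = sum((x - mean_id) * (y - mean_mt) for x, y in zip(ids, mts)) \
    / sum((x - mean_id) ** 2 for x in ids)
a = mean_mt - b * mean_id

# The slope b is the time cost per bit of difficulty; a high R^2 for this
# linear fit is what indicates the Fitts' law model describes the interface well.
print(f"MT = {a:.3f} + {b:.3f} * ID")
```

The same procedure applies to the steering law, replacing ID with the steering index of difficulty (path length divided by path width).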



Author information


Corresponding author

Correspondence to Xiangshi Ren.

Additional information

This study was partially supported by the National Natural Science Foundation of China under Grant No. 61228206 and the Grant-in-Aid for Scientific Research of Japan under Grant Nos. 23300048 and 25330241.

Electronic supplementary material

ESM 1 (PDF 84 kb)


About this article

Cite this article

Kim, K., Ren, X. Assisting Visually Impaired People to Acquire Targets on a Large Wall-Mounted Display. J. Comput. Sci. Technol. 29, 825–836 (2014). https://doi.org/10.1007/s11390-014-1471-4


Keywords

Navigation