
Designing Motion Gesture Interfaces in Mobile Phones for Blind People

  • Regular Paper
  • Published in: Journal of Computer Science and Technology, 2014

Abstract

Despite the advanced functionality of smartphones, most blind people still use old-fashioned phones with familiar layouts and tactile buttons. Smartphones do support accessibility features such as vibration, speech and sound feedback, and screen readers, but these features only provide feedback on user commands or input; discovering functions on the screen and entering commands remain a challenge for blind people. Although smartphones support voice commands, such commands are difficult to recognize in noisy environments. At the same time, smartphones integrate sophisticated motion sensors, and motion gestures based on device tilt have been gaining attention for eyes-free input. We believe that motion gesture interaction can offer blind people more efficient access to smartphone functions. However, most blind people are not smartphone users, and they are aware of neither the affordances available in smartphones nor the potential of motion gesture interaction. To investigate the most usable gestures for blind people, we conducted a user-defined gesture study with 13 blind participants. Using the gesture set and design heuristics derived from this study, we implemented motion gesture based interfaces with speech and vibration feedback for browsing a phone book and making calls. We then conducted a second study to evaluate the usability of the motion gesture interface and the user experience with the system. The findings indicate that the motion gesture interface is more efficient than a traditional button interface. Based on these results, we provide implications for designing smartphone interfaces.
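As an illustration of the kind of interaction the abstract describes, below is a minimal Android sketch, not the authors' implementation, of tilt-based phone-book browsing with speech and vibration feedback. The class name, tilt threshold, debounce interval, and sample contact list are all assumptions introduced for this example.

// TiltBrowserActivity: a hypothetical sketch of tilt-gesture browsing with
// speech and vibration feedback. Names and thresholds are assumptions.
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

public class TiltBrowserActivity extends Activity implements SensorEventListener {
    private static final float TILT_THRESHOLD = 4.0f; // m/s^2 along the device's x axis; assumed value
    private static final long DEBOUNCE_MS = 600;      // ignore repeat triggers while the phone stays tilted

    private SensorManager sensorManager;
    private TextToSpeech tts;
    private Vibrator vibrator;
    private final String[] phoneBook = {"Alice", "Bob", "Carol"}; // illustrative contacts
    private int index = 0;
    private long lastTriggerMs = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        tts = new TextToSpeech(this, status -> { /* TTS engine ready */ });
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        long now = System.currentTimeMillis();
        if (now - lastTriggerMs < DEBOUNCE_MS) return; // simple debounce

        float x = event.values[0]; // gravity component along the device's x axis
        if (x > TILT_THRESHOLD) {          // tilted to one side: previous entry
            lastTriggerMs = now;
            moveTo(index - 1);
        } else if (x < -TILT_THRESHOLD) {  // tilted to the other side: next entry
            lastTriggerMs = now;
            moveTo(index + 1);
        }
    }

    // Wrap around the contact list, vibrate briefly, and speak the new entry.
    private void moveTo(int newIndex) {
        int n = phoneBook.length;
        index = ((newIndex % n) + n) % n;
        vibrator.vibrate(40); // short tactile confirmation that a gesture was registered
        tts.speak(phoneBook[index], TextToSpeech.QUEUE_FLUSH, null);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

In this hypothetical sketch, speech announces each entry while a short vibration confirms that the gesture was recognized, mirroring the two feedback channels the paper combines.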



Author information


Corresponding author

Correspondence to Xiangshi Ren.

Additional information

This study was partially supported by the Grant-in-Aid for Scientific Research of Japan under Grant Nos. 23300048 and 25330241, and by the National Natural Science Foundation of China under Grant No. 61228206.

Electronic supplementary material

ESM 1 (PDF, 90 kb)


About this article


Cite this article

Dim, N.K., Ren, X. Designing Motion Gesture Interfaces in Mobile Phones for Blind People. J. Comput. Sci. Technol. 29, 812–824 (2014). https://doi.org/10.1007/s11390-014-1470-5
