
Mobile accessibility: natural user interface for motion-impaired users

  • Long Paper
  • Published in: Universal Access in the Information Society

Abstract

We designed a natural user interface for accessing mobile devices, aimed at motion-impaired people who cannot use the standard multi-touch input of tablets and smartphones. The system detects the user's head with the front camera and uses its position to interact with the mobile device. The purpose of this work is to evaluate the performance of the system. We conducted two laboratory studies with 12 participants without disabilities and a field study with four participants with multiple sclerosis (MS). The first laboratory study tested the robustness of the system and provided a baseline against which to compare the results of the evaluation with the participants with MS. After observing the results of the participants with disabilities, we conducted a second laboratory study in which participants without disabilities simulated the limitations of the users with MS, in order to tune the system. All participants completed a set of defined tasks: pointing, and pointing and selecting. We logged use and administered post-experiment questionnaires. Our results show positive outcomes when using the system as an input device, although apps should follow a set of recommendations on target size and position to ease interaction with mobile devices for motion-impaired users. The work demonstrates the interface's potential for mobile accessibility for motion-impaired users who need alternative access devices.
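The core idea described above — using the position of the tracked head to drive a pointer on the screen — can be illustrated with a minimal sketch. This is not the authors' implementation: the normalized input range, the gain value, and the function name are assumptions chosen for illustration; screen dimensions match the 1024-by-768 Apple-point resolution mentioned in the notes.

```python
# Illustrative sketch (NOT the paper's implementation): map a head position,
# normalized to [0, 1] in camera coordinates, onto tablet screen coordinates.
# The gain value and neutral-pose assumption are hypothetical.

def head_to_pointer(head_x, head_y, screen_w=1024, screen_h=768, gain=1.5):
    """Map a normalized head position to a clamped screen point (Apple points)."""
    # Center the input so a neutral head pose maps to the screen center.
    dx = (head_x - 0.5) * gain
    dy = (head_y - 0.5) * gain
    # Scale to screen coordinates and clamp to the visible area.
    x = min(max((0.5 + dx) * screen_w, 0), screen_w - 1)
    y = min(max((0.5 + dy) * screen_h, 0), screen_h - 1)
    return round(x), round(y)

print(head_to_pointer(0.5, 0.5))  # neutral pose -> (512, 384), screen center
print(head_to_pointer(1.0, 1.0))  # extreme pose -> clamped to screen edge
```

A gain greater than 1 lets small head movements cover the whole screen, which matters for motion-impaired users with a limited range of motion; the clamping keeps the pointer on screen at extreme poses.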


Notes

  1. We refer to the Apple point as px.

  2. This implies a resolution of 1024-by-768 Apple points.



Acknowledgements

We acknowledge the Agencia Estatal de Investigación (AEI) and the European Regional Development Fund (ERDF) for their support of the Projects TIN2012-35427 (AEI/ERDF, EU) and TIN2016-81143-R (AEI/FEDER, UE) and the Grant FPI BES-2013-064652 (FPI). We thank all the volunteers who participated in this study and the ABDEM staff for their support.

Author information


Corresponding author

Correspondence to Cristina Manresa-Yee.


About this article


Cite this article

Manresa-Yee, C., Roig-Maimó, M.F. & Varona, J. Mobile accessibility: natural user interface for motion-impaired users. Univ Access Inf Soc 18, 63–75 (2019). https://doi.org/10.1007/s10209-017-0583-3

