Put Your Hands Up - or Better Down? Towards Intuitive Gesture Interaction for Diverse Users of an Assistive Robot

  • Conference paper

Universal Access in Human-Computer Interaction (HCII 2023)

Abstract

With the growing number of robots in public spaces, gesture control will become increasingly common. However, no gesture set for robot control exists that is equally usable for blind and visually impaired (BVI) and sighted users. This study therefore applies a three-stage process to design an accessible gesture set for human-robot interaction. In Step 1, 141 intuitive gestures for three universal robot commands were elicited from BVI and sighted users. The gestures were categorized by body part usage and associated movements. The occurrence of gesture categories was compared between the subsamples, and a preliminary gesture set was selected based on frequencies and calculated agreement indices. In Step 2, those gestures were analyzed against user and technical requirements for gesture interaction derived from previous literature. Gestures meeting those requirements were selected for a final set of six gestures covering the three robot commands. Finally, Step 3 evaluated the intuitiveness of the final gesture set with BVI users. Results are discussed with regard to accessible human-robot interaction and future research on gesture control for BVI users.
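
The abstract mentions selecting the preliminary gesture set from category frequencies and "calculated agreement indices" but does not spell out which index was used. As a minimal sketch, the snippet below assumes the sum-of-squared-proportions agreement score common in gesture-elicitation studies; the gesture labels and data are hypothetical, not taken from the paper.

    from collections import Counter

    def agreement_score(proposals):
        """Agreement score for one referent (robot command): the sum of
        squared proportions of identical gesture proposals."""
        n = len(proposals)
        counts = Counter(proposals)
        return sum((c / n) ** 2 for c in counts.values())

    # Hypothetical elicitation data: gesture categories proposed
    # for the command "stop" by eight participants.
    stop_proposals = ["raise-palm", "raise-palm", "raise-palm", "wave",
                      "wave", "raise-palm", "foot-tap", "raise-palm"]

    print(f"A(stop) = {agreement_score(stop_proposals):.3f}")  # A(stop) = 0.469

A score of 1.0 would mean all participants proposed the same gesture for a command; scores near 1/n indicate no agreement. Comparing such scores between the BVI and sighted subsamples is one way the preliminary set could have been selected.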

Acknowledgments

This research took place within the scope of the project “MIRobO” (project number 16SV7969K), supported by the German Federal Ministry of Education and Research. The authors acknowledge this financial support. The Federal Ministry had no influence on the study design, data acquisition, analysis or interpretation of data, or the authoring and submission of this paper. We especially acknowledge the support of Weißer Stock e.V., SFZ Förderzentrum gGmbH, and the local association for visually impaired people, BSV Sachsen e.V., and we thank our participants.

Author information

Corresponding author

Correspondence to Franziska Legler.

Appendix

Table 10. Description of a prototypical gesture for each category. Notation: (d) dynamic gesture, (s) static gesture, \o/ two-arm gesture, .o/ single-arm gesture, .o. no-hand gesture.
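
The caption above defines a compact notation for gesture categories. Purely as an illustration, the sketch below maps that notation onto a structured record; the class, field names, and example gestures are hypothetical, not taken from the paper.

    from dataclasses import dataclass

    @dataclass
    class GestureCategory:
        r"""Structured reading of the Table 10 notation:
        (d)/(s) = dynamic/static, \o/ = two arms, .o/ = one arm, .o. = no hands."""
        name: str
        dynamic: bool    # True for (d), False for (s)
        arms_used: int   # 2 for \o/, 1 for .o/, 0 for .o.

    # Hypothetical encodings of three gesture categories:
    examples = [
        GestureCategory("wave overhead", dynamic=True,  arms_used=2),  # (d) \o/
        GestureCategory("raised palm",   dynamic=False, arms_used=1),  # (s) .o/
        GestureCategory("head nod",      dynamic=True,  arms_used=0),  # (d) .o.
    ]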

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Legler, F., Langer, D., Lottermoser, LM., Dettmann, A., Bullinger, A.C. (2023). Put Your Hands Up - or Better Down? Towards Intuitive Gesture Interaction for Diverse Users of an Assistive Robot. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. HCII 2023. Lecture Notes in Computer Science, vol 14020. Springer, Cham. https://doi.org/10.1007/978-3-031-35681-0_17

  • DOI: https://doi.org/10.1007/978-3-031-35681-0_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35680-3

  • Online ISBN: 978-3-031-35681-0

  • eBook Packages: Computer Science, Computer Science (R0)
