
Design and evaluation of a hand gesture recognition approach for real-time interactions

Published in: Multimedia Tools and Applications

Abstract

Hand gestures are a natural and intuitive form of human-environment interaction and can serve as an alternative input modality in human-computer interaction (HCI) to enhance usability and naturalness. Many existing approaches employ vision-based systems to detect and recognize hand gestures. However, vision-based systems usually require users to move their hands within a restricted space where the optical device can capture hand motion, and they may suffer from self-occlusion when finger movements are complex. In this work, we use a sensor-based motion tracking system to capture 3D hand and finger motions. To detect and recognize hand gestures, we propose a novel angular-velocity method that is applied directly to the real-time 3D motion data streamed by the sensor-based system. Our approach recognizes both static and dynamic gestures in real time. We assess recognition accuracy and execution performance with two interactive applications that require gesture input to interact with a virtual environment. Our experimental results show high recognition accuracy, high execution performance, and a high level of usability.
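The core idea of an angular-velocity approach can be sketched in a few lines: estimate per-joint angular velocity from consecutive frames of streamed joint angles, then use the peak speed over a window to separate static from dynamic gestures. The sketch below is a minimal, hypothetical illustration only; the function names, sampling interval, and velocity threshold are assumptions, not values or code from the paper.

```python
# Hypothetical sketch: classify a window of streamed joint-angle frames
# as "static" or "dynamic" from estimated angular velocity.
# The 0.5 rad/s threshold and 100 Hz sampling rate are illustrative assumptions.

def angular_velocities(frames, dt):
    """frames: list of per-joint angle lists (radians); dt: seconds per frame.
    Returns one list of per-joint angular velocities per consecutive frame pair."""
    vels = []
    for prev, curr in zip(frames, frames[1:]):
        vels.append([(c - p) / dt for p, c in zip(prev, curr)])
    return vels

def classify_window(frames, dt, static_thresh=0.5):
    """Label the window by its peak joint angular speed (rad/s)."""
    vels = angular_velocities(frames, dt)
    peak = max(abs(v) for step in vels for v in step)
    return "static" if peak < static_thresh else "dynamic"

# Example: three frames of two joint angles, sampled at 100 Hz (dt = 0.01 s)
frames = [[0.0, 0.1], [0.001, 0.1], [0.002, 0.1]]
print(classify_window(frames, dt=0.01))  # small drift -> "static"
```

Operating directly on joint angles, rather than on camera images, is what lets such a method run per-frame on a live data stream without the occlusion problems of vision-based pipelines.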



Acknowledgements

This work was supported by DOD grant W911NF-16-2-0016. We thank the anonymous reviewers for their comments and the people who participated in the user study.

Author information

Corresponding author: Chao Peng.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shanthakumar, V.A., Peng, C., Hansberger, J. et al. Design and evaluation of a hand gesture recognition approach for real-time interactions. Multimed Tools Appl 79, 17707–17730 (2020). https://doi.org/10.1007/s11042-019-08520-1

