
Target-shooting exergame with a hand gesture control

Published in: Multimedia Tools and Applications

Abstract

Exertion games (exergames) pose interesting challenges in terms of user interaction techniques. Players are commonly unable to use traditional input devices such as a mouse and keyboard, given the body-movement requirements of this type of videogame. In this work we propose a hand gesture interface to direct actions in a target-shooting exertion game that is played while exercising on an ergo-bike. A vision-based hand gesture interface for interacting with objects in a 3D videogame is designed and implemented. The system is capable of issuing game commands to any computer game that normally responds to mouse and keyboard input, without modifying the underlying source code of the game. The vision system combines bag-of-features and a Support Vector Machine (SVM) to achieve user-independent, real-time hand gesture recognition. In particular, a Finite State Machine (FSM) is used to build the grammar that generates gesture commands for the game. We carried out a user study to gather feedback from participants, and our preliminary results show a high level of interest from users in this multimedia system, which implements a natural way of interacting. Despite some concerns in terms of comfort, users had a positive experience with our exertion game and expressed a positive intention to use a system like this in their daily lives.
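The FSM-based gesture grammar described above can be sketched as a small transition table that turns a stream of recognized gesture labels (as produced by the SVM classifier) into game commands. This is an illustrative sketch only, not the authors' implementation: the gesture names ("open", "point", "fist"), the states, and the command names are hypothetical.

```python
# Minimal FSM sketch for mapping recognized hand-gesture labels to game
# commands, in the spirit of the gesture grammar described in the paper.
# All gesture/state/command names here are illustrative assumptions.

GESTURE_FSM = {
    # (current_state, gesture) -> (next_state, emitted_command or None)
    ("idle", "open"):    ("aiming", None),           # open hand starts aiming
    ("aiming", "open"):  ("aiming", None),           # holding the pose changes nothing
    ("aiming", "point"): ("aiming", "MOVE_CURSOR"),  # pointing steers the crosshair
    ("aiming", "fist"):  ("idle", "FIRE"),           # closing the hand shoots
}

def run_fsm(gestures, state="idle"):
    """Feed a sequence of gesture labels through the FSM and collect the
    game commands it emits; unrecognized transitions reset to idle."""
    commands = []
    for g in gestures:
        state, cmd = GESTURE_FSM.get((state, g), ("idle", None))
        if cmd is not None:
            commands.append(cmd)
    return commands
```

Because the FSM only emits abstract commands, a separate layer can translate each command into the synthetic mouse or keyboard events the unmodified game already understands.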



Author information


Correspondence to Nasser H. Dardas.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Demo hand gesture. (AVI 2983 kb)

Appendix 1. Questionnaire


For each of the following statements, please mark the option below that most closely reflects your level of agreement with the statement.



About this article

Cite this article

Dardas, N.H., Silva, J.M. & El Saddik, A. Target-shooting exergame with a hand gesture control. Multimed Tools Appl 70, 2211–2233 (2014). https://doi.org/10.1007/s11042-012-1236-4

