
Gesture based human motion and game principles to aid understanding of science and cultural practices

Multimedia Tools and Applications

Abstract

We present a novel approach for recreating life-like experiences through easy and natural gesture-based interaction. By focusing on the locations and transforming the role of the user, we are able to improve the understanding of an ancient cultural practice, behaviour or event significantly over traditional approaches. Technology-based virtual environments that display object reconstructions, old landscapes, cultural artefacts and scientific phenomena are coming into vogue. In traditional approaches, the user is a visitor who navigates through these virtual environments, observing and picking up objects. However, cultural practices and certain behaviours from nature are not normally made explicit, and their dynamics still need to be understood. Our research idea is therefore to bring such practices to life by allowing the user to enact them: the user may re-live a step-by-step process in order to understand a practice, behaviour or event. Our solution enables the user to enact through gesture-based interaction with sensor-based technologies such as the versatile Kinect, which allows easier and more natural ways to interact in multidimensional spaces such as museum exhibits. We use heuristic approaches and semantic models to interpret human gestures captured from the user’s skeletal representation. We present and evaluate three applications; for each of them, we integrate these interaction metaphors with gaming elements, thereby achieving a gesture set to enact a cultural practice, behaviour or event. User evaluation experiments revealed that our approach achieved easy and natural interaction with an overall enhanced learning experience.
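The article itself does not include source code. As an illustration only, the sketch below shows the kind of heuristic, rule-based check over skeletal joint positions that the abstract alludes to; the joint names, the "raise both hands" gesture, and the thresholds are assumptions chosen for demonstration, not the authors' gesture set or implementation.

```python
# Minimal sketch (not the authors' implementation) of a heuristic gesture
# check over skeletal joint data of the kind a Kinect-style sensor provides.
# Joint names, the gesture, and thresholds are illustrative assumptions.

from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Joint:
    x: float  # position in metres in an assumed sensor coordinate frame
    y: float
    z: float


def hands_raised(skeleton: Dict[str, Joint], margin: float = 0.10) -> bool:
    """Heuristic rule: both hands are above the head by at least `margin` metres."""
    head = skeleton["head"]
    return (skeleton["hand_left"].y > head.y + margin
            and skeleton["hand_right"].y > head.y + margin)


def classify_frame(skeleton: Dict[str, Joint]) -> Optional[str]:
    """Map one skeleton frame to a gesture label, or None if no rule matches."""
    if hands_raised(skeleton):
        return "raise_hands"
    return None


if __name__ == "__main__":
    # One hypothetical skeleton frame with both hands above the head.
    frame = {
        "head": Joint(0.0, 1.60, 2.0),
        "hand_left": Joint(-0.3, 1.85, 2.0),
        "hand_right": Joint(0.3, 1.90, 2.0),
    }
    print(classify_frame(frame))  # prints "raise_hands"
```

In practice such per-frame rules would be combined with temporal smoothing and, as the abstract notes, semantic models describing the enacted practice, so that a sequence of recognised poses is interpreted as a step in the cultural practice, behaviour or event.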



Acknowledgments

Archaeological Museum La Merced, Cali, Colombia; Natural Sciences Museum, Cali, Colombia; Museo de América, Madrid. This project is part of the I+D+i research project “Augmented knowledge and accessibility: museographic representation of complex cultural content” (reference HAR2011-25953, Ministerio de Economía y Competitividad, Spain) of the research group Museum I+D+C (Universidad Complutense, Madrid), Digital Culture and Hypermedia Museology Laboratory, with the collaboration of the project MOMU (interactive model for museums, DESTINO research group, Pontificia Universidad Javeriana, Cali). It was financed by the Ministry of Economy and Competitiveness and by the Ministry of Education, Culture and Sport, and supported by the Museo de América, Madrid; Fundación ITMA; Museo Convento Santo Domingo-Qorikancha, Cusco; Optimedia; Schwann Beijing; Telefónica ICT; and the performing arts group El Tinglao, which integrates people with functional diversity.

Compliance with ethical standards

All human participants signed an informed consent form. This project followed the ethical guidelines of our host institutions.

Author information


Corresponding author

Correspondence to Andrés Adolfo Navarro-Newball.


About this article


Cite this article

Navarro-Newball, A.A., Moreno, I., Prakash, E. et al. Gesture based human motion and game principles to aid understanding of science and cultural practices. Multimed Tools Appl 75, 11699–11722 (2016). https://doi.org/10.1007/s11042-015-2667-5

