DOI: 10.1145/2542355.2542381
research-article

Motion indexing of different emotional states using LMA components

Published: 19 November 2013

ABSTRACT

Recently, there has been an increasing use of pre-recorded motion capture data, making motion indexing and classification essential for animating virtual characters and synthesising different actions. In this paper, we use a variety of features that encode characteristics of motion through the Body, Effort, Shape and Space components of Laban Movement Analysis (LMA) to explore motion quality in acted dance performances. Using Principal Component Analysis (PCA), we evaluate the importance of the proposed features, with regard to their ability to separate the performer's emotional states, indicating the weight of each feature in motion classification. PCA is also used for dimensionality reduction, laying the foundation for the qualitative and quantitative classification of movements based on their LMA characteristics. Early results show that the proposed features provide a representative space for indexing and classifying dance movements with regard to emotion, which can be used for synthesis and composition purposes.
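As a rough illustration of the PCA step described above, the sketch below projects hypothetical LMA-derived feature vectors (one row per motion clip) into a low-dimensional space and reports how strongly each feature loads on the leading components. The feature names, placeholder data, and use of scikit-learn are assumptions for illustration only, not the authors' implementation.

# Minimal sketch, not the authors' implementation: given per-clip LMA
# feature vectors, use PCA to (a) inspect which features carry the most
# weight on the leading components and (b) project clips into a
# low-dimensional space for emotion-based indexing.
# Feature names and data below are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

feature_names = ["torso_extension", "hand_speed", "acceleration_peak",
                 "kinesphere_volume", "vertical_displacement"]
rng = np.random.default_rng(0)
X = rng.random((120, len(feature_names)))   # one row per motion clip

X_std = StandardScaler().fit_transform(X)   # zero mean, unit variance
pca = PCA(n_components=3)
X_reduced = pca.fit_transform(X_std)        # low-dimensional embedding

# Explained variance hints at how many components to keep; the absolute
# loadings indicate each feature's contribution to a component.
print("explained variance ratio:", pca.explained_variance_ratio_)
for i, component in enumerate(pca.components_):
    loadings = sorted(zip(feature_names, np.abs(component)),
                      key=lambda t: -t[1])
    print(f"PC{i + 1} strongest features:", loadings[:3])

Clips projected into the reduced space could then be indexed or classified by emotion label; the classifier itself is outside the scope of this sketch.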


Published in

SA '13: SIGGRAPH Asia 2013 Technical Briefs
November 2013, 135 pages
ISBN: 9781450326292
DOI: 10.1145/2542355
Conference Chairs: Baoquan Chen, Andrei Sharf
Copyright © 2013 ACM
Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 178 of 869 submissions, 20%
