DOI: 10.1145/2915926.2915941

From Expressive End-Effector Trajectories to Expressive Bodily Motions

Published: 23 May 2016

Abstract

Recent results in affective computing point to the importance of virtual characters capable of conveying affect through their movements. However, despite advances in the synthesis of expressive motions, almost all existing approaches focus on translating stylistic content between existing motions rather than on generating new expressive motions. Based on studies showing the importance of end-effector trajectories in the perception and recognition of affect, this paper proposes a new approach for the automatic generation of affective motions. Expressive content is embedded in a low-dimensional manifold built from end-effector trajectories observed in an expressive motion capture database, and full-body motions are then reconstructed by a multi-chain inverse kinematics controller. The similarity between the expressive content of the captured and synthesized motions is quantitatively assessed with information-theoretic measures.
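
As a rough illustration of the pipeline described above, the sketch below uses PCA as a stand-in for the paper's low-dimensional manifold of end-effector trajectories, perturbs and decodes a code in that space to obtain a new trajectory, and uses a histogram-based mutual information score as a stand-in for the information-theoretic comparison; the multi-chain inverse kinematics reconstruction is only indicated by a comment. This is a minimal sketch under those assumptions, not the authors' implementation, and every function name and parameter below is hypothetical.

```python
# Illustrative sketch only: PCA stands in for the paper's low-dimensional
# manifold, and the multi-chain IK step is a placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mutual_info_score


def embed_trajectories(trajectories, n_components=8):
    """Fit a PCA embedding of flattened end-effector trajectories.

    trajectories: array of shape (n_clips, n_frames * n_effectors * 3).
    """
    pca = PCA(n_components=n_components)
    codes = pca.fit_transform(trajectories)
    return pca, codes


def synthesize_end_effector_trajectory(pca, codes, noise=0.1, seed=0):
    """Perturb an observed code in the low-dimensional space and decode it
    back into a new end-effector trajectory."""
    rng = np.random.default_rng(seed)
    base = codes[rng.integers(len(codes))]
    new_code = base + noise * rng.standard_normal(base.shape)
    return pca.inverse_transform(new_code.reshape(1, -1))[0]


def expressive_similarity(feature_a, feature_b, bins=16):
    """Histogram-based mutual information between two scalar feature streams
    (e.g., end-effector speed profiles of captured vs. synthesized motion)."""
    edges_a = np.histogram_bin_edges(feature_a, bins)
    edges_b = np.histogram_bin_edges(feature_b, bins)
    return mutual_info_score(np.digitize(feature_a, edges_a),
                             np.digitize(feature_b, edges_b))


# A full pipeline would feed the decoded end-effector targets to a
# multi-chain IK solver (not shown here) to reconstruct full-body motion,
# then compare expressive features with expressive_similarity().
```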


Cited By

  • (2019) Real-time 3D human pose and motion reconstruction from monocular RGB videos. Computer Animation and Virtual Worlds, 30(3-4). DOI: 10.1002/cav.1887. Online publication date: 22-May-2019.
  • (2018) Perceptual Validation for the Generation of Expressive Movements from End-Effector Trajectories. ACM Transactions on Interactive Intelligent Systems, 8(3), 1-26. DOI: 10.1145/3150976. Online publication date: 5-Jul-2018.
  • (2017) Inverse Kinematics Techniques in Computer Graphics: A Survey. Computer Graphics Forum, 37(6), 35-58. DOI: 10.1111/cgf.13310. Online publication date: 29-Nov-2017.


    Published In

    CASA '16: Proceedings of the 29th International Conference on Computer Animation and Social Agents
    May 2016
    200 pages
    ISBN:9781450347457
    DOI:10.1145/2915926
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 23 May 2016


    Author Tags

    1. bodily motions
    2. computer animation
    3. dimensionality reduction
    4. expressive motions
    5. inverse kinematics
    6. motion synthesis

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    CASA '16
    CASA '16: Computer Animation and Social Agents
    May 23 - 25, 2016
    Geneva, Switzerland

    Acceptance Rates

    Overall Acceptance Rate 18 of 110 submissions, 16%
