
Fragment-based responsive character motion for interactive games

  • Original Article
  • Published in The Visual Computer

Abstract

Fragment-based character animation has become popular in recent years. By stringing together appropriate motion-capture fragments, such systems drive characters in response to user control signals and generate realistic character motion. In this paper, we propose a novel, straightforward, and fast method for building a control policy table, which selects the next motion fragment to play based on the user's current input and the previous motion fragment. To synthesize the control policy table, we first cluster similar fragments together to form several fragment classes. Dynamic programming is then employed to generate training samples from the user's control signals. Finally, we use a supervised learning routine to create the tabular control policy. We demonstrate the efficacy of our method by comparing the motions generated by our controller with those of the optimal controller and previous controllers. The results indicate that although the reinforcement learning algorithm known as value iteration also produces a tabular control policy, it is more complex and incurs a higher space-time cost when synthesizing the table. Our approach is simple yet efficient, and is practical for interactive character games.
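The tabular control policy described above can be sketched as a lookup table keyed by the pair (previous fragment class, user control signal). The sketch below is a minimal illustration, not the paper's implementation: the fragment class names, control signals, and training samples are all hypothetical, and the "supervised learning routine" is reduced here to a simple majority vote over samples that would, in the paper's pipeline, come from the offline dynamic-programming stage.

```python
from collections import Counter, defaultdict

# Hypothetical training samples: (previous fragment class, user control signal,
# best next fragment class), as might be produced offline by dynamic programming.
samples = [
    ("walk", "left",  "turn_left"),
    ("walk", "left",  "turn_left"),
    ("walk", "left",  "walk"),
    ("walk", "right", "turn_right"),
    ("run",  "stop",  "slow_down"),
    ("run",  "stop",  "slow_down"),
]

def build_policy_table(samples):
    """Tabular control policy: for each (state, signal) pair, pick the
    next fragment class that the training samples select most often."""
    votes = defaultdict(Counter)
    for prev_class, signal, next_class in samples:
        votes[(prev_class, signal)][next_class] += 1
    return {key: counter.most_common(1)[0][0] for key, counter in votes.items()}

def next_fragment(policy, prev_class, signal, fallback="idle"):
    """Runtime lookup: O(1) selection of the next fragment class,
    with a fallback for unseen (state, signal) pairs."""
    return policy.get((prev_class, signal), fallback)

policy = build_policy_table(samples)
print(next_fragment(policy, "walk", "left"))   # majority vote -> "turn_left"
print(next_fragment(policy, "run", "stop"))    # -> "slow_down"
```

The constant-time table lookup at runtime is what makes this style of controller attractive for interactive games: all the expensive optimization happens offline, during synthesis of the table.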



Corresponding author

Correspondence to Zhigeng Pan.



Cite this article

Cheng, X., Liu, G., Pan, Z. et al. Fragment-based responsive character motion for interactive games. Vis Comput 25, 479–485 (2009). https://doi.org/10.1007/s00371-009-0343-3
