Combining heterogeneous digital human simulations: presenting a novel co-simulation approach for incorporating different character animation technologies

  • Original Article
  • The Visual Computer

Abstract

Digital human simulation is important in domains such as the entertainment, health care and production industries. A variety of simulation techniques and tools are available, ranging from motion-capture-based animation systems and deep learning to physics-based motion synthesis. Each technology has its advantages and disadvantages and is suited to particular use cases. A combination of multiple technologies would therefore enable more sophisticated simulations that address heterogeneous aspects. However, the individual approaches are mostly tightly coupled to their development environments, which induces high porting effort when they are incorporated into different platforms. Combining separately developed simulation systems, whether for benchmarking or for comprehensive simulation, is not yet possible. In the domain of plant simulation, the Functional Mock-up Interface (FMI) standard has already solved this problem: originally tailored to industrial needs, the standard allows dynamic simulation approaches, such as solvers for mechatronic components, to be exchanged between tools. Inspired by the FMI standard, we present a novel framework that incorporates multiple digital human simulation approaches from multiple domains. In particular, the paper introduces the overall concept of the so-called Motion Model Units as well as their underlying technical architecture. As the main contribution, a novel co-simulation for orchestrating multiple digital human simulation approaches is presented. The overall applicability is validated by a quantitative evaluation using motion capture data and a user study.
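
To make the concept more concrete: a Motion Model Unit can be pictured as a self-contained motion generator behind a uniform initialize/assign/step contract, while the co-simulation acts as a master that steps all active units once per frame and lets each refine the shared posture. The following minimal Python sketch illustrates this idea only; the class and method names (MotionModelUnit, CoSimulator, do_step, etc.) are hypothetical and are not the interface specified in the MOSIM deliverable [13].

    from abc import ABC, abstractmethod
    from typing import Dict, List


    class MotionModelUnit(ABC):
        """Hypothetical MMU contract: every motion generator exposes the same
        initialize / assign_instruction / do_step interface, regardless of
        whether it is keyframe-, learning- or physics-based internally."""

        @abstractmethod
        def initialize(self, skeleton: Dict[str, List[float]]) -> None:
            """Receive the shared skeleton description before simulation starts."""

        @abstractmethod
        def assign_instruction(self, instruction: str) -> None:
            """Receive a high-level task, e.g. 'walk to target' or 'reach object'."""

        @abstractmethod
        def do_step(self, dt: float, posture: Dict[str, List[float]]) -> Dict[str, List[float]]:
            """Advance the unit by dt seconds and return an updated posture."""


    class CoSimulator:
        """Hypothetical master: steps the active MMUs in a fixed order and lets
        each one refine the posture produced by its predecessors within a frame."""

        def __init__(self, units: List[MotionModelUnit]):
            self.units = units

        def step(self, dt: float, posture: Dict[str, List[float]]) -> Dict[str, List[float]]:
            for unit in self.units:  # e.g. locomotion first, then reach, then IK correction
                posture = unit.do_step(dt, posture)
            return posture

In the actual framework the individual units may be implemented in different languages and runtime environments; the sketch only conveys the per-frame stepping and merging idea behind such an orchestration.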

References

  1. ITEA Project 07006 MODELISAR—website. www.itea3.org/project/modelisar.html (2017). Accessed 14 May 2019

  2. Aristidou, A., Lasenby, J.: FABRIK: a fast, iterative solver for the inverse kinematics problem. Graph. Models 73(5), 243–260 (2011)

  3. Bastian, J., Clauß, C., Wolf, S., Schneider, P.: Master for co-simulation using FMI. In: Proceedings of the 8th International Modelica Conference, March 20th–22nd, Technical University, Dresden, Germany, 63, pp. 115–120. Linköping University Electronic Press (2011)

  4. Blochwitz, T., Otter, M., Akesson, J., Arnold, M., Clauss, C., Elmqvist, H., Friedrich, M., Junghanns, A., Mauss, J., Neumerkel, D.: Functional Mock-up Interface 2.0: the standard for tool-independent exchange of simulation models. In: Proceedings of the 9th International MODELICA Conference, September 3–5, 2012, Munich, Germany, pp. 173–184. Linköping University Electronic Press (2012)

  5. Buss, S.R.: Introduction to inverse kinematics with jacobian transpose, pseudoinverse and damped least squares methods. IEEE J. Robot. Autom. 17(1–19), 16 (2004)

  6. Cerekovic, A., Pejsa, T., Pandzic, I.S.: RealActor: character animation and multimodal behavior realization system. In: International Workshop on Intelligent Virtual Agents, pp. 486–487. Springer (2009)

  7. Clavet, S.: Motion matching and the road to next-gen animation. https://www.gdcvault.com/play/1023280/Motion-Matching-and-The-Road (2016). Accessed 16 May 2019

  8. CMU Graphics Lab Motion Capture Database. http://mocap.cs.cmu.edu/ (2019). Accessed 16 May 2019

  9. Delp, S.L., Anderson, F.C., Arnold, A.S., Loan, P., Habib, A., John, C.T., Guendelman, E., Thelen, D.G.: OpenSim: open-source software to create and analyze dynamic simulations of movement. IEEE Trans. Biomed. Eng. 54(11), 1940–1950 (2007)

  10. Faloutsos, P., Van de Panne, M., Terzopoulos, D.: Composable controllers for physics-based character animation. In: Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, pp. 251–260. ACM (2001)

  11. Feng, A., Huang, Y., Kallmann, M., Shapiro, A.: An analysis of motion blending techniques. In: International Conference on Motion in Games, pp. 232–243. Springer (2012)

  12. Feng, A.W., Xu, Y., Shapiro, A.: An example-based motion synthesis technique for locomotion and object manipulation. In: Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pp. 95–102. ACM (2012)

  13. Gaisbauer, F.: MOSIM research project deliverable 2.2: MMU concept and interface specification. https://itea3.org/project/mosim.html (2019). Accessed 24 Sept 2019

  14. Gaisbauer, F., Agethen, P., Bär, T., Rukzio, E.: Introducing a modular concept for exchanging character animation approaches. In: Jain, E., Kosinka, J. (eds.) EG 2018—Posters. The Eurographics Association (2018)

  15. Gaisbauer, F., Agethen, P., Otto, M., Bär, T., Sues, J., Rukzio, E.: Presenting a modular framework for a holistic simulation of manual assembly tasks. Procedia CIRP 72, 768–773 (2018)

  16. Gaisbauer, F., Lehwald, J., Agethen, P., Sues, J., Rukzio, E.: Proposing a co-simulation model for coupling heterogeneous character animation systems. In: Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Volume 1: GRAPP, pp. 65–76. INSTICC, SciTePress (2019)

  17. Hanson, L., Högberg, D., Carlson, J.S., Bohlin, R., Brolin, E., Delfs, N., Mårdberg, P., Stefan, G., Keyvani, A., Rhen, I.M.: IMMA—intelligently moving manikins in automotive applications. In: Third International Summit on Human Simulation (ISHS2014) (2014)

  18. Holden, D., Komura, T., Saito, J.: Phase-functioned neural networks for character control. ACM Trans. Graph. 36(4), 42 (2017)

  19. ITEA: ITEA Project 17028 MOSIM—website. https://itea3.org/project/mosim.html (2018). Accessed 24 Sept 2019

  20. Kallmann, M., Marsella, S.: Hierarchical motion controllers for real-time autonomous virtual humans. In: International Workshop on Intelligent Virtual Agents, pp. 253–265. Springer (2005)

  21. Kovar, L., Gleicher, M., Pighin, F.: Motion graphs. In: ACM SIGGRAPH 2008 Classes, p. 51. ACM (2008)

  22. Lee, Y., Wampler, K., Bernstein, G., Popović, J., Popović, Z.: Motion fields for interactive character locomotion. In: ACM Transactions on Graphics (TOG), vol. 29, p. 138. ACM (2010)

  23. Li, Z., Zhou, Y., Xiao, S., He, C., Li, H.: Auto-conditioned LSTM network for extended complex human motion synthesis. arXiv preprint arXiv:1707.05363 (2017)

  24. Min, J., Chai, J.: Motion graphs++: a compact generative model for semantic motion analysis and synthesis. ACM Trans. Graph. 31(6), 153 (2012)

  25. Müller, B., Wolf, S., Brueggemann, G., Deng, Z., Miller, F., Selbie, W.: Handbook of Human Motion. Springer, Berlin (2018)

  26. Reed, M.P., Faraway, J., Chaffin, D.B., Martin, B.J.: The HUMOSIM ergonomics framework: a new approach to digital human simulation for ergonomic analysis. In: SAE Technical Paper Series. SAE International, Warrendale, PA, United States (2006). https://doi.org/10.4271/2006-01-2365

  27. RootMotion Final IK—website. http://www.root-motion.com/final-ik (2019). Accessed 14 May 2019

  28. Shapiro, A.: Building a character animation system. In: International Conference on Motion in Games, pp. 98–109. Springer (2011)

  29. Shoulson, A., Marshak, N., Kapadia, M., Badler, N.I.: ADAPT: the agent development and prototyping testbed. IEEE Trans. Vis. Comput. Graph. 20(7), 1035–1047 (2014)

  30. Thiebaux, M., Marsella, S., Marshall, A.N., Kallmann, M.: SmartBody: behavior realization for embodied conversational agents. In: Proceedings of the 7th International Joint Conference on Autonomous Agents and Multiagent Systems—Volume 1, pp. 151–158. International Foundation for Autonomous Agents and Multiagent Systems (2008)

  31. Tsai, Y.Y., Lin, W.C., Cheng, K.B., Lee, J., Lee, T.Y.: Real-time physics-based 3D biped character animation using an inverted pendulum model. IEEE Trans. Vis. Comput. Graph. 16(2), 325–337 (2010)

  32. Van Acker, B., Denil, J., Vangheluwe, H., De Meulenaere, P.: Generation of an optimised master algorithm for fmi co-simulation. In: Proceedings of the Symposium on Theory of Modeling & Simulation: DEVS Integrative M&S Symposium, pp. 205–212. Society for Computer Simulation International (2015)

  33. Wang, B., Baras, J.S.: HybridSim: a modeling and co-simulation toolchain for cyber-physical systems. In: Proceedings of the 2013 IEEE/ACM 17th International Symposium on Distributed Simulation and Real Time Applications, pp. 33–40. IEEE Computer Society (2013)

  34. Welbergen, H., Reidsma, D.: A BML realizer for continuous, multimodal interaction with a virtual human (2010)

Acknowledgements

The authors acknowledge the financial support by the Federal Ministry of Education and Research of Germany within the MOSIM Project [19] (Grant No. 01IS18060A-H).

Author information

Corresponding author

Correspondence to Felix Gaisbauer.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 106122 KB)

About this article

Cite this article

Gaisbauer, F., Lampen, E., Agethen, P. et al. Combining heterogeneous digital human simulations: presenting a novel co-simulation approach for incorporating different character animation technologies. Vis Comput 37, 717–734 (2021). https://doi.org/10.1007/s00371-020-01792-x
