
Integrating Transformer-Based AI-Generated Dance Movements Into HRP-4 for Human-Robot Artistic Collaboration

  • Conference paper
  • Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 35)
  • Included in the following conference series: Human-Friendly Robotics 2024 (HFR 2024)


Abstract

This paper presents the integration of dance movements generated by a transformer-based diffusion model onto the HRP-4 humanoid robot using the open-source control framework mc_rtc. We explore how AI-generated movements can facilitate meaningful collaboration between robots and artists, and we discuss the methodology for incorporating these movements into the robot’s control framework. Furthermore, we examine the artistic perspectives and user engagement that arise from this integration.
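To make the integration pipeline concrete, the sketch below illustrates one way AI-generated motion could be replayed on a humanoid controller: a sequence of joint-angle frames produced offline by the generative model is clipped to joint limits and streamed at a fixed control rate. This is an illustrative Python sketch, not the paper's mc_rtc implementation; the motion file format, joint names, limits, and the `send_targets` interface are assumptions introduced here for clarity.

```python
# Minimal sketch (not the authors' implementation): replaying a sequence of
# AI-generated joint-angle frames on a humanoid at a fixed control rate.
# The motion format, joint names, and limits below are illustrative assumptions.
import time
import numpy as np

CONTROL_DT = 0.005  # hypothetical 200 Hz control period

# Hypothetical subset of upper-body joints with (lower, upper) limits in radians.
JOINT_LIMITS = {
    "R_SHOULDER_P": (-3.14, 1.05),
    "R_ELBOW_P": (-2.30, 0.00),
    "L_SHOULDER_P": (-3.14, 1.05),
    "L_ELBOW_P": (-2.30, 0.00),
}


def load_generated_motion(path):
    """Load a (T, J) array of joint angles exported offline by the generative model."""
    return np.load(path)  # assumed .npy export; the format is an assumption


def clamp_to_limits(frame, joint_names):
    """Clip each joint angle to its mechanical range before sending it to the robot."""
    lo = np.array([JOINT_LIMITS[j][0] for j in joint_names])
    hi = np.array([JOINT_LIMITS[j][1] for j in joint_names])
    return np.clip(frame, lo, hi)


def replay(motion, joint_names, send_targets):
    """Stream the motion frame by frame; `send_targets` stands in for the controller interface."""
    for frame in motion:
        targets = dict(zip(joint_names, clamp_to_limits(frame, joint_names)))
        send_targets(targets)
        time.sleep(CONTROL_DT)


if __name__ == "__main__":
    joints = list(JOINT_LIMITS)
    # Stand-in for generated data: 400 random frames within a small range.
    motion = np.random.uniform(-0.5, 0.2, size=(400, len(joints)))
    replay(motion, joints, send_targets=lambda targets: None)  # no-op sink for the sketch
```

In practice, the stand-in `send_targets` callback would be replaced by whatever interface the whole-body controller exposes for posture or joint-target updates; the key point is that generated frames are validated against joint limits before being tracked by the robot.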


Notes

  1. https://github.com/Stanford-TML/EDGE.



Author information

Correspondence to Hui-Ting Hong.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Hong, HT., Chen, CY., Kheddar, A. (2025). Integrating Transformer-Based AI-Generated Dance Movements Into HRP-4 for Human-Robot Artistic Collaboration. In: Paolillo, A., Giusti, A., Abbate, G. (eds) Human-Friendly Robotics 2024. HFR 2024. Springer Proceedings in Advanced Robotics, vol 35. Springer, Cham. https://doi.org/10.1007/978-3-031-81688-8_9
