
DanceDJ: A 3D Dance Animation Authoring System for Live Performance

  • Conference paper in Advances in Computer Entertainment Technology (ACE 2017)
  • Part of the book series: Lecture Notes in Computer Science (LNISA, volume 10714)

Abstract

Dance is an important component of live performance for expressing emotion and presenting visual context. Human dance performances typically require expert knowledge of dance choreography and professional rehearsal, which are too costly for casual entertainment venues and clubs. Recent advancements in character animation and motion synthesis have made it possible to synthesize virtual 3D dance characters in real-time. The major problem in existing systems is the lack of an intuitive interface for real-time dance control. We propose a new system called DanceDJ to solve this problem. Our system consists of two parts. The first is an underlying motion analysis system that evaluates motion features, including dance features such as postures and movement tempo, as well as audio features such as music tempo and structure. As a pre-process, given a database of dance motions, our system evaluates the quality of possible timings at which different dance motions can be connected and switched. During run-time, we propose a control interface that provides visual guidance. We observe that disc jockeys (DJs) effectively control the mixing of music using DJ controllers, and therefore propose a DJ controller for controlling dancing characters. This allows DJs to transfer their skills from music control to dance control using a similar hardware setup. We map different motion control functions onto the DJ controller and visualize the timing of natural connection points, such that the DJ can effectively govern the synthesized dance motion. We conducted two user experiments to evaluate the user experience and the quality of the dance character. Quantitative analysis shows that our system performs well in both motion control and simulation quality.
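The pre-process described above, scoring candidate timings for connecting and switching dance motions against both pose similarity and musical structure, can be sketched in the spirit of motion-graph transition costs. The following is a hypothetical illustration, not the authors' actual method: the function names, the beat-alignment penalty, and the `beat_weight` parameter are all assumptions introduced here for clarity.

```python
import numpy as np

def pose_distance(frame_a, frame_b):
    """Euclidean distance between two poses, each a flat array of joint positions."""
    return float(np.linalg.norm(frame_a - frame_b))

def transition_scores(clip_a, clip_b, beat_times, fps=30, beat_weight=0.5):
    """Score every (end-of-A, start-of-B) frame pair: lower means a better switch point.

    clip_a, clip_b: arrays of shape (frames, dof) holding joint positions per frame.
    beat_times:     music beat timestamps in seconds; switches near a beat are
                    penalised less, so cuts tend to land on the musical structure.
    """
    scores = np.empty((len(clip_a), len(clip_b)))
    for i, frame_a in enumerate(clip_a):
        t = i / fps  # time of this frame within clip A
        beat_penalty = min(abs(t - b) for b in beat_times)  # distance to nearest beat
        for j, frame_b in enumerate(clip_b):
            scores[i, j] = pose_distance(frame_a, frame_b) + beat_weight * beat_penalty
    return scores

def best_transition(clip_a, clip_b, beat_times, fps=30):
    """Return the (i, j) frame pair with the lowest transition cost."""
    scores = transition_scores(clip_a, clip_b, beat_times, fps)
    return np.unravel_index(np.argmin(scores), scores.shape)
```

In a real system the pose term would typically also account for joint velocities and root alignment, and the table of scores would be precomputed offline so that run-time control only needs to look up the visualized connection points.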

N. Iwamoto and T. Kato contributed equally to this work.



Acknowledgement

This work was supported in part by JST ACCEL Grant Number JPMJAC1602, Japan. It was also supported by the Engineering and Physical Sciences Research Council (EPSRC) (Ref: EP/M002632/1) and the Royal Society (Ref: IE160609).

Author information

Correspondence to Hubert P. H. Shum.


Electronic supplementary material


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Iwamoto, N., Kato, T., Shum, H.P.H., Kakitsuka, R., Hara, K., Morishima, S. (2018). DanceDJ: A 3D Dance Animation Authoring System for Live Performance. In: Cheok, A., Inami, M., Romão, T. (eds) Advances in Computer Entertainment Technology. ACE 2017. Lecture Notes in Computer Science, vol 10714. Springer, Cham. https://doi.org/10.1007/978-3-319-76270-8_46


  • DOI: https://doi.org/10.1007/978-3-319-76270-8_46

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-76269-2

  • Online ISBN: 978-3-319-76270-8

  • eBook Packages: Computer Science (R0)
