Musical-based interaction system for the Waseda Flutist Robot

Implementation of the visual tracking interaction module

Published in Autonomous Robots

Abstract

Since 1990, the development of the Anthropomorphic Flutist Robot at Waseda University has focused on mechanically reproducing the physiology of the organs involved in flute playing (i.e. lungs, lips, etc.) and on implementing basic cognitive capabilities for interacting with beginner flutists. As a result of the research efforts to date, the Waseda Flutist Robot is considered to play the flute at a level close to that of an intermediate human player. However, further research is required to extend the flutist robot's capabilities for interacting with musical partners. In this paper, we propose as a long-term goal enabling the flutist robot to interact more naturally with musical partners in the context of a jazz band. For this purpose, a Musical-Based Interaction System (MbIS) is proposed to enable the robot to process both visual and aural cues arising throughout the interaction with musicians. In particular, this paper presents the implementation of the visual tracking module on the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV). The visual tracking module comprises two levels of interaction: a basic level (a visual interface in which the musician controls virtual buttons and faders) and an advanced level (an instrument tracking system that lets the robot process motion gestures performed by the musical partner in real time, which are then mapped directly onto musical parameters of the robot's performance). The experiments carried out focused on verifying the effectiveness and usability of the proposed levels of interaction; in particular, we examined how well the WF-4RIV dynamically changes musical parameters while interacting with a human player. From the experimental results we observed that the physical constraints of the robot play an important role during the interaction. Although further improvements are needed to overcome these constraints, we expect that the interaction experience will become more natural.
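The advanced interaction level described above maps tracked instrument motion directly onto musical parameters of the robot's performance. As a rough illustration of this idea only, and not the mapping actually implemented on the WF-4RIV, the sketch below normalizes a tracked instrument position within the camera frame, smooths it, and maps it to hypothetical tempo and vibrato-depth parameters; the class name, parameter ranges, and smoothing constant are assumptions made for illustration.

```python
# Hedged sketch: mapping a tracked instrument position to musical parameters.
# The parameter names, ranges, and smoothing factor are illustrative assumptions,
# not the values used by the MbIS on the WF-4RIV.

class GestureToMusicMapper:
    def __init__(self, frame_width, frame_height, alpha=0.2):
        self.frame_width = frame_width
        self.frame_height = frame_height
        self.alpha = alpha          # exponential smoothing factor
        self.smoothed = (0.5, 0.5)  # normalized (x, y), initialized at frame center

    def update(self, x_px, y_px):
        """Convert a tracked instrument position in pixels into musical parameters."""
        # Normalize pixel coordinates to the range [0, 1].
        x = min(max(x_px / self.frame_width, 0.0), 1.0)
        y = min(max(y_px / self.frame_height, 0.0), 1.0)

        # Smooth the trajectory so the robot's performance does not jitter.
        sx = self.alpha * x + (1 - self.alpha) * self.smoothed[0]
        sy = self.alpha * y + (1 - self.alpha) * self.smoothed[1]
        self.smoothed = (sx, sy)

        # Illustrative mapping: horizontal motion -> tempo, vertical motion -> vibrato depth.
        tempo_bpm = 60 + sx * 120       # 60-180 BPM
        vibrato_depth = 1.0 - sy        # higher instrument position -> deeper vibrato
        return {"tempo_bpm": tempo_bpm, "vibrato_depth": vibrato_depth}


if __name__ == "__main__":
    mapper = GestureToMusicMapper(frame_width=640, frame_height=480)
    # Simulated tracker output (e.g. from a color-based particle filter).
    for x_px, y_px in [(320, 240), (400, 200), (480, 160)]:
        print(mapper.update(x_px, y_px))
```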



Author information

Correspondence to Jorge Solis.


About this article

Cite this article

Petersen, K., Solis, J. & Takanishi, A. Musical-based interaction system for the Waseda Flutist Robot. Auton Robot 28, 471–488 (2010). https://doi.org/10.1007/s10514-010-9180-5


