
Displaying Expression in Musical Performance by Means of a Mobile Robot

  • Conference paper
Affective Computing and Intelligent Interaction (ACII 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4738)


Abstract

In recent times, several attempts have been made to give a robot, or more broadly speaking a computer, some kind of feelings in order to understand and model human capacities. The main idea of our work was the design of expressive robot movements for displaying the emotional content embedded in the audio layer of both live and recorded music performances. Starting from results of studies on musicians' body movements in emotionally expressive music performance (see [3]), we tried to map different movement cues (e.g. speed, fluency) to movements of a small mobile robot. Since the robot was constrained by its sensors and motors, the emotions were implemented taking into account only the main characteristics of the musicians' movements. We implemented movements for three emotions: happiness, anger, and sadness. In a perceptual test, subjects were asked to judge which emotional intentions the movements communicated.
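To make the mapping idea concrete, the sketch below shows one way emotion labels could be translated into movement parameters such as speed and fluency. This is a minimal illustration under our own assumptions: the parameter names, value ranges, and concrete numbers are hypothetical and are not specified in the paper.

```python
# Hypothetical sketch of an emotion-to-movement-cue mapping for a small
# mobile robot. All parameter names and values are illustrative assumptions,
# not taken from the paper.
from dataclasses import dataclass


@dataclass(frozen=True)
class MovementCues:
    speed: float      # forward speed, normalized 0..1
    fluency: float    # 0 = jerky/abrupt, 1 = smooth/continuous
    amplitude: float  # relative size of the movement pattern, 0..1


# One plausible mapping, loosely inspired by reported characteristics of
# musicians' expressive movements: happiness = fast and fluent,
# anger = fast and jerky, sadness = slow, fluent, and small.
EMOTION_TO_CUES = {
    "happiness": MovementCues(speed=0.8, fluency=0.9, amplitude=0.8),
    "anger":     MovementCues(speed=0.9, fluency=0.2, amplitude=1.0),
    "sadness":   MovementCues(speed=0.2, fluency=0.8, amplitude=0.3),
}


def cues_for(emotion: str) -> MovementCues:
    """Look up the movement cues the robot would use to display an emotion."""
    try:
        return EMOTION_TO_CUES[emotion]
    except KeyError:
        raise ValueError(f"no movement mapping defined for {emotion!r}")


if __name__ == "__main__":
    for emotion in ("happiness", "anger", "sadness"):
        print(emotion, cues_for(emotion))
```

In a real system, a motor controller would interpret these cue values, e.g. lowering fluency by inserting abrupt stops and direction changes into the robot's trajectory.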


References

  1. Bresin, R.: What is the color of that music performance? In: ICMC, pp. 367–370 (2005)

  2. Burger, B.: Communication of Musical Expression from Mobile Robots to Humans. Master's thesis (in preparation)

  3. Dahl, S., Friberg, A.: Visual perception of expressiveness in musicians' body movements. Music Perception 24(5) (in press)

  4. Friberg, A., Schoonderwaldt, E., Juslin, P.N.: CUEX: An algorithm for extracting expressive tone variables from audio recordings. Acta Acustica united with Acustica 93(3), 411–420 (2007)

  5. Isbister, K., Höök, K., Sharp, M., Laaksolahti, J.: The Sensual Evaluation Instrument: Developing an Affective Evaluation Tool. In: CHI, pp. 1163–1172 (2006)



Editor information

Ana C. R. Paiva, Rui Prada, Rosalind W. Picard


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Burger, B., Bresin, R. (2007). Displaying Expression in Musical Performance by Means of a Mobile Robot. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2007. Lecture Notes in Computer Science, vol 4738. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74889-2_83


  • DOI: https://doi.org/10.1007/978-3-540-74889-2_83

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74888-5

  • Online ISBN: 978-3-540-74889-2

  • eBook Packages: Computer Science, Computer Science (R0)
