User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements

Conference paper
Affective Computing and Intelligent Interaction (ACII 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4738)

Abstract

In this paper we describe a system that allows users to express themselves through full-body movement and gesture and to control, in real time, the generation of audio-visual feedback. The system analyses the user’s full-body movement and gesture in real time, extracts expressive motion features, and maps their values onto real-time control of acoustic parameters for rendering a music performance. At the same time, visual feedback generated in real time is projected on a screen in front of the users: a coloured silhouette whose colour depends on the emotion their movement communicates. Human movement analysis and visual feedback generation were implemented with the EyesWeb software platform, and music performance rendering with pDM. Evaluation tests were carried out with human participants to assess the usability of the interface and the effectiveness of the design.
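The abstract describes a two-stage mapping pipeline: expressive motion features extracted from the user’s movement drive both the acoustic parameters of the music rendering and the colour of the projected silhouette. The sketch below illustrates this kind of feature-to-feedback mapping in Python; the feature names (quantity of motion, contraction index), parameter ranges, and emotion-to-colour rules are hypothetical stand-ins for illustration, not the actual EyesWeb-to-pDM mapping used by the authors.

```python
# Illustrative sketch of the mapping stage only. In the real system,
# EyesWeb computes expressive features from camera input and pDM renders
# the music; every name and numeric range below is a hypothetical example.

from dataclasses import dataclass


@dataclass
class MotionFeatures:
    quantity_of_motion: float  # overall body activity, normalised to [0, 1]
    contraction_index: float   # 0 = fully expanded posture, 1 = contracted


def map_to_acoustic_params(f: MotionFeatures) -> dict:
    """Map motion features to real-time music-performance controls.

    Hypothetical mapping: more activity -> faster and louder; a more
    contracted posture -> shorter, more staccato articulation.
    """
    return {
        "tempo_scale": 0.7 + 0.6 * f.quantity_of_motion,   # 0.7x .. 1.3x
        "level_db": -6.0 + 12.0 * f.quantity_of_motion,    # -6 .. +6 dB
        "articulation": 1.0 - 0.5 * f.contraction_index,   # 1.0 = legato
    }


def map_to_silhouette_colour(f: MotionFeatures) -> tuple:
    """Pick an RGB silhouette colour from a crude emotion estimate."""
    if f.quantity_of_motion > 0.6 and f.contraction_index < 0.4:
        return (255, 0, 0)   # energetic, expanded movement -> red
    if f.quantity_of_motion < 0.3:
        return (0, 0, 255)   # slow, low-energy movement -> blue
    return (0, 255, 0)       # intermediate movement -> green


if __name__ == "__main__":
    frame = MotionFeatures(quantity_of_motion=0.8, contraction_index=0.2)
    print(map_to_acoustic_params(frame))
    print(map_to_silhouette_colour(frame))
```

In the system itself, the mapped values were not produced by standalone code like this: EyesWeb streamed the extracted features to pDM, which adjusted the KTH performance rules (affecting tempo, sound level, and articulation) on a score in real time.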

Editor information

Ana C. R. Paiva, Rui Prada, Rosalind W. Picard

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Castellano, G., Bresin, R., Camurri, A., Volpe, G. (2007). User-Centered Control of Audio and Visual Expressive Feedback by Full-Body Movements. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds) Affective Computing and Intelligent Interaction. ACII 2007. Lecture Notes in Computer Science, vol 4738. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74889-2_44

  • DOI: https://doi.org/10.1007/978-3-540-74889-2_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74888-5

  • Online ISBN: 978-3-540-74889-2
