
Human movement expressivity for mobile active music listening

Journal on Multimodal User Interfaces

Abstract

In this paper we describe the SAME networked platform for context-aware, experience-centric mobile music applications, and we present an implementation of the SAME active music listening paradigm: the Mobile Conductor. It lets the user express herself by conducting a virtual ensemble playing a MIDI piece of music with her mobile phone. The phone detects the user's hand movement and shapes the performance style by modulating the music's speed, volume, and intonation.
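The conducting interaction described above can be illustrated with a small sketch. The code below is not the SAME or Mobile Conductor implementation; it only assumes that 3-axis accelerometer samples are available from the phone, and it shows one plausible way to turn hand-movement energy into tempo and volume scaling factors for a MIDI performance. All function names, constants, and the mapping itself are hypothetical.

```python
import math

def movement_energy(samples):
    """Mean magnitude of 3-axis accelerometer samples, given as (x, y, z) tuples."""
    return sum(math.sqrt(x * x + y * y + z * z) for x, y, z in samples) / len(samples)

def map_to_performance(energy, rest_level=9.81, span=6.0):
    """Map movement energy to tempo and volume scaling factors.

    Energy near the resting level (gravity only) leaves the performance calm;
    more vigorous conducting gestures speed it up and make it louder.
    The constants are illustrative, not taken from the SAME platform.
    """
    # Normalise energy to [0, 1] relative to the resting acceleration.
    activation = max(0.0, min(1.0, (energy - rest_level) / span))
    tempo_scale = 0.7 + 0.6 * activation   # 0.7x (calm) .. 1.3x (vigorous)
    volume_scale = 0.5 + 0.5 * activation  # half volume .. full volume
    return tempo_scale, volume_scale

if __name__ == "__main__":
    calm = [(0.1, 0.2, 9.8)] * 50                            # phone held nearly still
    vigorous = [(4.0, -3.5, 12.0), (-5.0, 6.0, 7.5)] * 25    # broad conducting gesture
    for label, window in (("calm", calm), ("vigorous", vigorous)):
        tempo, volume = map_to_performance(movement_energy(window))
        print(f"{label}: tempo x{tempo:.2f}, volume x{volume:.2f}")
```

In a real system the scaling factors would be applied continuously to the MIDI playback engine (e.g. rescaling tempo and note velocities per window) rather than printed; the sketch only shows the movement-to-expression mapping step.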



Author information


Corresponding author

Correspondence to Maurizio Mancini.


About this article

Cite this article

Mancini, M., Varni, G., Kleimola, J. et al. Human movement expressivity for mobile active music listening. J Multimodal User Interfaces 4, 27–35 (2010). https://doi.org/10.1007/s12193-010-0047-z

