A System for Mobile Active Music Listening Based on Social Interaction and Embodiment

Mobile Networks and Applications

Abstract

Social interaction and embodiment are key issues for future User Centric Media. Social networks and games are increasingly characterized by active, physical participation of the users. The integration into mobile devices of a growing number of sensors capturing users’ physical activity (e.g., accelerometers, cameras) and context information (e.g., GPS, location) enables novel systems that connect audiovisual content processing and communication to users’ social behavior, including joint movement and physical engagement. In this paper, a system enabling a novel paradigm for social, active experience of sound and music content is presented. An instance of this system, named Sync‘n’Move, is introduced: it allows two users to explore a multi-channel pre-recorded music piece as the result of their social interaction, and in particular of their synchronization. This research was developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and was presented at the Agora Festival (IRCAM, Centre Pompidou, Paris, June 2009). On that occasion, Sync‘n’Move was evaluated by both expert and non-expert users, and the results are briefly presented. Perspectives on the impact of this novel paradigm and system on future User Centric Media are finally discussed, with a specific focus on the social, active experience of audiovisual content.



Acknowledgements

The authors would like to thank Paolo Coletta for EyesWeb support and development, and Alberto Massari for creating the Python scripts that read and transmit accelerometer data and audio streams on Nokia phones. The authors also thank Norbert Marwan for valuable suggestions on recurrence analysis and Carlo Chiorri for valuable discussions on statistics.

The research described in this paper is partially supported by the EU FP7 ICT SAME project and also benefits from preliminary results obtained in the EU FP7 ICT I-SEARCH project, specifically advances in the retrieval of sound materials based on user interaction; future work within I-SEARCH will include extending the paradigm to audiovisual search and retrieval.

Author information

Corresponding author

Correspondence to Gualtiero Volpe.

About this article

Cite this article

Varni, G., Mancini, M., Volpe, G. et al. A System for Mobile Active Music Listening Based on Social Interaction and Embodiment. Mobile Netw Appl 16, 375–384 (2011). https://doi.org/10.1007/s11036-010-0256-4
