Abstract
This paper evaluates three different interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications exploiting mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal, applying a nonlinear time-varying filtering technique. Sync’n’Move operates on multi-track music content, making individual instruments emerge or fade away. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.
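The abstract does not spell out how the phase synchronisation index is computed. As an illustrative sketch only (not the paper's actual method, which may be recurrence-based), a common index of this kind is the phase-locking value: the magnitude of the mean of the complex phase differences between the two participants' gesture phase streams, yielding 1 for perfect locking and values near 0 for unrelated phases.

```python
import cmath

def phase_locking_value(phases_a, phases_b):
    """Phase-locking value: |mean of exp(i*(phi_a - phi_b))|.

    phases_a, phases_b: instantaneous gesture phases in radians,
    sampled at the same instants. Returns a value in [0, 1]:
    1.0 = perfectly phase-locked, near 0 = no stable phase relation.
    """
    n = len(phases_a)
    total = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(total) / n

# Hypothetical example: one dyad locked with a constant lag, one drifting.
locked = [(0.1 * k, 0.1 * k + 0.5) for k in range(200)]
drift = [(0.1 * k, 0.17 * k) for k in range(200)]

plv_locked = phase_locking_value(*zip(*locked))  # ~1.0: constant lag is still perfect locking
plv_drift = phase_locking_value(*zip(*drift))    # much lower: phases drift apart
```

In a sonification such as Sync’n’Moog, an index of this type could then be mapped to an audio parameter (for example, a filter cutoff), though the concrete mapping used in the paper is described in its body, not here.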
Cite this article
Varni, G., Dubus, G., Oksanen, S. et al. Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices. J Multimodal User Interfaces 5, 157–173 (2012). https://doi.org/10.1007/s12193-011-0079-z