Abstract
While listeners’ emotional response to music is the subject of numerous studies, less attention has been paid to the dynamic emotion variations that arise from the interaction between artists and audiences in live improvised music performances. By opening a direct communication channel from audience members to performers, the Mood Conductor system provides an experimental framework to study this phenomenon. Mood Conductor facilitates interactive performances and thus also has an inherent entertainment value. The framework allows audience members to send emotional directions using their mobile devices in order to “conduct” improvised performances. Emotion coordinates indicated by the audience in the arousal-valence space are aggregated and clustered to create a video projection. This is used by the musicians as guidance, and provides visual feedback to the audience. Three different systems have been developed and tested within our framework so far. These systems were trialled in several public performances with different ensembles. Qualitative and quantitative evaluations demonstrated that musicians and audiences were highly engaged with the system, and raised new insights enabling future improvements of the framework.
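The aggregation-and-clustering step described above can be illustrated with a minimal sketch. This is not the Mood Conductor implementation itself; the function name, the grid-binning approach, and the bin size are illustrative assumptions. Audience votes are points in the valence-arousal square [-1, 1]², votes are binned on a coarse grid, and the centroid of the densest bin stands in for the dominant emotional direction shown in the projection.

```python
from collections import defaultdict

def dominant_emotion(points, bin_size=0.25):
    """Aggregate (valence, arousal) votes in [-1, 1]^2 and return the
    centroid of the densest grid cell - a crude stand-in for the
    clustering that drives the video projection.

    points: iterable of (valence, arousal) tuples.
    bin_size: side length of each grid cell (illustrative default).
    """
    bins = defaultdict(list)
    for v, a in points:
        # Quantise each vote to a grid cell identified by integer indices.
        key = (int(v // bin_size), int(a // bin_size))
        bins[key].append((v, a))
    # The densest cell approximates the audience's dominant mood cluster.
    densest = max(bins.values(), key=len)
    n = len(densest)
    return (sum(p[0] for p in densest) / n,
            sum(p[1] for p in densest) / n)
```

A real system would likely use a proper clustering algorithm (e.g. mean shift) and temporal smoothing, but the data flow — collect coordinates, find the dominant cluster, emit one target point — is the same.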
Notes
- 7. Colour codes for some prominent emotions were taken from the following resource: http://www.wefeelfine.org/data/files/feelings.txt.
- 8. See photos at: http://bit.ly/moodcphotos.
- 10. Photos can be found online at http://bit.ly/moodcphotos.
- 11. Audience participants’ questionnaires: session 1: http://bit.ly/mcs1q1 and http://bit.ly/mcs1q2; session 2: http://bit.ly/mcs2q1 and http://bit.ly/mcs2q2.
- 12. Performers’ questionnaire: http://bit.ly/mcperformer.
Acknowledgments
The authors acknowledge the kind contribution of the vocal quartet VoXP, who performed during some of the events detailed in this paper, and Matthias Gregori from SoundCloud Ltd., who implemented the client interface of MC System 1. This work was partly funded by the EPSRC Grant EP/K009559/1, the TSB funded “Making Musical Mood Metadata” project (TS/J002283/1), the EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology (EP/L01632X/1), and the EPSRC funded “Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption” (FAST-IMPACt) project (EP/L019981/1).
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Fazekas, G., Barthet, M., Sandler, M.B. (2014). Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_8
DOI: https://doi.org/10.1007/978-3-319-12976-1_8
Print ISBN: 978-3-319-12975-4
Online ISBN: 978-3-319-12976-1