
Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework

  • Conference paper
  • Sound, Music, and Motion (CMMR 2013)
  • Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8905)

Abstract

While listeners’ emotional response to music is the subject of numerous studies, less attention has been paid to the dynamic emotion variations arising from the interaction between artists and audiences in live improvised music performances. By opening a direct communication channel from audience members to performers, the Mood Conductor system provides an experimental framework to study this phenomenon. Mood Conductor facilitates interactive performances and thus also has an inherent entertainment value. The framework allows audience members to send emotional directions from their mobile devices in order to “conduct” improvised performances. Emotion coordinates indicated by the audience in the arousal-valence space are aggregated and clustered to create a video projection, which the musicians use as guidance and which provides visual feedback to the audience. Three different systems have been developed and tested within our framework so far, and were trialled in several public performances with different ensembles. Qualitative and quantitative evaluations demonstrated that musicians and audiences were highly engaged with the system, and yielded new insights enabling future improvements of the framework.
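To make the aggregation step concrete, the following is a minimal sketch, not the authors' implementation: it assumes audience votes arrive as (valence, arousal) pairs in the [-1, 1] square and clusters them with mean shift (scikit-learn's implementation; the bandwidth value and the dominant_emotions helper are illustrative assumptions) to find the dominant emotional directions that would drive the video projection.

    import numpy as np
    from sklearn.cluster import MeanShift

    def dominant_emotions(points, bandwidth=0.3):
        """Cluster audience (valence, arousal) votes; rank clusters by size.

        points  -- array-like of shape (n_votes, 2), coordinates in [-1, 1]
        returns -- list of (cluster_center, vote_count), largest cluster first
        """
        X = np.asarray(points, dtype=float)
        ms = MeanShift(bandwidth=bandwidth).fit(X)
        counts = np.bincount(ms.labels_)          # votes per cluster
        order = np.argsort(counts)[::-1]          # most-voted cluster first
        return [(ms.cluster_centers_[i], int(counts[i])) for i in order]

    # Simulated votes: most of the audience asks for high-valence,
    # high-arousal music ("excited"); a few ask for the opposite corner.
    votes = [(0.8, 0.7), (0.75, 0.8), (0.7, 0.65), (-0.6, -0.5), (-0.55, -0.6)]
    for center, n in dominant_emotions(votes):
        print(f"valence={center[0]:+.2f}, arousal={center[1]:+.2f}: {n} votes")

The largest cluster would be rendered most prominently in the projection; whether the deployed systems used mean shift or another clustering method is not stated in this excerpt.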


Notes

  1. http://bit.ly/moodxp2.

  2. http://www.youtube.com/watch?v=o9Fd7nV2IWs.

  3. https://www.facebook.com/VoXPerformance.

  4. http://bit.ly/moodxp2.

  5. http://www.cherrypy.org/ (a sketch of a minimal vote endpoint follows these notes).

  6. www.pygame.org.

  7. Colour codes for some prominent emotions were taken from the following resource: http://www.wefeelfine.org/data/files/feelings.txt.

  8. See photos at: http://bit.ly/moodcphotos.

  9. http://cmmr2012.eecs.qmul.ac.uk/music-programme.

  10. Photos can be found online at http://bit.ly/moodcphotos.

  11. Audience participants’ questionnaires: session 1: http://bit.ly/mcs1q1 and http://bit.ly/mcs1q2; session 2: http://bit.ly/mcs2q1 and http://bit.ly/mcs2q2.

  12. Performers’ questionnaire: http://bit.ly/mcperformer.

  13. http://lanyrd.com/scmwgx and http://lanyrd.com/scmxby.

  14. http://bit.ly/mcbarbican.

  15. http://bit.ly/mc_acii2013_video.
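Note 5 above indicates that the back-end server was built with CherryPy. The following is a minimal, hypothetical sketch of how such a server might accept audience votes posted from mobile browsers; the endpoint name, JSON payload format, and in-memory vote store are illustrative assumptions, not the authors' actual interface.

    import threading

    import cherrypy

    class MoodVotes:
        """Accepts audience (valence, arousal) votes posted as JSON."""

        def __init__(self):
            self._lock = threading.Lock()
            self.votes = []  # accumulated (valence, arousal) tuples

        @cherrypy.expose
        @cherrypy.tools.json_in()
        @cherrypy.tools.json_out()
        def vote(self):
            # Hypothetical payload, e.g. {"valence": 0.8, "arousal": 0.7};
            # clamp both coordinates to the [-1, 1] arousal-valence square.
            data = cherrypy.request.json
            v = max(-1.0, min(1.0, float(data["valence"])))
            a = max(-1.0, min(1.0, float(data["arousal"])))
            with self._lock:
                self.votes.append((v, a))
            return {"status": "ok", "total_votes": len(self.votes)}

    if __name__ == "__main__":
        cherrypy.quickstart(MoodVotes(), "/")

A client would POST JSON such as {"valence": 0.8, "arousal": 0.7} to /vote; the visualisation layer (pygame, note 6) could then periodically read the accumulated votes for clustering and projection.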


Acknowledgments

The authors acknowledge the kind contribution of the vocal quartet VoXP, who performed during some of the events detailed in this paper, and Matthias Gregori from SoundCloud Ltd., who implemented the client interface of MC System 1. This work was partly funded by EPSRC Grant EP/K009559/1, the TSB-funded “Making Musical Mood Metadata” project (TS/J002283/1), the EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology (EP/L01632X/1), and the EPSRC-funded “Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption” (FAST-IMPACt) project (EP/L019981/1).

Author information


Corresponding author

Correspondence to György Fazekas.



Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Fazekas, G., Barthet, M., Sandler, M.B. (2014). Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol. 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_8


  • DOI: https://doi.org/10.1007/978-3-319-12976-1_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12975-4

  • Online ISBN: 978-3-319-12976-1

  • eBook Packages: Computer Science, Computer Science (R0)
