Abstract
This paper introduces a possible approach for evaluating and predicting listeners’ emotional engagement (EE) during particular musical performances. A set of audio parameters (cues) is extracted from recordings of two contrasting movements from Bach’s Sonatas and Partitas for Solo Violin and compared with listeners’ responses, collected by having them move a slider while listening to the music. The cues showing the highest correlations are then used to generate decision trees and a set of rules for predicting the emotional engagement experienced by potential listeners in similar pieces. The model is tested on two different movements of the Solos, with very promising results.
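The cue-selection step described in the abstract (keeping only the audio cues most correlated with the slider responses) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cue names, toy data, and the 0.5 correlation threshold are all invented for the example.

```python
# Hypothetical sketch of correlation-based cue selection:
# keep cues whose Pearson correlation with the engagement
# slider data exceeds a chosen threshold.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_cues(cues, engagement, threshold=0.5):
    """Return names of cues whose |r| with engagement meets the threshold."""
    return [name for name, series in cues.items()
            if abs(pearson(series, engagement)) >= threshold]

# Toy per-frame data (invented): slider positions and two candidate cues.
engagement = [0.1, 0.3, 0.5, 0.7, 0.9]
cues = {
    "loudness": [0.2, 0.35, 0.5, 0.65, 0.8],  # tracks engagement closely
    "tempo":    [0.9, 0.1, 0.8, 0.2, 0.7],    # mostly uncorrelated noise
}
print(select_cues(cues, engagement))  # → ['loudness']
```

The surviving cues would then feed a decision-tree learner (the paper's references point to C4.5) to derive human-readable prediction rules.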
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Dillon, R. (2006). A Possible Model for Predicting Listeners’ Emotional Engagement. In: Kronland-Martinet, R., Voinier, T., Ystad, S. (eds) Computer Music Modeling and Retrieval. CMMR 2005. Lecture Notes in Computer Science, vol 3902. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11751069_6
Print ISBN: 978-3-540-34027-0
Online ISBN: 978-3-540-34028-7
eBook Packages: Computer Science; Computer Science (R0)