The Role of Time in Music Emotion Recognition: Modeling Musical Emotions from Time-Varying Music Features

Conference paper
From Sounds to Music and Emotions (CMMR 2012)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7900)

Abstract

Music is widely perceived as expressive of emotion. However, there is no consensus on which factors in music contribute to the expression of emotions, making it difficult to find robust objective predictors for music emotion recognition (MER). Currently, MER systems use supervised learning to map static (non-time-varying) feature vectors into regions of an emotion space guided by human annotations. In this work, we argue that time is neglected in MER even though musical experience is intrinsically temporal. We propose that the temporal variation of music features, rather than static feature values, should be used as predictors in MER, because the temporal evolution of musical sounds lies at the core of the cognitive processes that regulate the emotional response to music. We criticize the traditional machine-learning approach to MER and then review recent proposals to exploit the temporal variation of music features to predict time-varying ratings of emotion over the course of a piece. Finally, we discuss the representation of musical time as the flow of musical information rather than clock time. Musical time is experienced through auditory memory, so music emotion recognition should exploit cognitive properties of music listening such as repetition and expectation.
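
The contrast drawn above, between mapping static feature vectors into an emotion space and predicting from the temporal variation of features, can be made concrete with a small sketch. The Python code below is purely illustrative and is not the authors' model: it assumes a hypothetical frame-wise feature matrix X (one row per analysis frame, e.g., loudness and spectral centroid per second) and an aligned sequence y of continuous valence ratings, and it contrasts a static least-squares mapping with one built from lagged frame-to-frame feature differences.

import numpy as np

def fit_linear(X, y):
    # Ordinary least squares with a bias term.
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict_linear(w, X):
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    return A @ w

def temporal_design(X, lags=3):
    # Replace raw feature values with their frame-to-frame variation
    # (first differences) over the previous `lags` frames, so the predictor
    # encodes how the music is changing rather than what it currently is.
    # (np.roll wraps around at the start; a real implementation would mask
    # or trim the first `lags` frames.)
    dX = np.diff(X, axis=0, prepend=X[:1])
    return np.hstack([np.roll(dX, k, axis=0) for k in range(lags)])

# Static baseline: each frame's feature vector predicts that frame's rating.
# w_static = fit_linear(X, y)
# Time-varying alternative: recent feature variation predicts the rating.
# w_dynamic = fit_linear(temporal_design(X), y)

Even this crude lagged-difference design lets the recent temporal evolution of the sound, rather than its instantaneous feature values, drive the predicted emotion trajectory, which is the direction the paper argues MER should take.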

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Caetano, M., Mouchtaris, A., Wiering, F. (2013). The Role of Time in Music Emotion Recognition: Modeling Musical Emotions from Time-Varying Music Features. In: Aramaki, M., Barthet, M., Kronland-Martinet, R., Ystad, S. (eds) From Sounds to Music and Emotions. CMMR 2012. Lecture Notes in Computer Science, vol 7900. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41248-6_10

  • DOI: https://doi.org/10.1007/978-3-642-41248-6_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41247-9

  • Online ISBN: 978-3-642-41248-6

  • eBook Packages: Computer Science, Computer Science (R0)
