ABSTRACT
This short paper proposes a method for classifying music video clips uploaded to a video sharing service into music mood categories such as 'cheerful,' 'wistful,' and 'aggressive.' The method leverages viewer comments posted on the music video clips for the music mood classification. It extracts specific features from the comments: (1) adjectives in comments, (2) lengthened words in comments, and (3) comments in chorus sections. Experimental results on classifying 695 video clips into six mood categories showed that our method outperformed the baseline in terms of both macro- and micro-averaged F-measures. In addition, our method outperformed existing approaches that utilize the lyrics and audio signals of songs.
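To make feature (2) concrete, the sketch below shows one common way to detect and normalize lengthened words in comments (e.g. "coooool"). This is an illustrative assumption, not the paper's actual implementation: it flags any comment containing a character repeated three or more times and collapses such runs to a single character so that variants of the same word share a token.

```python
import re

# Illustrative sketch (not the paper's implementation): detect "lengthened"
# words in viewer comments, one of the comment features the method exploits.
LENGTHENED = re.compile(r"(\w)\1{2,}")  # a word character repeated 3+ times

def has_lengthening(comment: str) -> bool:
    """Return True if the comment contains a lengthened word."""
    return bool(LENGTHENED.search(comment))

def normalize(comment: str) -> str:
    """Collapse runs of 3+ repeated characters to a single occurrence,
    so 'coooool' becomes 'col'."""
    return LENGTHENED.sub(r"\1", comment)

print(has_lengthening("coooool song!"))  # True
print(has_lengthening("cool song"))      # False (only two repeats)
print(normalize("coooool song!"))        # "col song!"
```

A classifier could then use the presence of lengthening as a binary feature per comment, since such emphatic spellings often signal strong sentiment toward the song.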