Emotion-based music recommendation by affinity discovery from film music*

https://doi.org/10.1016/j.eswa.2008.09.042

Abstract

With the growth of digital music, music recommendation helps users pick desirable pieces from a huge music repository. Existing music recommendation approaches are based on a user’s preference for music. However, it can sometimes better meet users’ needs to recommend music pieces according to emotions. In this paper, we propose a novel framework for emotion-based music recommendation. The core of the recommendation framework is the construction of a music emotion model by affinity discovery from film music, which plays an important role in conveying emotions in film. We investigate music feature extraction and propose the Music Affinity Graph and Music Affinity Graph-Plus algorithms for constructing the music emotion model. Experimental results show that the proposed emotion-based music recommendation achieves 85% accuracy on average.

Introduction

Digital music has become pervasive in daily life, owing to advances in digital music technology. The rapidly growing demand for music management techniques and digital music applications makes music information retrieval an important research field. The main goal of music information retrieval is to retrieve a specific (set of) music object(s), and it requires users to provide information about the music to be retrieved, such as the title, lyrics, or a hummed tune. An important branch of music information retrieval research is personalized music recommendation. Personalized music recommendation techniques attempt to filter out the music a user dislikes and recommend music the user might like. These techniques typically make recommendations by analyzing a user’s preferences via music-access behavior, rather than user-specified preference information. There are three major approaches to personalized music recommendation. The first is the content-based filtering approach, which analyzes the content of the music users liked in the past and recommends similar music (Kuo & Shan, 2002). The second is the collaborative filtering approach, which recommends music that a peer group with similar preferences liked (Shardanand & Maes, 1995). The third is the hybrid approach, which integrates content and collaborative information for personalized music recommendation (Chen & Chen, 2001; Yoshii et al., 2006).

These recommendation approaches are based on users’ preferences observed from listening behavior. However, the music a user needs is sometimes determined by the user’s emotion or context. Most people experience music every day with an affective response. For example, we may feel cheerful when listening to an excellent performance at a concert, and sad when listening to the music of a late-night movie. Consequently, recommending music according to emotion better meets users’ needs in some cases.

Some researchers have devoted themselves to understanding the relationships between music and emotion from philosophical, musicological, psychological, and anthropological perspectives (Gabrielsson et al., 2001; Tao & Ogihara, 2004). To recommend music based on emotions, the straightforward approach is to apply rules, observed in psychological research, that relate emotions to musical elements. Another possible approach is to learn such rules by training on music pre-labeled with emotion types. However, emotion labeling is time-consuming.

In our work, we propose a generic framework for emotion-based music recommendation by affinity discovery from film music. In particular, we investigate music feature extraction and propose a modified Mixed Media Graph (MMG) algorithm, the Music Affinity Graph (MAG), to discover the relationships between music features and emotions from film music. However, both the MAG and MMG algorithms share a problem: the discrimination power of the discovered features is not necessarily high. That is, a discovered feature might be highly related not only to the query emotions but also to other emotions. Consequently, we propose the Music Affinity Graph-Plus (MAG-Plus) algorithm to take discrimination power into consideration. We also discuss existing research on film emotion detection, which can be used in the recommendation framework to avoid the manual labor of emotion labeling. Potential applications of the proposed emotion-based music recommendation framework include music therapy, score selection for home-video production, background music in shopping malls to stimulate sales, and music playing in context-aware homes to accommodate inhabitants’ emotions.
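To make the discrimination-power idea concrete: a feature strongly associated with every emotion is uninformative for any one of them. The following is a minimal sketch of one way to discount such features, in the spirit of tf-idf; this is an illustrative assumption, not the MAG-Plus formula from the paper, and the co-occurrence counts are made-up toy data.

```python
import math

# Toy feature-emotion co-occurrence counts (illustrative assumptions).
cooccur = {
    "feat_minor_chord": {"sad": 9, "tense": 1},               # emotion-specific
    "feat_fast_tempo":  {"happy": 5, "tense": 5, "sad": 4},   # linked to all emotions
}
N_EMOTIONS = 3

def discriminative_affinity(feature: str, emotion: str) -> float:
    """Affinity of a feature for an emotion, discounted by how many
    emotions the feature co-occurs with (idf-style weighting)."""
    counts = cooccur[feature]
    affinity = counts.get(emotion, 0) / sum(counts.values())
    idf = math.log(N_EMOTIONS / len(counts))  # 0 when linked to every emotion
    return affinity * idf

# The emotion-specific feature outranks the ubiquitous one for "sad".
print(discriminative_affinity("feat_minor_chord", "sad") >
      discriminative_affinity("feat_fast_tempo", "sad"))  # True
```

Under this assumed scheme, a feature that co-occurs with every emotion receives an idf of zero, so only features tied to a subset of emotions contribute to the ranking.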

Section snippets

Framework overview

Fig. 1 shows the process of the proposed generic music recommendation framework. The heart of the framework is the construction of the music emotion model from film music, owing to the close relation between emotion and music in film. Kalinak (1992) claimed that music is “the most efficient code” for emotional expression in film. A film music composer usually composes music according to a scenario, and the purpose of the composition generally agrees with how audiences react to it. …

Music feature extraction

Musical elements that affect emotion include melody, rhythm, tempo, mode, key, harmony, dynamics, and tone color. Among these, melody, mode, tempo, and rhythm have the strongest effects on emotion. Generally speaking, a major scale sounds brighter and happier than a minor one, and a rapid tempo is more exciting or tense than a slow one.
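The major/minor and fast/slow rules of thumb above can be sketched as a tiny rule-based labeler. This is an illustration of those heuristics only, not the paper’s model; the 120 BPM cut-off and the four emotion labels are assumptions chosen for the example.

```python
def coarse_emotion(mode: str, tempo_bpm: float) -> str:
    """Map mode and tempo to a coarse emotion label.

    Encodes only the rules of thumb from the text: major ~ brighter/happier,
    rapid tempo ~ more exciting/tense. The 120 BPM threshold and the label
    set are illustrative assumptions.
    """
    fast = tempo_bpm >= 120  # assumed boundary between slow and rapid tempo
    if mode == "major":
        return "exciting" if fast else "happy"
    return "tense" if fast else "sad"

print(coarse_emotion("major", 140))  # exciting
print(coarse_emotion("minor", 60))   # sad
```

A real system, of course, must weigh melody, rhythm, and the other elements jointly rather than apply two thresholds, which is what motivates learning the feature-emotion affinities from data.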

Take Schubert’s Der Lindenbaum from the Winterreise cycle as an example. It describes a man who, disappointed in love, drifts far from home and recalls the linden tree at his house.

Affinity discovery and music recommendation

Emotion-based music recommendation recommends music pieces corresponding to the query emotions. More precisely, given a query set of emotions, we wish to find the corresponding music features with which to rank the music in the database. The affinities between music features and emotions are discovered from training data. The affinity graph algorithm Mixed Media Graph is adopted and modified for the proposed emotion-based music recommendation.
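Mixed-Media-Graph-style algorithms typically rank nodes by a random walk with restart from the query nodes. The sketch below runs such a walk on a toy emotion–feature–music graph; the graph layout, edge weights, and restart probability are illustrative assumptions, not the paper’s actual MAG construction.

```python
import numpy as np

def rwr_scores(adj: np.ndarray, query: np.ndarray, c: float = 0.15,
               iters: int = 100) -> np.ndarray:
    """Visiting probabilities of a random walk that restarts at the
    query nodes with probability c at every step."""
    col_sums = adj.sum(axis=0)
    W = adj / np.where(col_sums == 0, 1, col_sums)  # column-stochastic
    q = query / query.sum()
    p = q.copy()
    for _ in range(iters):
        p = (1 - c) * W @ p + c * q
    return p

# Toy graph: nodes 0-1 are emotions, 2-4 are music features, 5-6 are pieces.
A = np.zeros((7, 7))
for i, j in [(0, 2), (0, 3), (1, 4), (2, 5), (3, 5), (4, 6)]:
    A[i, j] = A[j, i] = 1.0

q = np.zeros(7)
q[0] = 1.0                       # query: emotion node 0
scores = rwr_scores(A, q)
ranked = sorted([5, 6], key=lambda m: -scores[m])
print(ranked)  # [5, 6]: piece 5 is reachable from emotion 0 via features 2, 3
```

Pieces connected to the query emotion through shared features accumulate probability mass and rank first; a piece with no path from the query (node 6 here) scores zero.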

MMG was proposed to find correlations across the …

Performance evaluation

To evaluate the effectiveness of the proposed music recommendation approach, we performed experiments on a collection of 107 film music pieces from 20 animated films. We chose animated films because their emotions are generally clearer and more explicit. The 20 films include productions of Disney, Studio Ghibli, and DreamWorks, such as The Lion King, Spirited Away, and Shrek. The MIDI files of the Studio Ghibli films were collected from the website “A Dedication to Studio Ghibli Films” (//www.wingsee.com/ghibli/ …

Conclusions

In this paper, we presented a generic framework to recommend music based on emotion. The core of the proposed recommendation framework is the construction of a music emotion model from film music, since music plays an important role in conveying emotions in film. The construction process consists of feature extraction, emotion detection, and association discovery. We proposed feature extraction approaches to extract chord, rhythm, and tempo. For the association discovery …

References (21)

  • Adams, B., et al. (2002). Toward automatic extraction of expressive elements from motion pictures: Tempo. IEEE Transactions on Multimedia.
  • Chan, C. H., & Jones, G. J. F. (2005). Affect-based indexing and retrieval of films. In Proceedings of the 13th ACM...
  • Chen, H. C., & Chen, A. L. P. (2001). A music recommendation system based on music data grouping and user interests. In...
  • Gabrielsson, A., et al. The influence of musical structure on emotional expression.
  • Giannetti, L. (2004). Understanding movies.
  • Hanjalic, A., et al. (2005). Affective video content representation and modeling. IEEE Transactions on Multimedia.
  • Hsu, J. L., Liu, C. C., & Chen, A. L. P. (1998). Efficient repeating pattern finding in music databases. In Proceedings...
  • Kalinak, K. (1992). Settling the score: Music and the classical Hollywood film.
  • Kang, H. B. (2003). Affective content detection using HMMs. In Proceedings of the 11th ACM international conference on...
  • Kuo, F. F., & Shan, M. K. (2002). A personalized music filtering system based on melody style classification. In...

* Part of the content of this paper has been published in the ACM Proceedings of the International Conference on Multimedia, 2005.
