Abstract
Music mood analysis is crucial for music applications such as search and recommendation. At present, music mood classification relies mainly on manual annotation or on music information extracted from audio and lyrics. However, manual annotation requires a large number of users, and acquiring and processing audio or lyric information is complicated. A new, simpler way to analyze music mood is therefore needed. Since music mood is a psychological response produced by various musical elements acting together on the listener, in this paper we instead use information about the users themselves, such as their physiology and activity data. The development of wearable devices makes it possible to record such user lifelogs. Experimental results suggest that classification based on user information can effectively identify music mood, and that integrating it with classification based on music information further improves recognition performance.
This work is supported by the Natural Science Foundation of China (Grant Nos. 61672311 and 61532011).
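As a minimal sketch of the kind of approach the abstract describes (not the authors' actual pipeline), the Python example below trains one classifier on hypothetical lifelog-style user features (heart rate, skin response, step count, time of day) and another on hypothetical music features, then fuses their class-probability outputs by simple averaging. All feature names, data, model choices, and fusion weights here are illustrative assumptions.

# Hypothetical sketch only: not the authors' implementation. Illustrates
# (1) mood classification from lifelog-style user features and
# (2) late fusion with a music-information classifier by averaging probabilities.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Assumed lifelog features per listening event (illustrative):
# mean heart rate, galvanic skin response, step count, hour of day
X_lifelog = rng.normal(size=(n, 4))
# Assumed music features (illustrative): e.g. tempo, energy, spectral statistics
X_music = rng.normal(size=(n, 6))
# Mood labels, e.g. the four quadrants of a valence/arousal model
y = rng.integers(0, 4, size=n)

# One classifier per information source
lifelog_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_lifelog, y)
music_clf = LogisticRegression(max_iter=1000).fit(X_music, y)

# Late fusion: average the class-probability outputs of the two models
proba = 0.5 * lifelog_clf.predict_proba(X_lifelog) + 0.5 * music_clf.predict_proba(X_music)
fused_pred = proba.argmax(axis=1)
print(fused_pred[:10])

The equal fusion weight is an arbitrary assumption; in practice it would be tuned on held-out data.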
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Tong, H., et al. (2018). Music Mood Classification Based on Lifelog. In: Zhang, S., Liu, T.-Y., Li, X., Guo, J., Li, C. (eds.) Information Retrieval. CCIR 2018. Lecture Notes in Computer Science, vol. 11168. Springer, Cham. https://doi.org/10.1007/978-3-030-01012-6_5
DOI: https://doi.org/10.1007/978-3-030-01012-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01011-9
Online ISBN: 978-3-030-01012-6
eBook Packages: Computer Science (R0)