ABSTRACT
The emotion matching model is suitable for evaluating how well lyrics match a song, but the emotion matching model alone cannot produce a correct evaluation of the lyric-song matching degree. To address this problem, this paper proposes an intelligent evaluation algorithm for the matching degree of lyrics and songs based on LabVIEW digital images. Building on an analysis of the relationship between LabVIEW digital images and musical melody, emotional feature analysis technology, and a description of the algorithm, 300 pieces of music were selected from the MagnaTagATune, MusiClef, and MirexMood datasets to verify the model's matching performance through experiments. The experimental results show that the model can reasonably evaluate the emotional content of music lyrics and produce an accurate, intelligent evaluation of the music.
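The abstract does not specify how the matching degree is computed; a minimal sketch, assuming lyrics and melody are each summarized as a non-negative emotion feature vector (the dimensions, values, and function name below are hypothetical, not taken from the paper), could score the match as a cosine similarity:

```python
import math

def emotion_match_degree(lyric_vec, melody_vec):
    """Cosine similarity between a lyric emotion vector and a melody
    emotion vector. With non-negative feature vectors this yields a
    matching-degree score in [0, 1]. This is an illustrative stand-in
    for the paper's evaluation algorithm, not its actual method."""
    dot = sum(a * b for a, b in zip(lyric_vec, melody_vec))
    norm_l = math.sqrt(sum(a * a for a in lyric_vec))
    norm_m = math.sqrt(sum(b * b for b in melody_vec))
    if norm_l == 0.0 or norm_m == 0.0:
        return 0.0  # no emotional content extracted from one side
    return dot / (norm_l * norm_m)

# Hypothetical 4-dimensional emotion vectors
# (e.g. valence, arousal, tension, energy):
lyric = [0.8, 0.6, 0.2, 0.5]
melody = [0.7, 0.5, 0.3, 0.6]
score = emotion_match_degree(lyric, melody)
```

In a real pipeline the two vectors would come from separate feature extractors (text-based emotion analysis for the lyrics, audio-based analysis for the melody), and the score could then be thresholded or averaged over the 300-song test set.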
Index Terms
- An Intelligent Evaluation Algorithm for the Matching Degree of Music Lyrics Based on LabVIEW Digital Image