
MoodMusic: a method for cooperative, generative music playlist creation

Published: 16 October 2011

Abstract

Music is a major element of social gatherings. However, creating playlists that suit everyone's tastes and the mood of the group can require a large amount of manual effort. In this paper, we present MoodMusic, a method to dynamically generate contextually appropriate music playlists for groups of people. MoodMusic uses speaker pitch and intensity in the conversation to determine the current 'mood'. MoodMusic then queries the online music libraries of the speakers to choose songs appropriate for that mood. This allows groups to listen to music appropriate for their current mood without managing playlists. This work contributes a novel method for dynamically creating music playlists for groups based on their music preferences and current mood.
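
The abstract only outlines the pipeline (conversational speech features, a mood estimate, then song selection from the group's combined libraries). Below is a minimal illustrative sketch of such a pipeline, assuming a two-dimensional arousal/valence mood representation; the names (Song, estimate_mood, pick_songs) and the feature-to-mood heuristics are hypothetical and do not reproduce the authors' implementation.

```python
import statistics
from dataclasses import dataclass

@dataclass
class Song:
    title: str
    energy: float   # 0..1, hypothetical per-track annotation
    valence: float  # 0..1, hypothetical per-track annotation

def estimate_mood(pitches_hz, intensities_db):
    """Map conversational pitch/intensity statistics to a crude
    (arousal, valence) estimate in [0, 1] x [0, 1].

    Illustrative heuristic only: louder speech is read as higher
    arousal, higher median pitch as (very roughly) more positive valence.
    """
    arousal = min(1.0, max(0.0, (statistics.mean(intensities_db) - 40) / 40))
    valence = min(1.0, max(0.0, (statistics.median(pitches_hz) - 100) / 200))
    return arousal, valence

def pick_songs(shared_library, mood, k=5):
    """Return the k tracks from the group's pooled libraries whose
    (energy, valence) annotations lie closest to the estimated mood."""
    arousal, valence = mood
    return sorted(
        shared_library,
        key=lambda s: (s.energy - arousal) ** 2 + (s.valence - valence) ** 2,
    )[:k]

# Example: a lively, loud conversation pulls up the high-energy track.
library = [Song("Calm Piece", 0.2, 0.5), Song("Party Track", 0.9, 0.8)]
mood = estimate_mood(pitches_hz=[180, 210, 240], intensities_db=[65, 70, 72])
print([s.title for s in pick_songs(library, mood, k=1)])
```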

Published In

UIST '11 Adjunct: Proceedings of the 24th annual ACM symposium adjunct on User interface software and technology
October 2011
108 pages
ISBN:9781450310147
DOI:10.1145/2046396

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 16 October 2011


Author Tags

  1. affective computing
  2. audio interfaces
  3. collaborative filtering
  4. conversation analysis
  5. mood detection
  6. music

Qualifiers

  • Poster

Conference

UIST '11

Acceptance Rates

Overall Acceptance Rate 355 of 1,733 submissions, 20%

Cited By

  • (2024) Surveying More Than Two Decades of Music Information Retrieval Research on Playlists. ACM Transactions on Intelligent Systems and Technology 15(6), 1-68. https://doi.org/10.1145/3688398. Online publication date: 12-Aug-2024
  • (2019) Detecting and Adapting to Users’ Cognitive and Affective State to Develop Intelligent Musical Interfaces. Methionine Dependence of Cancer and Aging, 163-177. https://doi.org/10.1007/978-3-319-92069-6_11. Online publication date: 7-Feb-2019
  • (2018) Reflektor. Proceedings of the 2018 ACM International Conference on Supporting Group Work, 27-38. https://doi.org/10.1145/3148330.3148331. Online publication date: 7-Jan-2018
  • (2016) Recent developments in affective recommender systems. Physica A: Statistical Mechanics and its Applications 461, 182-190. https://doi.org/10.1016/j.physa.2016.05.046. Online publication date: Nov-2016
  • (2015) Segmenting music library for generation of playlist using machine learning. 2015 IEEE International Conference on Electro/Information Technology (EIT), 421-425. https://doi.org/10.1109/EIT.2015.7293429. Online publication date: May-2015
  • (2013) A social crowd-controlled orchestra. Proceedings of the 2013 conference on Computer supported cooperative work companion, 267-272. https://doi.org/10.1145/2441955.2442018. Online publication date: 23-Feb-2013
