ABSTRACT
This paper describes a real-time music-arranging system that reacts to immediate affective cues from a listener. We collected data on the potential of specific musical dimensions to change a listener's affective state, using sound files composed and produced explicitly for the experiment and then segmented and re-assembled along those dimensions. From the listener data we developed a probabilistic state-transition model that infers the listener's current affective state, and a second model that selects music segments and re-arranges ('re-mixes') them to induce a target affective state. We propose that this approach offers a new perspective for characterizing musical preference.
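The abstract's two models can be pictured as a discrete Markov-style transition table plus a greedy segment-selection policy. The sketch below is a minimal illustration under assumed names and numbers (the states, segment labels, and probabilities are hypothetical, not the authors' actual model): given an inferred current state and a target state, it picks the segment type whose transition probability toward the target is highest.

```python
# Hypothetical sketch, not the paper's actual model: a discrete transition
# table over two coarse affective states, and a greedy segment selector.

STATES = ("low-arousal", "high-arousal")

# P(next_state | current_state, segment_type). All numbers are invented
# for illustration; in the paper these would be estimated from listener data.
TRANSITIONS = {
    ("low-arousal", "fast-tempo"):  {"low-arousal": 0.2, "high-arousal": 0.8},
    ("low-arousal", "slow-tempo"):  {"low-arousal": 0.9, "high-arousal": 0.1},
    ("high-arousal", "fast-tempo"): {"low-arousal": 0.1, "high-arousal": 0.9},
    ("high-arousal", "slow-tempo"): {"low-arousal": 0.7, "high-arousal": 0.3},
}

def pick_segment(current, target, segments=("fast-tempo", "slow-tempo")):
    """Return the segment type most likely to move the listener toward target."""
    return max(segments, key=lambda s: TRANSITIONS[(current, s)][target])

print(pick_segment("low-arousal", "high-arousal"))  # fast-tempo
```

A real re-mixer would re-estimate the transition probabilities online from affective feedback and plan over sequences of segments rather than choosing one segment greedily, but the table-plus-argmax structure conveys the basic idea.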