
Modelling Affective-based Music Compositional Intelligence with the Aid of ANS Analyses

  • Conference paper
Research and Development in Intelligent Systems XXIV (SGAI 2007)

Abstract

This research investigates the use of emotion data, derived from analyzing changes in autonomic nervous system (ANS) activity as revealed by brainwave production, to support the creative music compositional intelligence of an adaptive interface. A relational model of the influence of musical events on the listener's affect is first induced using inductive logic programming, with the emotion data and musical score features as inputs to the induction task. The components of composition, such as interval and scale, instrumentation, chord progression, and melody, are automatically combined using a genetic algorithm and melodic transformation heuristics that depend on the predictive knowledge and character of the induced model. For the four targeted basic emotional states, namely stress, joy, sadness, and relaxation, the empirical results reported here show that the system can successfully compose tunes that convey a given one of these affective states.
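The pipeline the abstract describes, an induced affect model scoring candidates while a genetic algorithm combines compositional components, can be sketched roughly as follows. All names here are illustrative, and the scoring function is a hand-written stand-in for the ILP-induced model (a real model would be learned from the ANS-derived emotion labels and score features):

```python
import random

TARGET = "joy"
NOTES = list(range(60, 73))  # MIDI pitches C4..C5

def induced_affect_score(piece, target):
    """Stand-in for the induced relational model: here it simply rewards
    features commonly associated with joy (major-scale pitch classes and
    upward melodic motion)."""
    major = {0, 2, 4, 5, 7, 9, 11}
    in_scale = sum(1 for p in piece if p % 12 in major) / len(piece)
    ascending = sum(1 for a, b in zip(piece, piece[1:]) if b > a) / (len(piece) - 1)
    return in_scale + ascending if target == "joy" else 0.0

def evolve(pop_size=30, length=16, generations=40, seed=0):
    """Evolve a melody toward the target affect under the model's guidance."""
    rng = random.Random(seed)
    pop = [[rng.choice(NOTES) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by the model's predicted affect and keep the top half.
        pop.sort(key=lambda p: induced_affect_score(p, TARGET), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:              # occasional point mutation
                child[rng.randrange(length)] = rng.choice(NOTES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda p: induced_affect_score(p, TARGET))
```

The paper's system additionally applies melodic transformation heuristics and evolves richer structures (chords, instrumentation) than this single melody line; the sketch only shows how an induced model can act as the fitness function.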




Copyright information

© 2008 Springer-Verlag London Limited

About this paper

Cite this paper

Sugimoto, T., Legaspi, R., Ota, A., Moriyama, K., Kurihara, S., Numao, M. (2008). Modelling Affective-based Music Compositional Intelligence with the Aid of ANS Analyses. In: Bramer, M., Coenen, F., Petridis, M. (eds) Research and Development in Intelligent Systems XXIV. SGAI 2007. Springer, London. https://doi.org/10.1007/978-1-84800-094-0_8


  • DOI: https://doi.org/10.1007/978-1-84800-094-0_8

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84800-093-3

  • Online ISBN: 978-1-84800-094-0

  • eBook Packages: Computer Science (R0)
