Detecting and Adapting to Users’ Cognitive and Affective State to Develop Intelligent Musical Interfaces

Chapter in: New Directions in Music and Human-Computer Interaction

Part of the book series: Springer Series on Cultural Computing (SSCC)

Abstract

In musical instrument interfaces, such as piano keyboards, the player’s communication channels may be limited by the expressivity and resolution of input devices, the expressivity of relevant body parts, and human attention bottlenecks. In this chapter, we consider intelligent musical interfaces that can measure cognitive or affective states implicitly, in real time, to allow musically appropriate adaptations by the system without conscious effort on the part of the user. The chapter focuses on two areas where the detection of cognitive and affective states has been applied to interaction design for music: musical learning (including learning instruments or pieces of music) and musical creativity (including composing and improvisation). The motivation, theory, and technological basis for work of this kind are discussed, and relevant existing work is reviewed. Finally, the design and evaluation of two such systems implemented by the authors, one for musical learning and one for musical creativity, are presented and critiqued.
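
The adaptation pattern sketched in the abstract, implicit sensing driving system-initiated musical change, reduces to a closed control loop: smooth a noisy state estimate, classify it coarsely, and fire an adaptation only when the smoothed estimate crosses a threshold with some hysteresis. The Python sketch below is a minimal illustration of that loop under stated assumptions, not the authors’ implementation: it assumes a scalar workload estimate in [0, 1] (for instance, the output of a brain- or physiology-based classifier), and every name in it (read_workload, adapt, WINDOW, the thresholds) is a hypothetical placeholder.

```python
import random
from collections import deque

# Hypothetical adaptation loop; names and thresholds are illustrative
# placeholders, not taken from the chapter.

WINDOW = 30             # number of samples to smooth over
HIGH, LOW = 0.55, 0.45  # hysteresis band; narrow here so the simulated
                        # signal crosses it; a real system would calibrate
                        # this per user


def read_workload() -> float:
    """Placeholder for a real-time workload estimate in [0, 1], e.g. the
    output of a brain- or physiology-based classifier. Simulated here."""
    return random.random()


def adapt(ease_task: bool) -> None:
    """Placeholder musical adaptation, e.g. removing or restoring a voice
    in the piece being learned."""
    print("easing task" if ease_task else "restoring challenge")


def control_loop(n_steps: int = 500) -> None:
    window = deque(maxlen=WINDOW)
    supporting = False  # whether the system is currently easing the task
    for _ in range(n_steps):
        window.append(read_workload())
        if len(window) < WINDOW:
            continue  # wait until the smoothing window is full
        mean = sum(window) / len(window)
        if mean > HIGH and not supporting:
            supporting = True
            adapt(ease_task=True)    # sustained overload: ease the task
        elif mean < LOW and supporting:
            supporting = False
            adapt(ease_task=False)   # spare capacity: restore the challenge


if __name__ == "__main__":
    control_loop()
```

The hysteresis band (HIGH/LOW) matters musically as well as technically: with a single threshold, a noisy estimate hovering near it would make the system adapt back and forth, and that churn would itself be musically disruptive.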

Acknowledgements

The authors would like to thank Evan M. Peck from Bucknell University, Daniel Afergan from Google Inc., and Paul Lehrman and Kathleen Kuo from Tufts University for discussions on this topic. We thank the National Science Foundation (grant nos. IIS-1065154, IIS-1218170) and Google Inc. for their support of this work.

Author information

Correspondence to Beste F. Yuksel.

Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Yuksel, B.F., Oleson, K.B., Chang, R., Jacob, R.J.K. (2019). Detecting and Adapting to Users’ Cognitive and Affective State to Develop Intelligent Musical Interfaces. In: Holland, S., Mudd, T., Wilkie-McKenna, K., McPherson, A., Wanderley, M. (eds) New Directions in Music and Human-Computer Interaction. Springer Series on Cultural Computing. Springer, Cham. https://doi.org/10.1007/978-3-319-92069-6_11

  • DOI: https://doi.org/10.1007/978-3-319-92069-6_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-92068-9

  • Online ISBN: 978-3-319-92069-6

  • eBook Packages: Computer Science (R0)
