
Sound design for emotion and intention expression of socially interactive robots

  • Original Research Paper
  • Published in Intelligent Service Robotics

Abstract

The current concept of robots has been strongly shaped by the image of robots in science fiction. As robots enter human society as partners, the importance of human–robot interaction has grown. In this paper, we design seven musical sounds for the English-teaching robot Silbot: five that express intention and two that express emotion. To identify sound design considerations, we analyzed the sounds of two movie robots, R2-D2 from Star Wars and Wall-E from Wall-E. The analysis showed that intonation, pitch, and timbre are the dominant musical parameters for expressing intention and emotion. To validate the designed sounds, we performed a recognition-rate experiment, which showed that the five intention sounds and the two emotion sounds are sufficient to convey their intended intentions and emotions.
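As a concrete illustration of the design space the abstract describes, the sketch below synthesizes a short non-verbal cue whose pitch contour (intonation) and harmonic mix (timbre) can be varied independently. This is a minimal Python sketch under stated assumptions: the frequencies, durations, and the mapping of rising versus falling contours to intentions are illustrative placeholders, not the authors' actual Silbot sounds.

```python
# Hypothetical sketch: shaping a short robot cue by varying pitch contour
# (intonation) and harmonic content (timbre). Parameter choices here are
# illustrative assumptions, not the Silbot sound set from the paper.
import numpy as np
import wave

SAMPLE_RATE = 44100

def synthesize_cue(f_start, f_end, duration=0.4, harmonics=(1.0, 0.5, 0.25)):
    """Render a tone whose pitch glides from f_start to f_end (Hz).

    A rising glide is used here as a question-like intention and a falling
    glide as a statement-like one (an assumption for illustration).
    `harmonics` gives relative overtone amplitudes, a crude stand-in for timbre.
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    # Linear pitch glide; integrate frequency over time to obtain phase.
    freq = np.linspace(f_start, f_end, t.size)
    phase = 2.0 * np.pi * np.cumsum(freq) / SAMPLE_RATE
    signal = sum(a * np.sin((k + 1) * phase) for k, a in enumerate(harmonics))
    # 20 ms fade-in/out envelope to avoid clicks at the cue boundaries.
    env = np.minimum(1.0, np.minimum(t, duration - t) / 0.02)
    return (signal / np.max(np.abs(signal))) * env

def write_wav(path, samples):
    # Convert to 16-bit PCM and write a mono WAV file.
    pcm = (samples * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())

# Rising contour ~ "asking"; falling contour ~ "affirming" (placeholders).
write_wav("cue_ask.wav", synthesize_cue(440.0, 660.0))
write_wav("cue_affirm.wav", synthesize_cue(660.0, 440.0))
```

Changing the `harmonics` tuple alters perceived timbre while leaving pitch and contour untouched, which is the kind of independent control over musical parameters that the analysis of R2-D2 and Wall-E points to.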

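The recognition-rate experiment mentioned in the abstract reduces to a simple per-sound accuracy measure: for each designed sound, the fraction of listeners who identified its intended intention or emotion. Below is a minimal sketch of that computation; the labels and response data are made-up placeholders, not the paper's categories or results.

```python
# Hypothetical sketch of the recognition-rate analysis: each trial pairs a
# sound's intended label with the label a listener perceived. Data below
# are placeholders, not results from the paper.
from collections import Counter

def recognition_rates(trials):
    """trials: iterable of (intended_label, perceived_label) pairs.

    Returns the fraction of correct identifications per intended label.
    """
    totals, correct = Counter(), Counter()
    for intended, perceived in trials:
        totals[intended] += 1
        if perceived == intended:
            correct[intended] += 1
    return {label: correct[label] / totals[label] for label in totals}

# Example: three listeners judging two cues (placeholder labels and data).
trials = [
    ("greeting", "greeting"), ("greeting", "happiness"), ("greeting", "greeting"),
    ("happiness", "happiness"), ("happiness", "happiness"), ("happiness", "greeting"),
]
print(recognition_rates(trials))  # {'greeting': 0.667, 'happiness': 0.667}
```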



Author information

Correspondence to Eun-Sook Jee.


Cite this article

Jee, E.S., Jeong, Y.J., Kim, C.H. et al. Sound design for emotion and intention expression of socially interactive robots. Intel Serv Robotics 3, 199–206 (2010). https://doi.org/10.1007/s11370-010-0070-7

