ABSTRACT
This paper presents a survey in which participants were asked to interpret non-linguistic utterances made by two types of robot: a humanoid robot and a pet-like robot. The study set out to answer whether the interpretation of emotions differed across robot types, participant parameters, and classes of utterance. We found that both male and female subjects were consistently more coherent in interpreting human utterances than animal utterances, and animal utterances than technological utterances. This held true for emotional and intentional interpretation, as well as for the perceived appropriateness of a particular utterance for a particular type of robot. We also found that males and females frequently differed significantly in their emotional and intentional interpretations of utterances. Finally, our results indicate that a robot's morphology influences people's judgment of which class of utterance is deemed appropriate for a particular type of robot.