Abstract
Believable nonverbal behaviors for embodied conversational agents (ECAs) can create a more immersive experience for users and improve the effectiveness of communication. This paper describes a nonverbal behavior generator that analyzes the syntactic and semantic structure of the surface text, as well as the affective state of the ECA, and annotates the surface text with appropriate nonverbal behaviors. The behavior generation rules were extracted by analyzing a number of video clips of people conversing. The system runs in real time and is user-extensible, so users can easily modify or extend the current behavior generation rules.
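A minimal sketch of the rule-based annotation idea described above, assuming a simple keyword-matching front end in place of the paper's syntactic and semantic analysis: each rule inspects the surface text and the agent's affective state, and every matching rule inserts a behavior tag into the text. The rule names, patterns, and tags below are illustrative assumptions, not the system's actual rule set or markup format.

```python
import re
from dataclasses import dataclass, field

@dataclass
class Rule:
    name: str       # human-readable rule name, e.g. "negation -> head shake"
    pattern: str    # regex over the surface text (a stand-in for deeper analysis)
    behavior: str   # behavior tag to insert before the matching word
    affect: set = field(default_factory=set)  # affective states the rule applies to (empty = any)

# Hypothetical, user-extensible rule set; adding a behavior means adding an entry here.
RULES = [
    Rule("negation -> head shake", r"\b(no|not|never)\b", "<head_shake/>"),
    Rule("inclusivity -> sweeping gesture", r"\b(all|every|whole)\b", "<gesture type='sweep'/>"),
    Rule("intensifier under distress -> brow frown", r"\b(very|really)\b", "<brow_frown/>", {"distress"}),
]

def annotate(text: str, affective_state: str) -> str:
    """Annotate the surface text with behavior tags from every matching rule."""
    annotated = text
    for rule in RULES:
        # Skip rules restricted to affective states other than the current one.
        if rule.affect and affective_state not in rule.affect:
            continue
        # Insert the behavior tag immediately before the first matching word.
        annotated = re.sub(rule.pattern, rule.behavior + r" \g<0>",
                           annotated, count=1, flags=re.IGNORECASE)
    return annotated

if __name__ == "__main__":
    # Prints: <head_shake/> No, I <brow_frown/> really cannot do that.
    print(annotate("No, I really cannot do that.", "distress"))
```

Because the rules live in a plain data structure rather than in code, extending or modifying the behavior repertoire amounts to editing the rule list, which loosely mirrors the user-extensibility the abstract describes.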
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Lee, J., Marsella, S. (2006). Nonverbal Behavior Generator for Embodied Conversational Agents. In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds.) Intelligent Virtual Agents. IVA 2006. Lecture Notes in Computer Science, vol. 4133. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11821830_20
DOI: https://doi.org/10.1007/11821830_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37593-7
Online ISBN: 978-3-540-37594-4