Abstract
Large-scale text-dialog corpora with emotion tags are needed to build a knowledge base for emotional reasoning from text. However, emotion-tag annotation is known to be unstable, because text dialogs lack the non-linguistic cues (e.g., speech prosody and facial expressions) present in face-to-face conversation. We aimed to construct a stable, usable text-dialog corpus with emotion tags, and focused on the facial expressions drawn in comics: some comics contain many text dialogs that resemble everyday conversation and are therefore worth analyzing. We extracted 29,538 sentences from 10 comic books and annotated them with face tags and emotion tags. Two annotators independently assigned “temporary face/emotion tags” to the stories and then agreed on the “correct face/emotion tags” through discussion, yielding 16,635 correct emotion tags. We evaluated both the stability and the usability of the corpus. To assess stability, we measured the correspondence between temporary and correct tags: precision was 83.8% and recall was 78.8%, both higher than for annotation without facial expressions (precision = 56.2%, recall = 51.5%). To assess usability, we extracted emotional suffix expressions from the corpus using a probabilistic method. We could thus construct a text-dialog corpus with emotion tags and confirm its stability and usability.
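The stability evaluation above compares each annotator's temporary tags against the agreed-upon correct tags via precision and recall. A minimal sketch of that comparison, assuming tags are stored as per-utterance sets (the function and data names here are illustrative; the paper does not publish code):

```python
def tag_agreement(temporary, correct):
    """Precision/recall of temporary emotion tags against the
    agreed-upon correct tags. Both arguments map utterance IDs
    to sets of emotion-tag labels."""
    # A temporary tag counts as a match if the correct tag set
    # for the same utterance contains it.
    matched = sum(len(tags & correct.get(uid, set()))
                  for uid, tags in temporary.items())
    n_temp = sum(len(tags) for tags in temporary.values())
    n_corr = sum(len(tags) for tags in correct.values())
    precision = matched / n_temp if n_temp else 0.0
    recall = matched / n_corr if n_corr else 0.0
    return precision, recall

# Toy example: 2 of 3 temporary tags survive the discussion.
temp = {"u1": {"joy"}, "u2": {"anger"}, "u3": {"sadness"}}
corr = {"u1": {"joy"}, "u2": {"surprise"}, "u3": {"sadness"}}
p, r = tag_agreement(temp, corr)
print(p, r)  # both 2/3 in this toy case
```

In the paper's setting, precision drops when an annotator's temporary tags are later rejected in discussion, and recall drops when correct tags were missed at the temporary stage.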
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Tokuhisa, M., Murakami, J., Ikehara, S. (2006). Construction and Evaluation of Text-Dialog Corpus with Emotion Tags Focusing on Facial Expression in Comics. In: Gabrys, B., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2006. Lecture Notes in Computer Science(), vol 4253. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893011_91
Print ISBN: 978-3-540-46542-3
Online ISBN: 978-3-540-46544-7