Abstract
This paper presents a methodology for building linked data from the relationships between facial action units and their states, which serve as emotional parameters for facial emotion recognition. The authors focus on building action unit-based linked data because such data can be used not only for facial emotion recognition but also merged with other linked data to enhance their usefulness. Although representing the data as linked data might be expected to lower recognition accuracy compared with other approaches, in practice the proposed method using action unit-based linked data achieves almost the same facial emotion recognition accuracy as approaches based on Artificial Neural Networks or Support Vector Machines.
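The idea of linking action units to emotion labels can be illustrated with a minimal sketch. This is not the authors' implementation: the namespace, triple layout, and matching rule are assumptions, and the AU-to-emotion mappings are illustrative examples drawn from common FACS/EMFACS descriptions rather than from the paper.

```python
# Sketch (hypothetical, not the paper's method): action-unit facts stored
# as linked-data-style (subject, predicate, object) triples, with emotion
# recognition as a simple overlap score against each emotion's AU profile.

EX = "http://example.org/au#"  # hypothetical namespace

# Triples linking emotions to the action units that characterize them.
triples = [
    (EX + "Happiness", EX + "hasActionUnit", EX + "AU6"),   # cheek raiser
    (EX + "Happiness", EX + "hasActionUnit", EX + "AU12"),  # lip corner puller
    (EX + "Surprise",  EX + "hasActionUnit", EX + "AU1"),   # inner brow raiser
    (EX + "Surprise",  EX + "hasActionUnit", EX + "AU2"),   # outer brow raiser
    (EX + "Surprise",  EX + "hasActionUnit", EX + "AU26"),  # jaw drop
]

def recognize(active_aus):
    """Return the emotion whose linked action units best match the input."""
    profiles = {}
    for s, p, o in triples:
        if p == EX + "hasActionUnit":
            profiles.setdefault(s, set()).add(o)

    def score(item):
        emotion, aus = item
        return len(aus & active_aus) / len(aus)

    best = max(profiles.items(), key=score)
    return best[0] if score(best) > 0 else None

detected = recognize({EX + "AU6", EX + "AU12"})
print(detected.rsplit("#", 1)[-1])  # -> Happiness
```

Because the facts are plain triples, the same store could be extended with triples from other vocabularies (e.g., emotion ontologies), which is the merging benefit the abstract describes.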
© 2013 Springer International Publishing Switzerland
Cite this paper
Kaneko, K., Okada, Y. (2013). Action Unit-Based Linked Data for Facial Emotion Recognition. In: Yoshida, T., Kou, G., Skowron, A., Cao, J., Hacid, H., Zhong, N. (eds) Active Media Technology. AMT 2013. Lecture Notes in Computer Science, vol 8210. Springer, Cham. https://doi.org/10.1007/978-3-319-02750-0_22
Print ISBN: 978-3-319-02749-4
Online ISBN: 978-3-319-02750-0