
Action Unit-Based Linked Data for Facial Emotion Recognition

  • Conference paper

Active Media Technology (AMT 2013)

Part of the book series: Lecture Notes in Computer Science, vol. 8210

Abstract

This paper presents a methodology for building linked data from the relationships between facial action units and their states, treated as emotional parameters for facial emotion recognition. The authors focus in particular on building action unit-based linked data, because such data can not only be used for facial emotion recognition but can also be made more useful by merging it with other linked data. Although representing the knowledge as linked data might be expected to lower recognition accuracy compared with other approaches, in practice the proposed method using action unit-based linked data achieves almost the same facial emotion recognition accuracy as approaches based on Artificial Neural Networks or Support Vector Machines.
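The core idea of the abstract can be sketched as subject-predicate-object triples linking action units, their states, and emotion labels. The following minimal Python sketch uses plain tuples in place of an RDF store; the prefixes, AU identifiers, and the AU-to-emotion mapping are illustrative assumptions, not the paper's actual vocabulary.

```python
# Action unit-based linked data as subject-predicate-object triples.
# All URIs/names below are hypothetical stand-ins for the paper's vocabulary.
triples = [
    # FACS-style action units and their observed states
    ("au:AU6",  "rdf:type",    "au:ActionUnit"),   # cheek raiser
    ("au:AU12", "rdf:type",    "au:ActionUnit"),   # lip corner puller
    ("au:AU6",  "au:hasState", "au:Activated"),
    ("au:AU12", "au:hasState", "au:Activated"),
    # an emotion label linked to the action units it requires
    ("em:Happiness", "au:requiresUnit", "au:AU6"),
    ("em:Happiness", "au:requiresUnit", "au:AU12"),
]

def match(triples, s=None, p=None, o=None):
    """Return triples matching the given pattern (None is a wildcard)."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Recognition sketch: an emotion is matched when every action unit it
# requires is among the activated units observed in the face.
activated = {s for (s, _, _) in match(triples, p="au:hasState", o="au:Activated")}
required = {o for (_, _, o) in match(triples, s="em:Happiness", p="au:requiresUnit")}
print(required <= activated)  # True
```

Because the same triples can be serialized as RDF, they could in principle be merged with external vocabularies, which is the interoperability benefit the abstract emphasizes.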




Copyright information

© 2013 Springer International Publishing Switzerland

About this paper

Cite this paper

Kaneko, K., Okada, Y. (2013). Action Unit-Based Linked Data for Facial Emotion Recognition. In: Yoshida, T., Kou, G., Skowron, A., Cao, J., Hacid, H., Zhong, N. (eds) Active Media Technology. AMT 2013. Lecture Notes in Computer Science, vol 8210. Springer, Cham. https://doi.org/10.1007/978-3-319-02750-0_22


  • DOI: https://doi.org/10.1007/978-3-319-02750-0_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-02749-4

  • Online ISBN: 978-3-319-02750-0

  • eBook Packages: Computer Science (R0)
