
Emotional Facial Expression Classification for Multimodal User Interfaces

  • Conference paper
Articulated Motion and Deformable Objects (AMDO 2006)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4069)

Abstract

We present a simple and computationally feasible method for the automatic emotional classification of facial expressions. We propose using 10 characteristic points (a subset of the MPEG-4 feature points) to extract the relevant emotional information: essentially five distances, the presence of wrinkles, and the mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned on a database of 399 images. At present the method is applied to static images; its application to image sequences is under development. Extracting such information about the user is of great interest for the development of new multimodal user interfaces.
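The distance-based approach described in the abstract can be sketched in a few lines of code. The snippet below is only an illustrative outline, not the authors' implementation: the feature-point names, the particular distances and the thresholds are assumptions, chosen to show how a handful of inter-point distances, normalised for face scale, could feed simple decision rules for a few of the basic emotions.

```python
import math
from typing import Dict, Tuple

Point = Tuple[float, float]


def dist(a: Point, b: Point) -> float:
    """Euclidean distance between two 2-D feature points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def extract_features(pts: Dict[str, Point]) -> Dict[str, float]:
    """Compute a few characteristic distances, normalised by the inter-ocular
    distance so they are roughly invariant to face scale. The point names are
    hypothetical stand-ins for MPEG-4 feature points."""
    iod = dist(pts["left_eye_centre"], pts["right_eye_centre"])
    return {
        "eye_opening":   dist(pts["left_eye_top"], pts["left_eye_bottom"]) / iod,
        "brow_raise":    dist(pts["left_brow_inner"], pts["left_eye_top"]) / iod,
        "mouth_width":   dist(pts["mouth_left"], pts["mouth_right"]) / iod,
        "mouth_opening": dist(pts["mouth_top"], pts["mouth_bottom"]) / iod,
    }


def classify(f: Dict[str, float]) -> str:
    """Toy decision rules over the distances; in practice the thresholds would
    be tuned on a labelled image database, as the paper does with 399 images."""
    if f["mouth_opening"] > 0.50 and f["brow_raise"] > 0.45:
        return "surprise"
    if f["mouth_width"] > 0.90 and f["eye_opening"] < 0.25:
        return "joy"
    if f["brow_raise"] < 0.25 and f["mouth_opening"] < 0.10:
        return "anger"
    return "neutral"


if __name__ == "__main__":
    # Hypothetical coordinates for a single face image, in pixels.
    points = {
        "left_eye_centre": (30.0, 50.0), "right_eye_centre": (70.0, 50.0),
        "left_eye_top": (30.0, 47.0), "left_eye_bottom": (30.0, 53.0),
        "left_brow_inner": (35.0, 38.0),
        "mouth_left": (35.0, 80.0), "mouth_right": (65.0, 80.0),
        "mouth_top": (50.0, 76.0), "mouth_bottom": (50.0, 84.0),
    }
    print(classify(extract_features(points)))  # -> "neutral" for these values
```

Note that the full method also uses wrinkle presence and mouth shape, which this geometric sketch omits for brevity.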

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cerezo, E., Hupont, I. (2006). Emotional Facial Expression Classification for Multimodal User Interfaces. In: Perales, F.J., Fisher, R.B. (eds) Articulated Motion and Deformable Objects. AMDO 2006. Lecture Notes in Computer Science, vol 4069. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11789239_42

  • DOI: https://doi.org/10.1007/11789239_42

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36031-5

  • Online ISBN: 978-3-540-36032-2

  • eBook Packages: Computer Science (R0)
