
Facial emotional classification: from a discrete perspective to a continuous emotional space

  • Theoretical Advances
  • Published in Pattern Analysis and Applications

Abstract

User emotion detection is a very useful input for developing affective computing strategies in modern human–computer interaction. In this paper, an effective system for facial emotional classification is described. The main distinguishing feature of our work is that the system does not simply provide a classification in terms of a set of discrete emotional labels, but operates in a continuous 2D emotional space, enabling a wide range of intermediate emotional states to be obtained. As output, an expressional face is represented as a point in a 2D space characterized by evaluation and activation factors. The classification method is based on a novel combination of five classifiers and takes human assessment into account when evaluating the results. The system has been tested on an extensive universal database, so it can analyze any subject, male or female, of any age and ethnicity. The results are very encouraging and show that our classification strategy is consistent with the emotional classification mechanisms of the human brain.
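
To make the mapping from classifier outputs to the 2D evaluation–activation space concrete, the sketch below shows one plausible way to fuse the confidences of several discrete-emotion classifiers and project the fused result onto a point in that space. It is a minimal illustration, not the authors' exact method: the emotion labels, the simple averaging scheme, and the evaluation/activation coordinates (loosely inspired by Whissell-style placements) are all assumptions introduced here for clarity.

```python
# Illustrative sketch only (not the paper's exact pipeline): fuse the outputs
# of several discrete-emotion classifiers and project the fused distribution
# onto a 2D evaluation-activation plane.

import numpy as np

# Hypothetical evaluation/activation coordinates for six basic emotions plus
# neutral, on a normalized [-1, 1] x [-1, 1] plane. Values are illustrative
# assumptions, not the coordinates used by the authors.
EMOTION_COORDS = {
    "joy":      ( 0.8,  0.5),
    "sadness":  (-0.7, -0.4),
    "anger":    (-0.6,  0.7),
    "fear":     (-0.5,  0.8),
    "disgust":  (-0.7,  0.2),
    "surprise": ( 0.2,  0.9),
    "neutral":  ( 0.0,  0.0),
}
LABELS = list(EMOTION_COORDS)

def fuse_classifiers(prob_rows):
    """Average the label confidences produced by each classifier.

    prob_rows: list of dicts mapping emotion label -> confidence in [0, 1].
    Returns a normalized probability vector over LABELS.
    """
    mat = np.array([[row.get(lbl, 0.0) for lbl in LABELS] for row in prob_rows])
    fused = mat.mean(axis=0)
    return fused / fused.sum()

def to_emotional_space(fused_probs):
    """Map fused label probabilities to a point (evaluation, activation)."""
    coords = np.array([EMOTION_COORDS[lbl] for lbl in LABELS])
    return tuple(fused_probs @ coords)

if __name__ == "__main__":
    # Example: five classifiers, each giving confidences for one test face.
    outputs = [
        {"joy": 0.6, "surprise": 0.3, "neutral": 0.1},
        {"joy": 0.7, "neutral": 0.3},
        {"joy": 0.5, "surprise": 0.4, "fear": 0.1},
        {"joy": 0.8, "neutral": 0.2},
        {"joy": 0.6, "surprise": 0.2, "neutral": 0.2},
    ]
    evaluation, activation = to_emotional_space(fuse_classifiers(outputs))
    print(f"evaluation={evaluation:+.2f}, activation={activation:+.2f}")
```

In the paper's actual system, the combination of the five classifiers and the placement of expressions in the evaluation–activation space are informed by human assessment, whereas this sketch simply averages confidences and uses fixed illustrative coordinates.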

Acknowledgments

This work has been partly financed by the Spanish Government through the DGICYT contract TIN2011-24660, by the FEDER ATIC project, and by the SISTRONIC Group of the Aragon Institute of Technology (Ref. T84).

Author information

Correspondence to Isabelle Hupont.

Cite this article

Hupont, I., Baldassarri, S. & Cerezo, E. Facial emotional classification: from a discrete perspective to a continuous emotional space. Pattern Anal Applic 16, 41–54 (2013). https://doi.org/10.1007/s10044-012-0286-6
