FAMOS: a framework for investigating the use of face features to identify spontaneous emotions

  • Theoretical Advances
  • Published in: Pattern Analysis and Applications

Abstract

Emotion-based analysis has attracted considerable interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated approach, since this type of data can be collected easily and is well accepted in the literature as a metric for inferring emotional states. Despite this popularity, several constraints found in real-world scenarios (e.g. lighting, complex backgrounds, facial hair and so on) make it very challenging to obtain affective information from the face automatically and accurately. This work presents a framework for analysing emotional experiences through spontaneous facial expressions. The method consists of a new four-dimensional model, called FAMOS, which describes emotional experiences in terms of appraisal, facial expressions, mood, and subjective experiences, using a semi-automatic facial expression analyser as ground truth for describing the facial actions. In addition, we present an experiment using a new protocol proposed to obtain spontaneous emotional reactions. The results suggest that the initial emotional state described by the participants differed from the state described after exposure to the eliciting stimulus, showing that the stimuli used were capable of inducing the expected emotional states in most individuals. Moreover, our results indicate that spontaneous facial reactions to emotions differ considerably from prototypic expressions, especially in terms of expressiveness.
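As a rough illustration of the four-dimensional description, the sketch below encodes a single FAMOS-style emotional-experience record. The field names, value ranges, and example values are assumptions for illustration only; the full article defines the model's exact encoding.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical sketch of a FAMOS-style record: the abstract describes emotional
# experiences along four dimensions (appraisal, facial expressions, mood,
# subjective experience). Concrete field names and ranges are assumptions.
@dataclass
class FamosRecord:
    appraisal: Dict[str, float]               # e.g. {"novelty": 0.7, "pleasantness": 0.2}
    facial_actions: Dict[str, float]          # FACS action units -> intensity in [0, 1]
    mood: str                                 # self-reported mood before/after the stimulus
    subjective_experience: Dict[str, float]   # e.g. self-rated valence/arousal in [-1, 1]

record = FamosRecord(
    appraisal={"novelty": 0.7, "pleasantness": 0.2},
    facial_actions={"AU4": 0.8, "AU7": 0.3},  # brow lowerer, lid tightener
    mood="neutral",
    subjective_experience={"valence": -0.4, "arousal": 0.6},
)
print(record.facial_actions)
```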


[Figures 1–12 appear in the full article. Source credited near Fig. 3: http://biometrics.idealtest.org/. Access date: 04 April 2015.]

Notes

  1. Source: https://support.office.com/en-US/Article/Add-a-trend-or-moving-average-line-to-a-chart-3c4323b1-e377-43b9-b54b-fae160d97965. Access date: 18 March 2016. (A minimal sketch of the moving-average computation follows these notes.)

  2. The terms used to describe the strength of the correlation were obtained from Gerstman [28]. (A sketch of such a labelling scheme also follows these notes.)
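
Note 1 above refers to moving-average trendlines. For context, here is a minimal sketch of a simple moving average; the window size and example series are illustrative assumptions, not values from the paper.

```python
def moving_average(values, window=3):
    """Simple moving average: mean of each sliding window of `window` points."""
    if not 1 <= window <= len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# e.g. smoothing a noisy per-frame expression-intensity series (made-up data)
print(moving_average([1, 2, 4, 3, 5, 7], window=3))  # -> [2.33..., 3.0, 4.0, 5.0]
```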
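Note 2 concerns qualitative labels for correlation strength. The sketch below pairs Pearson's r with such labels; the cutoffs follow a common convention and are an assumption here, not necessarily Gerstman's exact values.

```python
from statistics import correlation  # available in Python 3.10+

def strength_label(r):
    """Map |r| to a qualitative label (cutoffs assumed for illustration)."""
    a = abs(r)
    if a < 0.3:
        return "weak"
    if a < 0.7:
        return "moderate"
    return "strong"

x = [1, 2, 3, 4, 5]   # made-up paired ratings
y = [2, 4, 5, 4, 6]
r = correlation(x, y)
print(f"r = {r:.2f} ({strength_label(r)})")  # r = 0.85 (strong)
```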

References

  1. Adolphs R (2002) Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev 1(1):21–62. https://doi.org/10.1177/1534582302001001003

  2. Bartlett M, Littlewort G, Frank M, Lainscsek C, Fasel I, Movellan J (2005) Recognizing facial expression: machine learning and application to spontaneous behavior. In: IEEE computer society conference on computer vision and pattern recognition, CVPR 2005, vol 2, pp 568–573. https://doi.org/10.1109/CVPR.2005.297

  3. Bazzo J, Lamar M (2004) Recognizing facial actions using Gabor wavelets with neutral face average difference. In: Proceedings of IEEE international conference on automatic face and gesture recognition, pp 505–510. https://doi.org/10.1109/AFGR.2004.1301583

  4. Black MJ, Yacoob Y (1997) Recognizing facial expressions in image sequences using local parameterized models of image motion. Int J Comput Vis 25(1):23–48. https://doi.org/10.1023/A:1007977618277

  5. Bradski G, Kaehler A (2008) Learning OpenCV: computer vision with the OpenCV library, 2nd edn. O’Reilly Media, Inc., Sebastopol

  6. Capurso A, Foundation MR (1952) Music and your emotions: a practical guide to music selections associated with desired emotional responses. Liveright Publishing Corporation, New York

  7. Chanel G, Kronegg J, Grandjean D, Pun T (2006) Emotion assessment: arousal evaluation using EEG’s and peripheral physiological signals. In: Gunsel B, Jain A, Tekalp A, Sankur B (eds) Multimedia content representation, classification and security, vol 4105. Lecture notes in computer science. Springer, Berlin, pp 530–537. https://doi.org/10.1007/11848035_70

  8. Chuang C, Shih F (2006) Recognizing facial action units using independent component analysis and support vector machine. Pattern Recognit 39(9):1795–1798. https://doi.org/10.1016/j.patcog.2006.03.017

  9. Cohen I, Sebe N, Gozman F, Cirelo MC, Huang TS (2003) Learning Bayesian network classifiers for facial expression recognition both labeled and unlabeled data. In: Proceedings of the IEEE international conference on computer vision and pattern recognition, vol 1, pp I–595. https://doi.org/10.1109/CVPR.2003.1211408

  10. Cohn JF, Schmidt KL (2004) The timing of facial motion in posed and spontaneous smiles. J Wavelets Multi-resolution Inf Process 2:1–12

  11. Colibazzi T, Posner J, Wang Z, Gorman D, Gerber A, Yu S, Zhu H, Kangarlu A, Duan Y, Russell J, Peterson B (2010) Neural systems subserving valence and arousal during the experience of induced emotions. Emotion 10(3):377–389. https://doi.org/10.1037/a0018484

  12. Dael N, Mortillaro M, Scherer KR (2012) Emotion expression in body action and posture. Emotion 12(5):1085–1101. https://doi.org/10.1037/a0025737

  13. Daros AR, Zakzanis KK, Ruocco AC (2013) Facial emotion recognition in borderline personality disorder. Psychol Med 43:1953–1963. https://doi.org/10.1017/S0033291712002607

  14. Darwin C (1998) The expression of the emotions in man and animals, 3rd edn. Oxford University Press, Oxford

  15. De la Torre F, Cohn J (2011) Facial expression analysis. In: Moeslund TB, Hilton A, Krüger V, Sigal L (eds) Visual analysis of humans, pp 377–409. https://doi.org/10.1007/978-0-85729-997-0_19

  16. Deruelle C, Rondan C, Gepner B, Tardif C (2004) Spatial frequency and face processing in children with autism and Asperger syndrome. J Autism Dev Disord 34(2):199–210. https://doi.org/10.1023/B:JADD.0000022610.09668.4c

  17. Donges US, Kersting A, Suslow T (2012) Women’s greater ability to perceive happy facial emotion automatically: gender differences in affective priming. PloS One 7(7):e41745. https://doi.org/10.1371/journal.pone.0041745

  18. Dornaika F, Moujahid A, Raducanu B (2013) Facial expression recognition using tracked facial actions: classifier performance analysis. Eng Appl Artif Intell 26(1):467–477. https://doi.org/10.1016/j.engappai.2012.09.002

  19. Douglas-Cowie E, Campbell N, Cowie R, Roach P (2003) Emotional speech: towards a new generation of databases. Speech Commun 40(1):33–60

  20. Duthoit CJ, Sztynda T, Lal SKL, Jap BT, Agbinya JI (2008) Optical flow image analysis of facial expressions of human emotion: Forensic applications. In: Proceedings of the 1st international conference on forensic applications and techniques in telecommunications, information, and multimedia and workshop, ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), e-Forensics ’08, pp 5:1–5:6

  21. Eerola T, Vuoskoski J (2010) A comparison of the discrete and dimensional models of emotion in music. Psychol Music 39(1):18–49. https://doi.org/10.1177/0305735610362821

  22. Ekman P, Friesen W (1976) Measuring facial movement. Environ Psychol Nonverbal Behav 1(1):56–75. https://doi.org/10.1007/BF01115465

  23. Ekman P, Friesen WV, O’Sullivan M, Chan A, Diacoyanni-Tarlatzis I, Heider K, Krause R, LeCompte WA, Pitcairn T, Ricci-Bitti PE et al (1987) Universals and cultural differences in the judgments of facial expressions of emotion. J Personal Soc Psychol 53(4):712–717. https://doi.org/10.1037/0022-3514.53.4.712

  24. Ekman P, Friesen W, Hager J (2002) Facial action coding system (FACS): manual. A Human Face

  25. Ellsworth PC, Scherer KR (2003) Appraisal processes in emotion. Handb Affect Sci 572:572–595

  26. Fontaine J, Scherer K, Roesch E, Ellsworth P (2007) The world of emotions is not two-dimensional. Psychological Sci 18(12):1050–1057. https://doi.org/10.1111/j.1467-9280.2007.02024.x

  27. Gašpar T, Labor M, Jurić I, Dumančić D, Ilakovac V, Heffer M (2011) Comparison of emotion recognition from facial expression and music. Coll Antropol 35(1):163–167

  28. Gerstman BB (2003) Statprimer. http://www.sjsu.edu/faculty/gerstman/StatPrimer/. Accessed 08 Nov 2014

  29. Girard JM, Cohn JF, Mahoor MH, Mavadati S, Rosenwald DP (2013) Social risk and depression: evidence from manual and automatic facial expression analysis. In: Proceedings of IEEE international conference on automatic face and gesture recognition, pp 1–8. https://doi.org/10.1109/FG.2013.6553748

  30. Gonzalez R, Woods R (2008) Digital image processing, 3rd edn. Pearson/Prentice Hall, Upper Saddle River

  31. Gunes H, Schuller B (2013) Categorical and dimensional affect analysis in continuous input: current trends and future directions. Image Vis Comput 31(2):120–136. https://doi.org/10.1016/j.imavis.2012.06.016 (Affect Analysis in Continuous Input)

  32. Hamm J, Kohler C, Gur R, Verma R (2011) Automated facial action coding system for dynamic analysis of facial expressions in neuropsychiatric disorders. J Neurosci Methods 200(2):237–256. https://doi.org/10.1016/j.jneumeth.2011.06.023

  33. Hess U, Philippot P, Blairy S (1998) Facial reactions to emotional facial expressions: Affect or cognition? Cogn Emotion 12(4):509–531. https://doi.org/10.1080/026999398379547

  34. Hoffmann H, Kessler H, Eppel T, Rukavina S, Traue HC (2010) Expression intensity, gender and facial emotion recognition: women recognize only subtle facial emotions better than men. Acta Psychol 135(3):278–283. https://doi.org/10.1016/j.actpsy.2010.07.012

  35. Hu X (2010) Music and mood: where theory and reality meet. In: Proceedings of iConference

  36. Huang CLC, Hsiao S, Hwu HG, Howng SL (2012) The Chinese facial emotion recognition database (CFERD): a computer-generated 3-D paradigm to measure the recognition of facial emotional expressions at different intensities. Psychiatry Res 200(2–3):928–932. https://doi.org/10.1016/j.psychres.2012.03.038

  37. Jack RE, Garrod OG, Schyns PG (2014) Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Current Biol 24(2):187–192. https://doi.org/10.1016/j.cub.2013.11.064

  38. Jiang L, Qing Z, Wenyuan W (2000) A novel approach to analyze the result of polygraph. Proc IEEE Int Conf Syst Man Cybern 4:2884–2886. https://doi.org/10.1109/ICSMC.2000.884436

  39. Jongh E (2002) Fed: an online facial expression dictionary as a first step in the creation of a complete nonverbal dictionary. Master’s thesis, Delft University of Technology

  40. Juslin P (2013) From everyday emotions to aesthetic emotions: towards a unified theory of musical emotions. Phys Life Rev 10(3):235–266. https://doi.org/10.1016/j.plrev.2013.05.008

  41. Kim K, Bang S, Kim S (2004) Emotion recognition system using short-term monitoring of physiological signals. Med Biol Eng Comput 42(3):419–427. https://doi.org/10.1007/BF02344719

  42. Kobayashi H, Hara F (1991) The recognition of basic facial expressions by neural network. Proc IEEE Int Jt Conf Neural Netw 1:460–466. https://doi.org/10.1109/IJCNN.1991.170444

  43. Korsakova-Kreyn M, Dowling WJ (2012) Emotion in music: affective responses to motion in tonal space. In: Proceedings of the 12th international conference on music perception and cognition and the 8th triennial conference of the European society for the cognitive sciences of music, pp 23–28

  44. Krumhansl C (1997) An exploratory study of musical emotions and psychophysiology. Can J Exp Psychol/Rev Can Psychol Exp 51(4):336–353. https://doi.org/10.1037/1196-1961.51.4.336

  45. Laurier C, Grivolla J, Herrera P (2008) Multimodal music mood classification using audio and lyrics. In: Proceedings of the international conference on machine learning and applications, San Diego, California, USA, pp 688–693. https://doi.org/10.1109/ICMLA.2008.96

  46. Le Groux S, Valjamae A, Manzolli J, Verschure PF (2008) Implicit physiological interaction for the generation of affective musical sounds. In: Proceedings of the international computer music conference. Pompeu Fabra University, Barcelona, Spain

  47. Lucey S, Ashraf AB, Cohn J (2007) Investigating spontaneous facial action recognition through aam representations of the face. In: Kurihara K (ed) Face recognition book, pp 275–286

  48. Mehrabian A (1968) Communication without words, vol 2. Psychology Today, New York

  49. Morris JD, Klahr NJ, Shen F, Villegas J, Wright P, He G, Liu Y (2009) Mapping a multidimensional emotion in response to television commercials. Hum Brain Mapp 30(3):789–796. https://doi.org/10.1002/hbm.20544

  50. Nakanishi T, Kitagawa T (2006) Visualization of music impression in facial expression to represent emotion. Proc Asia Pac Conf Concept Model 53:55–64

  51. Nauert R (2009) Women recognize emotions better. Psych Central http://psychcentral.com/news/2009/10/22/women-recognize-emotions-better/9100.html. Accessed 11 Oct 2014

  52. O'Toole AJ, Harms J, Snow SL, Hurst DR, Pappas MR, Ayyad JH, Abdi H (2005) A video database of moving faces and people. IEEE Trans Pattern Anal Mach Intell 27(5):812–816. https://doi.org/10.1109/TPAMI.2005.90

  53. Pantic M, Valstar M, Rademaker R, Maat L (2005) Web-based database for facial expression analysis. In: IEEE international conference on multimedia and Expo, ICME, pp 317–321. https://doi.org/10.1109/ICME.2005.1521424

  54. Pease A, Pease B (2008) The definitive book of body language. Random House LLC, New York

  55. Picard R, Vyzas E, Healey J (2001) Toward machine emotional intelligence: analysis of affective physiological state. IEEE Trans Pattern Anal Mach Intell 23(10):1175–1191. https://doi.org/10.1109/34.954607

  56. Posner J, Russell J, Peterson B (2005) The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Dev Psychopathol 17(3):715–734. https://doi.org/10.1017/S0954579405050340

  57. Quan W, Matuszewski B, Shark L, Frowd C (2011) Methodology and performance analysis of 3-D facial expression recognition using statistical shape representation. Int J Grid Distrib Comput 4(3):79–88

  58. Robin M, Pham-Scottez A, Curt F, Dugre-Le Bigre C, Speranza M, Sapinho D, Corcos M, Berthoz S, Kedia G (2012) Decreased sensitivity to facial emotions in adolescents with borderline personality disorder. Psychiatry Res 200(2):417–421. https://doi.org/10.1016/j.psychres.2012.03.032

  59. Sariyanidi E, Gunes H, Cavallaro A (2014) Automatic analysis of facial affect: a survey of registration, representation and recognition. IEEE Trans Pattern Anal Mach Intell 99(PrePrints):1. https://doi.org/10.1109/TPAMI.2014.2366127

  60. Savran A, Sankur B, Taha Bilge M (2012) Regression-based intensity estimation of facial action units. Image Vis Comput 30(10):774–784. https://doi.org/10.1016/j.imavis.2011.11.008

  61. Scherer KR (2005) What are emotions? And how can they be measured? Soc Sci Inf 44(4):695–729

  62. Scherer KR (2009) The dynamic architecture of emotion: evidence for the component process model. Cogn Emotion 23(7):1309–1316. https://doi.org/10.1080/02699930902928969

  63. Schimmack U, Grob A (2000) Dimensional models of core affect: a quantitative comparison by means of structural equation modeling. Eur J Person 14(4):325–345. https://doi.org/10.1002/1099-0984(200007/08)14:4%3c325::AID-PER380%3e3.0.CO;2-I

  64. Schubert E (1996) Continuous response to music using a two dimensional emotion space. In: Proceedings of the international conference of music perception and cognition, pp 263–268

  65. Sloboda J, Juslin P (2001) Psychological perspectives on music and emotion. In: Music and emotion: theory and research, pp 71–104

  66. Smeaton AF, Rothwell S (2009) Biometric responses to music-rich segments in films: the cdvplex. In: International workshop on content-based multimedia indexing, pp 162–168. https://doi.org/10.1109/CBMI.2009.21

  67. Sun S, Ge C (2014) A new method of 3D facial expression animation. J Appl Math 2014:1–6. https://doi.org/10.1155/2014/706159

  68. Tian Y, Kanade T, Cohn J (2001) Recognizing action units for facial expression analysis. IEEE Trans Pattern Anal Mach Intell 23(2):97–115. https://doi.org/10.1109/34.908962

  69. Tian YL, Kanade T, Cohn JF (2005) Facial expression analysis. In: Handbook of face recognition, chap 11. Springer, pp 247–275

  70. Trkulja M, Janković D (2012) Towards three-dimensional model of affective experience of music. Emotion 17:25–40

  71. Tyler P (1996) Developing a two-dimensional continuous response space for emotions perceived in music. Ph.D. thesis, Florida State University

  72. Valstar M, Pantic M (2012) Fully automatic recognition of the temporal phases of facial actions. IEEE Trans Syst Man Cybern 42(1):28–43. https://doi.org/10.1109/TSMCB.2011.2163710

  73. Valstar MF, Pantic M, Ambadar Z, Cohn JF (2006) Spontaneous vs. posed facial behavior: automatic analysis of brow actions. In: Proceedings of ACM Int’l conference on multimodal interfaces, pp 162–170

  74. Veloso L, Carvalho J, Cavalvanti C, Moura E, Coutinho F, Gomes H (2007) Neural network classification of photogenic facial expressions based on fiducial points and Gabor features. In: Mery D, Rueda L (eds) Advances in image and video technology, lecture notes in computer science, vol 4872, pp 166–179. https://doi.org/10.1007/978-3-540-77129-6_18

  75. Viera AJ, Garrett JM et al (2005) Understanding interobserver agreement: the kappa statistic. Fam Med 37(5):360–363

  76. Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 1:511–518. https://doi.org/10.1109/CVPR.2001.990517

  77. Vukadinovic D, Pantic M (2005) Fully automatic facial feature point detection using Gabor feature based boosted classifiers. IEEE Trans Syst Man Cybern 2:1692–1698. https://doi.org/10.1109/ICSMC.2005.1571392

  78. Vuoskoski JK, Eerola T (2011) Measuring music-induced emotion: a comparison of emotion models, personality biases, and intensity of experiences. Music Sci 15(2):159–173. https://doi.org/10.1177/1029864911403367

  79. Wallbott HG, Scherer KR (1989) Assessing emotion by questionnaire. Emotion Theory Res Exp 4:55–82

  80. Wimmer M, MacDonald B, Jayamuni D, Yadav A (2008) Facial expression recognition for human-robot interaction—a prototype. In: Sommer G, Klette R (eds) Robot vision, Lecture notes in computer science, vol 4931, pp 139–152. https://doi.org/10.1007/978-3-540-78157-8_11

  81. Yang P, Liu Q, Metaxas DN (2007) Boosting coded dynamic features for facial action units and facial expression recognition. In: Proceedings of the IEEE computer society conference on computer vision and pattern recognition, pp 1–6. https://doi.org/10.1109/CVPR.2007.383059

  82. Zeng Z, Fu Y, Roisman GI, Wen Z, Hu Y, Huang TS (2006) Spontaneous emotional facial expression detection. J Multimed 1(5):1–8

  83. Zeng Z, Pantic M, Roisman G, Huang T (2009) A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell 31(1):39–58. https://doi.org/10.1109/TPAMI.2008.52


Author information

Corresponding author

Correspondence to Márjoy Da Costa-Abreu.

About this article

Cite this article

Costa-Abreu, M.D., Bezerra, G.S. FAMOS: a framework for investigating the use of face features to identify spontaneous emotions. Pattern Anal Applic 22, 683–701 (2019). https://doi.org/10.1007/s10044-017-0675-y
