Abstract
Emotion-based analysis has attracted considerable interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated approach, since this type of data can be easily collected and is well accepted in the literature as a metric for inferring emotional states. Despite this popularity, owing to several constraints found in real-world scenarios (e.g. lighting, complex backgrounds, facial hair, and so on), accurately obtaining affective information from the face automatically remains a very challenging task. This work presents a framework for analysing emotional experiences through spontaneous facial expressions. The method is based on a new four-dimensional model, called FAMOS, which describes emotional experiences in terms of appraisal, facial expressions, mood, and subjective experiences, using a semi-automatic facial expression analyser as ground truth for describing the facial actions. In addition, we present an experiment that uses a new protocol proposed to elicit spontaneous emotional reactions. The results suggest that the initial emotional state described by the participants differed from the state they described after exposure to the eliciting stimulus, showing that the stimuli used were capable of inducing the expected emotional states in most individuals. Moreover, our results indicate that spontaneous facial reactions to emotions differ considerably from prototypic expressions, especially in terms of expressiveness.
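As a purely illustrative sketch, the snippet below shows one way the four FAMOS dimensions named above could be held together as a data record; the identifiers, types, and example values are assumptions drawn from this abstract, not the authors' actual schema.

```python
from dataclasses import dataclass

# Hypothetical per-participant FAMOS record: the four fields mirror the four
# dimensions named in the abstract (appraisal, facial expressions, mood,
# subjective experience); names and types are assumptions, not the paper's.
@dataclass
class FamosRecord:
    appraisal: str               # participant's appraisal of the eliciting stimulus
    facial_action_units: list    # FACS action units from the semi-automatic analyser
    mood: str                    # self-reported mood (e.g. before/after the stimulus)
    subjective_experience: str   # participant's description of the felt emotion

record = FamosRecord(
    appraisal="pleasant",
    facial_action_units=["AU6", "AU12"],  # cheek raiser + lip-corner puller
    mood="calm",
    subjective_experience="mild amusement",
)
print(record)
```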



Notes
Source: http://biometrics.idealtest.org/. Access date: 04 April 2015.
Source: https://support.office.com/en-US/Article/Add-a-trend-or-moving-average-line-to-a-chart-3c4323b1-e377-43b9-b54b-fae160d97965. Access date: 18 March 2016.
The terms used to describe the strength of the correlation were obtained from Gerstman [28].
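The two notes above point at simple numerical conventions: a spreadsheet-style moving-average trend line and Gerstman's verbal labels for correlation strength [28]. A minimal sketch of both follows; the window size and the weak/moderate/strong cut-offs are common rule-of-thumb values assumed here, not parameters taken from the paper, so verify them against the sources before reuse.

```python
# Minimal sketches only. The window size and the correlation cut-offs are
# assumptions (typical textbook values), not figures from the paper.

def moving_average(values, window=3):
    """Unweighted moving average, as in a spreadsheet trend line."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

def correlation_strength(r):
    """Verbal label for |r|, in the spirit of Gerstman's StatPrimer [28]."""
    r = abs(r)
    if r < 0.3:
        return "weak"
    if r < 0.7:
        return "moderate"
    return "strong"

print(moving_average([0.1, 0.4, 0.35, 0.8, 0.75, 0.9]))  # smoothed intensities
print(correlation_strength(-0.45))                        # -> "moderate"
```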
References
Adolphs R (2002) Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev 1(1):21–62. https://doi.org/10.1177/1534582302001001003
Bartlett M, Littlewort G, Frank M, Lainscsek C, Fasel I, Movellan J (2005) Recognizing facial expression: machine learning and application to spontaneous behavior. In: IEEE computer society conference on computer vision and pattern recognition, CVPR 2005, vol 2, pp 568–573. https://doi.org/10.1109/CVPR.2005.297
Bazzo J, Lamar M (2004) Recognizing facial actions using Gabor wavelets with neutral face average difference. In: Proceedings of IEEE international conference on automatic face and gesture recognition, pp 505–510. https://doi.org/10.1109/AFGR.2004.1301583
Black MJ, Yacoob Y (1997) Recognizing facial expressions in image sequences using local parameterized models of image motion. Int J Comput Vis 25(1):23–48. https://doi.org/10.1023/A:1007977618277
Bradski G, Kaehler A (2008) Learning OpenCV: computer vision with the OpenCV library, 2nd edn. O’Reilly Media, Inc., Sebastopol
Capurso A, Music Research Foundation (1952) Music and your emotions: a practical guide to music selections associated with desired emotional responses. Liveright Publishing Corporation, New York
Chanel G, Kronegg J, Grandjean D, Pun T (2006) Emotion assessment: arousal evaluation using EEG’s and peripheral physiological signals. In: Gunsel B, Jain A, Tekalp A, Sankur B (eds) Multimedia content representation, classification and security, vol 4105. Lecture notes in computer science. Springer, Berlin, pp 530–537. https://doi.org/10.1007/11848035_70
Chuang C, Shih F (2006) Recognizing facial action units using independent component analysis and support vector machine. Pattern Recognit 39(9):1795–1798. https://doi.org/10.1016/j.patcog.2006.03.017
Cohen I, Sebe N, Gozman F, Cirelo MC, Huang TS (2003) Learning Bayesian network classifiers for facial expression recognition using both labeled and unlabeled data. In: Proceedings of the IEEE international conference on computer vision and pattern recognition, vol 1, pp I–595. https://doi.org/10.1109/CVPR.2003.1211408
Cohn JF, Schmidt KL (2004) The timing of facial motion in posed and spontaneous smiles. Int J Wavelets Multiresolut Inf Process 2:1–12
Colibazzi T, Posner J, Wang Z, Gorman D, Gerber A, Yu S, Zhu H, Kangarlu A, Duan Y, Russell J, Peterson B (2010) Neural systems subserving valence and arousal during the experience of induced emotions. Emotion 10(3):377–389. https://doi.org/10.1037/a0018484
Dael N, Mortillaro M, Scherer KR (2012) Emotion expression in body action and posture. Emotion 12(5):1085–1101. https://doi.org/10.1037/a0025737
Daros AR, Zakzanis KK, Ruocco AC (2013) Facial emotion recognition in borderline personality disorder. Psychol Med 43:1953–1963. https://doi.org/10.1017/S0033291712002607
Darwin C (1998) The expression of the emotions in man and animals, 3rd edn. Oxford University Press, Oxford
De la Torre F, Cohn J (2011) Facial expression analysis. In: Moeslund TB, Hilton A, Krüger V, Sigal L (eds) Visual analysis of humans, pp 377–409. https://doi.org/10.1007/978-0-85729-997-0_19
Deruelle C, Rondan C, Gepner B, Tardif C (2004) Spatial frequency and face processing in children with autism and Asperger syndrome. J Autism Dev Disord 34(2):199–210. https://doi.org/10.1023/B:JADD.0000022610.09668.4c
Donges US, Kersting A, Suslow T (2012) Women’s greater ability to perceive happy facial emotion automatically: gender differences in affective priming. PloS One 7(7):e41745. https://doi.org/10.1371/journal.pone.0041745
Dornaika F, Moujahid A, Raducanu B (2013) Facial expression recognition using tracked facial actions: classifier performance analysis. Eng Appl Artif Intell 26(1):467–477. https://doi.org/10.1016/j.engappai.2012.09.002
Douglas-Cowie E, Campbell N, Cowie R, Roach P (2003) Emotional speech: towards a new generation of databases. Speech Commun 40(1):33–60
Duthoit CJ, Sztynda T, Lal SKL, Jap BT, Agbinya JI (2008) Optical flow image analysis of facial expressions of human emotion: forensic applications. In: Proceedings of the 1st international conference on forensic applications and techniques in telecommunications, information, and multimedia and workshop, ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), e-Forensics ’08, pp 5:1–5:6
Eerola T, Vuoskoski J (2010) A comparison of the discrete and dimensional models of emotion in music. Psychol Music 39(1):18–49. https://doi.org/10.1177/0305735610362821
Ekman P, Friesen W (1976) Measuring facial movement. Environ Psychol Nonverbal Behav 1(1):56–75. https://doi.org/10.1007/BF01115465
Ekman P, Friesen WV, O’Sullivan M, Chan A, Diacoyanni-Tarlatzis I, Heider K, Krause R, LeCompte WA, Pitcairn T, Ricci-Bitti PE et al (1987) Universals and cultural differences in the judgments of facial expressions of emotion. J Personal Soc Psychol 53(4):712–717. https://doi.org/10.1037/0022-3514.53.4.712
Ekman P, Friesen W, Hager J (2002) Facial action coding system (FACS): manual. A Human Face, Salt Lake City
Ellsworth PC, Scherer KR (2003) Appraisal processes in emotion. Handb Affect Sci 572:572–595
Fontaine J, Scherer K, Roesch E, Ellsworth P (2007) The world of emotions is not two-dimensional. Psychol Sci 18(12):1050–1057. https://doi.org/10.1111/j.1467-9280.2007.02024.x
Gašpar T, Labor M, Jurić I, Dumančić D, Ilakovac V, Heffer M (2011) Comparison of emotion recognition from facial expression and music. Coll Antropol 35(1):163–167
Gerstman BB (2003) Statprimer. http://www.sjsu.edu/faculty/gerstman/StatPrimer/. Accessed 08 Nov 2014
Girard JM, Cohn JF, Mahoor MH, Mavadati S, Rosenwald DP (2013) Social risk and depression: evidence from manual and automatic facial expression analysis. In: Proceedings of IEEE international conference on automatic face and gesture recognition, pp 1–8. https://doi.org/10.1109/FG.2013.6553748
Gonzalez R, Woods R (2008) Digital image processing, 3rd edn. Pearson/Prentice Hall, Upper Saddle River
Gunes H, Schuller B (2013) Categorical and dimensional affect analysis in continuous input: current trends and future directions. Image Vis Comput 31(2):120–136. https://doi.org/10.1016/j.imavis.2012.06.016
Hamm J, Kohler C, Gur R, Verma R (2011) Automated facial action coding system for dynamic analysis of facial expressions in neuropsychiatric disorders. J Neurosci Methods 200(2):237–256. https://doi.org/10.1016/j.jneumeth.2011.06.023
Hess U, Philippot P, Blairy S (1998) Facial reactions to emotional facial expressions: affect or cognition? Cogn Emotion 12(4):509–531. https://doi.org/10.1080/026999398379547
Hoffmann H, Kessler H, Eppel T, Rukavina S, Traue HC (2010) Expression intensity, gender and facial emotion recognition: women recognize only subtle facial emotions better than men. Acta Psychol 135(3):278–283. https://doi.org/10.1016/j.actpsy.2010.07.012
Hu X (2010) Music and mood: where theory and reality meet. In: Proceedings of iConference
Huang CLC, Hsiao S, Hwu HG, Howng SL (2012) The Chinese facial emotion recognition database (CFERD): a computer-generated 3-D paradigm to measure the recognition of facial emotional expressions at different intensities. Psychiatry Res 200(2–3):928–932. https://doi.org/10.1016/j.psychres.2012.03.038
Jack RE, Garrod OG, Schyns PG (2014) Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Curr Biol 24(2):187–192. https://doi.org/10.1016/j.cub.2013.11.064
Jiang L, Qing Z, Wenyuan W (2000) A novel approach to analyze the result of polygraph. Proc IEEE Int Conf Syst Man Cybern 4:2884–2886. https://doi.org/10.1109/ICSMC.2000.884436
Jongh E (2002) FED: an online facial expression dictionary as a first step in the creation of a complete nonverbal dictionary. Master’s thesis, Delft University of Technology
Juslin P (2013) From everyday emotions to aesthetic emotions: towards a unified theory of musical emotions. Phys Life Rev 10(3):235–266. https://doi.org/10.1016/j.plrev.2013.05.008
Kim K, Bang S, Kim S (2004) Emotion recognition system using short-term monitoring of physiological signals. Med Biol Eng Comput 42(3):419–427. https://doi.org/10.1007/BF02344719
Kobayashi H, Hara F (1991) The recognition of basic facial expressions by neural network. Proc IEEE Int Jt Conf Neural Netw 1:460–466. https://doi.org/10.1109/IJCNN.1991.170444
Korsakova-Kreyn M, Dowling WJ (2012) Emotion in music: affective responses to motion in tonal space. In: Proceedings of the 12th international conference on music perception and cognition and the 8th triennial conference of the European society for the cognitive sciences of music, pp 23–28
Krumhansl C (1997) An exploratory study of musical emotions and psychophysiology. Can J Exp Psychol/Rev Can Psychol Exp 51(4):336–353. https://doi.org/10.1037/1196-1961.51.4.336
Laurier C, Grivolla J, Herrera P (2008) Multimodal music mood classification using audio and lyrics. In: Proceedings of the international conference on machine learning and applications, San Diego, California, USA, pp 688–693. https://doi.org/10.1109/ICMLA.2008.96
Le Groux S, Valjamae A, Manzolli J, Verschure PF (2008) Implicit physiological interaction for the generation of affective musical sounds. In: Proceedings of the international computer music conference. Pompeu Fabra University, SemanticScholar, Barcelona, Spain
Lucey S, Ashraf AB, Cohn J (2007) Investigating spontaneous facial action recognition through AAM representations of the face. In: Kurihara K (ed) Face recognition book, pp 275–286
Mehrabian A (1968) Communication without words. Psychol Today 2(4):53–56
Morris JD, Klahr NJ, Shen F, Villegas J, Wright P, He G, Liu Y (2009) Mapping a multidimensional emotion in response to television commercials. Hum Brain Mapp 30(3):789–796. https://doi.org/10.1002/hbm.20544
Nakanishi T, Kitagawa T (2006) Visualization of music impression in facial expression to represent emotion. Proc Asia Pac Conf Concept Model 53:55–64
Nauert R (2009) Women recognize emotions better. Psych Central http://psychcentral.com/news/2009/10/22/women-recognize-emotions-better/9100.html. Accessed 11 Oct 2014
O’Toole AJ, Harms J, Snow SL, Hurst DR, Pappas MR, Ayyad JH, Abdi H (2005) A video database of moving faces and people. IEEE Trans Pattern Anal Mach Intell 27(5):812–816. https://doi.org/10.1109/TPAMI.2005.90
Pantic M, Valstar M, Rademaker R, Maat L (2005) Web-based database for facial expression analysis. In: IEEE international conference on multimedia and Expo, ICME, pp 317–321. https://doi.org/10.1109/ICME.2005.1521424
Pease A, Pease B (2008) The definitive book of body language. Random House LLC, New York
Picard R, Vyzas E, Healey J (2001) Toward machine emotional intelligence: analysis of affective physiological state. IEEE Trans Pattern Anal Mach Intell 23(10):1175–1191. https://doi.org/10.1109/34.954607
Posner J, Russell J, Peterson B (2005) The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Dev Psychopathol 17(3):715–734. https://doi.org/10.1017/S0954579405050340
Quan W, Matuszewski B, Shark L, Frowd C (2011) Methodology and performance analysis of 3-D facial expression recognition using statistical shape representation. Int J Grid Distrib Comput 4(3):79–88
Robin M, Pham-Scottez A, Curt F, Dugre-Le Bigre C, Speranza M, Sapinho D, Corcos M, Berthoz S, Kedia G (2012) Decreased sensitivity to facial emotions in adolescents with borderline personality disorder. Psychiatry Res 200(2):417–421. https://doi.org/10.1016/j.psychres.2012.03.032
Sariyanidi E, Gunes H, Cavallaro A (2014) Automatic analysis of facial affect: a survey of registration, representation and recognition. IEEE Trans Pattern Anal Mach Intell (preprint). https://doi.org/10.1109/TPAMI.2014.2366127
Savran A, Sankur B, Taha Bilge M (2012) Regression-based intensity estimation of facial action units. Image Vis Comput 30(10):774–784. https://doi.org/10.1016/j.imavis.2011.11.008
Scherer KR (2005) What are emotions? And how can they be measured? Soc Sci Inf 44(4):695–729
Scherer KR (2009) The dynamic architecture of emotion: evidence for the component process model. Cogn Emotion 23(7):1309–1316. https://doi.org/10.1080/02699930902928969
Schimmack U, Grob A (2000) Dimensional models of core affect: a quantitative comparison by means of structural equation modeling. Eur J Person 14(4):325–345. https://doi.org/10.1002/1099-0984(200007/08)14:4%3c325::AID-PER380%3e3.0.CO;2-I
Schubert E (1996) Continuous response to music using a two dimensional emotion space. In: Proceedings of the international conference of music perception and cognition, pp 263–268
Sloboda J, Juslin P (2001) Psychological perspectives on music and emotion. In: Music and emotion: theory and research, pp 71–104
Smeaton AF, Rothwell S (2009) Biometric responses to music-rich segments in films: the CDVPlex. In: International workshop on content-based multimedia indexing, pp 162–168. https://doi.org/10.1109/CBMI.2009.21
Sun S, Ge C (2014) A new method of 3D facial expression animation. J Appl Math 2014:1–6. https://doi.org/10.1155/2014/706159
Tian Y, Kanade T, Cohn J (2001) Recognizing action units for facial expression analysis. IEEE Trans Pattern Anal Mach Intell 23(2):97–115. https://doi.org/10.1109/34.908962
Tian YL, Kanade T, Cohn JF (2005) Facial expression analysis. In: Handbook of face recognition, chap 11. Springer, pp 247–275
Trkulja M, Janković D (2012) Towards three-dimensional model of affective experience of music. Emotion 17:25–40
Tyler P (1996) Developing a two-dimensional continuous response space for emotions perceived in music. Ph.D. thesis, Florida State University
Valstar M, Pantic M (2012) Fully automatic recognition of the temporal phases of facial actions. IEEE Trans Syst Man Cybern Part B 42(1):28–43. https://doi.org/10.1109/TSMCB.2011.2163710
Valstar MF, Pantic M, Ambadar Z, Cohn JF (2006) Spontaneous vs. posed facial behavior: automatic analysis of brow actions. In: Proceedings of ACM Int’l conference on multimodal interfaces, pp 162–170
Veloso L, Carvalho J, Cavalcanti C, Moura E, Coutinho F, Gomes H (2007) Neural network classification of photogenic facial expressions based on fiducial points and Gabor features. In: Mery D, Rueda L (eds) Advances in image and video technology, lecture notes in computer science, vol 4872, pp 166–179. https://doi.org/10.1007/978-3-540-77129-6_18
Viera AJ, Garrett JM (2005) Understanding interobserver agreement: the kappa statistic. Fam Med 37(5):360–363
Viola P, Jones M (2001) Rapid object detection using a boosted cascade of simple features. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 1:511–518. https://doi.org/10.1109/CVPR.2001.990517
Vukadinovic D, Pantic M (2005) Fully automatic facial feature point detection using Gabor feature based boosted classifiers. In: Proceedings of the IEEE international conference on systems, man and cybernetics, vol 2, pp 1692–1698. https://doi.org/10.1109/ICSMC.2005.1571392
Vuoskoski JK, Eerola T (2011) Measuring music-induced emotion: a comparison of emotion models, personality biases, and intensity of experiences. Music Sci 15(2):159–173. https://doi.org/10.1177/1029864911403367
Wallbott HG, Scherer KR (1989) Assessing emotion by questionnaire. Emotion Theory Res Exp 4:55–82
Wimmer M, MacDonald B, Jayamuni D, Yadav A (2008) Facial expression recognition for human-robot interaction—a prototype. In: Sommer G, Klette R (eds) Robot vision, Lecture notes in computer science, vol 4931, pp 139–152. https://doi.org/10.1007/978-3-540-78157-8_11
Yang P, Liu Q, Metaxas DN (2007) Boosting coded dynamic features for facial action units and facial expression recognition. In: Proceedings of the IEEE computer society conference on computer vision and pattern recognition, pp 1–6. https://doi.org/10.1109/CVPR.2007.383059
Zeng Z, Fu Y, Roisman GI, Wen Z, Hu Y, Huang TS (2006) Spontaneous emotional facial expression detection. J Multimed 1(5):1–8
Zeng Z, Pantic M, Roisman G, Huang T (2009) A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans Pattern Anal Mach Intell 31(1):39–58. https://doi.org/10.1109/TPAMI.2008.52
About this article
Cite this article
Costa-Abreu, M.D., Bezerra, G.S. FAMOS: a framework for investigating the use of face features to identify spontaneous emotions. Pattern Anal Applic 22, 683–701 (2019). https://doi.org/10.1007/s10044-017-0675-y