Fuzzy triangulation signature for detection of change in human emotion from face video image sequence

Published in Multimedia Tools and Applications

Abstract

This article proposes a geometry-based fuzzy relational technique for capturing the gradual change of human emotion over time from face image sequences. The features are fuzzy memberships derived from five triangle signatures: (i) Fuzzy Isosceles Triangle Signature (FIS), (ii) Fuzzy Right Triangle Signature (FRS), (iii) Fuzzy Right Isosceles Triangle Signature (FIRS), (iv) Fuzzy Equilateral Triangle Signature (FES), and (v) Other Fuzzy Triangles Signature (OFS). These features are used to classify the facial transition from neutrality to one of six expressions: anger (AN), disgust (DI), fear (FE), happiness (HA), sadness (SA), and surprise (SU). A Multilayer Perceptron (MLP) classifier is tested and validated through 10-fold cross-validation on three benchmark image sequence datasets: Extended Cohn-Kanade (CK+), M&M Initiative (MMI), and Multimedia Understanding Group (MUG). The proposed technique achieves accuracies of 98.47%, 93.56%, and 99.25% on CK+, MMI, and MUG respectively, outperforming other state-of-the-art methods.
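
The abstract does not spell out how the triangle signatures are computed from facial landmarks, so the Python sketch below is only a minimal illustration under stated assumptions: the function names (triangle_angles, fuzzy_triangle_signatures), the tolerance parameter tol, and the triangular membership functions are hypothetical stand-ins, not the authors' definitions. It scores a single landmark triangle against the five signature classes from its interior angles.

import numpy as np

def triangle_angles(p1, p2, p3):
    """Interior angles (in degrees) of the triangle formed by three 2D landmark
    points. Assumes the points are not collinear."""
    a = np.linalg.norm(p2 - p3)  # side opposite p1
    b = np.linalg.norm(p1 - p3)  # side opposite p2
    c = np.linalg.norm(p1 - p2)  # side opposite p3
    # Law of cosines, clipped for numerical safety.
    A = np.degrees(np.arccos(np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1.0, 1.0)))
    B = np.degrees(np.arccos(np.clip((a**2 + c**2 - b**2) / (2 * a * c), -1.0, 1.0)))
    return A, B, 180.0 - A - B

def fuzzy_triangle_signatures(p1, p2, p3, tol=15.0):
    """Illustrative fuzzy memberships of one landmark triangle in the five
    signature classes (FIS, FRS, FIRS, FES, OFS). The triangular membership
    functions and the tolerance tol (degrees) are assumptions, not the paper's
    definitions."""
    A, B, C = triangle_angles(p1, p2, p3)
    mu = lambda gap: max(0.0, 1.0 - gap / tol)  # 1 at gap 0, falls to 0 at tol
    pair_gap = min(abs(A - B), abs(B - C), abs(A - C))      # two equal angles -> isosceles
    right_gap = min(abs(A - 90), abs(B - 90), abs(C - 90))  # one angle near 90 -> right
    equi_gap = max(abs(A - 60), abs(B - 60), abs(C - 60))   # all angles near 60 -> equilateral
    fis = mu(pair_gap)                     # Fuzzy Isosceles Triangle Signature
    frs = mu(right_gap)                    # Fuzzy Right Triangle Signature
    firs = min(fis, frs)                   # Fuzzy Right Isosceles (min as a simple t-norm)
    fes = mu(equi_gap)                     # Fuzzy Equilateral Triangle Signature
    ofs = 1.0 - max(fis, frs, firs, fes)   # Other Fuzzy Triangles Signature
    return np.array([fis, frs, firs, fes, ofs])

# Example: one triangle over hypothetical eye-corner and nose-tip landmarks.
print(fuzzy_triangle_signatures(np.array([36.0, 40.0]),
                                np.array([62.0, 41.0]),
                                np.array([49.0, 60.0])))

Per-triangle membership vectors gathered across the frames of a sequence could then be concatenated into a feature vector for the MLP; for instance, scikit-learn's MLPClassifier evaluated with cross_val_score(clf, X, y, cv=10) would match the 10-fold cross-validation protocol mentioned above, although this pipeline is likewise an assumption based on the abstract rather than the paper's exact implementation.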

Acknowledgment

The authors are grateful to Prof. Maja Pantic and Dr. A. Delopoulos for granting access to the MMI and MUG databases. The authors also thank the Department of Science and Technology, Ministry of Science and Technology, Government of India for supporting this research through a DST-INSPIRE Fellowship (INSPIRE Reg. No. If160285, Ref. No. DST/INSPIRE Fellowship/[If160285]), and the Department of Computer & System Sciences, Visva-Bharati University for providing infrastructure support.

Author information

Corresponding author

Correspondence to Md Nasir.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Nasir, M., Dutta, P. & Nandi, A. Fuzzy triangulation signature for detection of change in human emotion from face video image sequence. Multimed Tools Appl 80, 31993–32022 (2021). https://doi.org/10.1007/s11042-021-11196-1
