Model of the Facial Emotions Expressions Based on Grouping Classes of Feature Vectors

  • Conference paper
  • Lecture Notes in Computational Intelligence and Decision Making (ISDMCI 2020)

Abstract

The characteristic forms of facial expressions of human emotional states generalize well across individuals because of the common physiological structure and arrangement of the muscles that form the human face. This commonality is one of the main reasons why emotions are displayed on the face in similar ways. From the nature and form of facial expressions it is therefore possible to determine a person's emotional state with high probability, subject to some correction for the cultural characteristics and traditions of particular groups. Building on the existence of these common mimic forms of emotional display, an approach is proposed for a model that recognizes emotional displays on a human face with relatively low requirements for photo and video capture. The model is built on a hyperplane classification of the mimic displays of the major emotional states. One of the main advantages of the proposed approach is its low computational complexity, which makes it possible to recognize changes in a person's emotional state from mimic displays without specialized equipment. In addition, the resulting model achieves adequate recognition accuracy with low requirements for image quality, which considerably extends the scope of practical application. Examples include monitoring the driver of a vehicle, the operator of a complex production process, and other automatic visual surveillance systems. The set of identified emotional states is formed according to the assigned tasks, which makes it possible to focus the recognition of facial expressions and to group characteristic structural displays based on the set of distinguished characteristic features.
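The core idea of the abstract, separating classes of facial-expression feature vectors with hyperplanes so that classification stays computationally cheap, can be illustrated with a minimal sketch. The sketch below is hypothetical: the emotion labels, feature dimensionality, and synthetic Gaussian clusters are illustrative assumptions, and a standard linear SVM (scikit-learn's LinearSVC) stands in for the authors' own hyperplane-synthesis procedure. In the paper's setting the feature vectors would be measurements derived from facial landmarks rather than random data.

```python
# Illustrative sketch only: hyperplane (linear) classification of
# facial-expression feature vectors.  Data is synthetic; in practice the
# vectors would be distances/angles computed from facial landmarks.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical emotion classes and feature dimensionality.
EMOTIONS = ["neutral", "happiness", "sadness", "anger"]
N_FEATURES = 20
SAMPLES_PER_CLASS = 100

# One Gaussian cluster of feature vectors per emotion class, standing in
# for the grouped classes of real mimic-display measurements.
X, y = [], []
for label, _ in enumerate(EMOTIONS):
    center = rng.normal(scale=3.0, size=N_FEATURES)
    X.append(center + rng.normal(scale=1.0, size=(SAMPLES_PER_CLASS, N_FEATURES)))
    y.extend([label] * SAMPLES_PER_CLASS)
X = np.vstack(X)
y = np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# A linear SVM builds one separating hyperplane per class (one-vs-rest),
# so both training and prediction remain computationally inexpensive.
clf = LinearSVC(max_iter=10_000)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("test accuracy:", accuracy_score(y_test, pred))
print("hyperplane weights shape:", clf.coef_.shape)  # (n_classes, n_features)
```

Because prediction reduces to a handful of dot products with the learned hyperplane weights, such a classifier can run on modest hardware, which is the practical point the abstract makes about avoiding specialized equipment.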



Author information

Correspondence to Iurii Krak.

Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Barmak, O., Kalyta, O., Krak, I., Manziuk, E., Kuznetsov, V. (2021). Model of the Facial Emotions Expressions Based on Grouping Classes of Feature Vectors. In: Babichev, S., Lytvynenko, V., Wójcik, W., Vyshemyrskaya, S. (eds) Lecture Notes in Computational Intelligence and Decision Making. ISDMCI 2020. Advances in Intelligent Systems and Computing, vol 1246. Springer, Cham. https://doi.org/10.1007/978-3-030-54215-3_5
