Automatic reaction emotion estimation in a human–human dyadic setting using Deep Neural Networks

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

While Facial Expression Recognition (FER) has become a broadly applied technology for tracking individual people’s emotions, it is still rarely applied to estimating emotion in human-to-human dyadic interaction. This paper describes a study in which FER is applied to a dyadic, video-mediated interaction to collect facial interaction data for predicting the emotions of one of the interlocutors. To this end, we used the histogram of oriented gradients (HOG) algorithm to detect human faces in every video frame, and then a Deep Neural Network (DNN) model to recognize the facial expressions of the two people conversing in the videos. We measured these facial patterns as indicators of both interlocutors’ emotions throughout the interaction. We then trained a Long Short-Term Memory (LSTM) model to estimate one person’s emotions from the video. We performed the analysis on videos of a specific psychiatrist and his patients, and estimated the patients’ emotions. This work shows how our multi-stage DNN (Mini-Xception) and LSTM models can predict reaction emotions from the patients’ facial expressions during the interaction. We believe the proposed method can be applied in the future to generating facial expressions for virtual characters or social robots as they interact with humans.
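To make the pipeline concrete, the sketch below shows the three stages in the order the abstract describes them: HOG-based face detection (here via dlib's frontal-face detector, see Note 2), per-frame expression classification with a Mini-Xception-style network, and an LSTM over the resulting emotion sequence. This is a minimal illustration under stated assumptions, not the authors' exact implementation: the weights file, input video path, 48x48 grayscale input, seven FER-2013 emotion classes, and the window length T are our assumptions.

    # Minimal sketch of the three-stage pipeline, assuming dlib, OpenCV,
    # and TensorFlow/Keras. File names and layer sizes are illustrative.
    import cv2
    import dlib
    import numpy as np
    from tensorflow.keras.models import Sequential, load_model
    from tensorflow.keras.layers import LSTM, Dense

    EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

    detector = dlib.get_frontal_face_detector()     # HOG + linear SVM detector
    fer_model = load_model("mini_xception.h5")      # hypothetical trained weights

    def frame_emotions(frame):
        """Return emotion probabilities for the first face in a BGR frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray, 1)                   # upsample once for small faces
        if not faces:
            return None
        f = faces[0]
        crop = gray[max(f.top(), 0):f.bottom(), max(f.left(), 0):f.right()]
        crop = cv2.resize(crop, (48, 48)).astype("float32") / 255.0
        return fer_model.predict(crop[None, :, :, None])[0]   # shape: (7,)

    # Stages 1-2: one emotion vector per analyzed frame of the interaction.
    cap = cv2.VideoCapture("dyadic_session.mp4")    # hypothetical input video
    sequence = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        probs = frame_emotions(frame)
        if probs is not None:
            sequence.append(probs)
    cap.release()

    # Stage 3: an LSTM mapping a window of one interlocutor's emotion vectors
    # to the other's reaction emotion (window length T is an assumption).
    T = 30
    reaction_model = Sequential([
        LSTM(64, input_shape=(T, len(EMOTIONS))),
        Dense(len(EMOTIONS), activation="softmax"),
    ])
    reaction_model.compile(optimizer="adam",
                           loss="categorical_crossentropy",
                           metrics=["accuracy"])

Sliding a length-T window over "sequence" and training "reaction_model" against emotion labels extracted from the other interlocutor's frames mirrors the estimation task described above.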


Notes

  1. https://www.youtube.com/channel/UClHVl2N3jPEbkNJVx-ItQIQ.

  2. http://dlib.net/.


Author information

Corresponding author

Correspondence to Gholamreza Anbarjafari.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work has been supported by the EU Mobilitas Pluss grant (MOBTT90) of Dr. Pia Tikka, Enactive Virtuality Lab, Tallinn University (2017–2022).


Cite this article

Sham, A.H., Tikka, P., Lamas, D. et al. Automatic reaction emotion estimation in a human–human dyadic setting using Deep Neural Networks. SIViP 17, 527–534 (2023). https://doi.org/10.1007/s11760-022-02257-5
