Abstract
Emotion and facial expression recognition are common topics in artificial intelligence. In particular, most efforts focus on building models that classify images into the six universal emotions. In this paper, we present the first attempt to classify among 33 different facial expressions. We define and train a simple convolutional neural network with few intermediate layers to recognize the 32 facial expressions (plus the neutral one) contained in an extension of UIBVFED, a virtual facial expression dataset. We obtained a global accuracy of 0.8, which is comparable to the 0.79 accuracy we obtained when training the network on only the six universal emotions. Taking advantage of this trained model, we then explore classifying images into the six universal emotions by translating the facial expression predicted by the model into its associated emotion. With this novel approach, we reach an accuracy of 0.95, comparable to the best results reported in the literature, while using a very simple neural network.
Acknowledgments
The authors acknowledge the Project EXPLainable Artificial INtelligence systems for health and well-beING (EXPLAINING), funded by PID2019-104829RA-I00 / MCIN / AEI / https://doi.org/10.13039/501100011033. We also thank the University of the Balearic Islands and the Department of Mathematics and Computer Science for their support.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Mascaró-Oliver, M., Mas-Sansó, R., Amengual-Alcover, E., Roig-Maimó, M.F. (2024). On the Convenience of Using 32 Facial Expressions to Recognize the 6 Universal Emotions. In: Rocha, A., Adeli, H., Dzemyda, G., Moreira, F., Colla, V. (eds) Information Systems and Technologies. WorldCIST 2023. Lecture Notes in Networks and Systems, vol 800. Springer, Cham. https://doi.org/10.1007/978-3-031-45645-9_60
DOI: https://doi.org/10.1007/978-3-031-45645-9_60
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-45644-2
Online ISBN: 978-3-031-45645-9
eBook Packages: Intelligent Technologies and Robotics (R0)