Abstract
Affective computing, or automatic affect sensing, has attracted extensive interest from researchers in machine learning and pattern recognition. Most previous research has focused on face detection and emotion recognition, whereas our work explores facial expression intensity estimation, which concerns the dynamic changes of a face. The CK+ database and the Real-world Affective Face Database (RAF-DB) are used to implement and evaluate the algorithms in this paper. To address the intensity estimation problem, both classification and ranking algorithms are trained and tested on intensity levels. The performance of five feature representations is first compared by classification accuracy, and the best-performing representation is then used as the input to the ranking model. Learning-to-rank techniques from information retrieval are applied to the intensity ranking task: RankSVM and RankBoost serve as frameworks for estimating ranking scores over image sequences, and the scoring results are evaluated with standard information-retrieval metrics. The algorithms are systematically compared to derive an optimal model for the ranking task.
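The pairwise learning-to-rank formulation underlying RankSVM can be illustrated with a minimal sketch. This is not the paper's implementation: the feature dimension, toy "expression sequence" data, and hyper-parameters below are invented for illustration, and the model is a plain linear RankSVM trained by subgradient descent on the pairwise hinge loss.

```python
import numpy as np

def pairwise_transform(X, y):
    """Build difference vectors x_i - x_j labelled by sign(y_i - y_j)."""
    diffs, labels = [], []
    n = len(y)
    for i in range(n):
        for j in range(n):
            if y[i] != y[j]:
                diffs.append(X[i] - X[j])
                labels.append(1.0 if y[i] > y[j] else -1.0)
    return np.array(diffs), np.array(labels)

def train_ranksvm(X, y, C=1.0, lr=0.01, epochs=200, seed=0):
    """Linear RankSVM: subgradient descent on the pairwise hinge loss."""
    D, s = pairwise_transform(X, y)
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])
    for _ in range(epochs):
        margins = s * (D @ w)
        viol = margins < 1.0  # pairs that violate the ranking margin
        if viol.any():
            grad = w - C * (s[viol, None] * D[viol]).mean(axis=0)
        else:
            grad = w  # only the regularizer remains
        w -= lr * grad
    return w

# Toy sequence: intensity grows along a latent direction in feature space.
rng = np.random.default_rng(1)
direction = rng.normal(size=8)
levels = np.repeat(np.arange(5), 4)  # 5 intensity levels, 4 frames each
X = levels[:, None] * direction + 0.05 * rng.normal(size=(20, 8))

w = train_ranksvm(X, levels)
scores = X @ w  # higher intensity should receive higher ranking scores
```

The key design choice is that RankSVM never predicts absolute intensity labels; it only learns a scoring function whose ordering over frames of a sequence matches the ordering of their intensity levels, which is why ranking metrics rather than classification accuracy are used for evaluation.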
Acknowledgement
This work was partially supported by the NSFC (National Natural Science Foundation of China) under Grants No. 61375031, No. 61573068, No. 61471048, and No. 61273217, and by the Fundamental Research Funds for the Central Universities under Grant No. 2014ZD03-01. This work was also supported by the Beijing Nova Program, the CCF-Tencent Open Research Fund, and the Program for New Century Excellent Talents in University.
Copyright information
© 2016 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Gao, Y., Li, S., Deng, W. (2016). Intensity Estimation of the Real-World Facial Expression. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_7
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-3001-7
Online ISBN: 978-981-10-3002-4
eBook Packages: Computer Science, Computer Science (R0)