Toward Facial Expression Recognition in the Wild via Noise-Tolerant Network | IEEE Journals & Magazine | IEEE Xplore

Abstract:

Facial Expression Recognition (FER) has recently emerged as a crucial area in Human-Computer Interaction (HCI) systems for understanding the user's inner state and intention. However, feature and label noise constitute the major challenge for FER in the wild, owing to the ambiguity of facial expressions worsened by low-quality images. To deal with this problem, in this paper we propose a simple but effective Facial Expression Noise-tolerant Network (FENN), which exploits inter-class correlations to mitigate the ambiguity that typically arises between morphologically similar classes. Specifically, FENN leverages a multivariate normal distribution to model such correlations at the final hidden layer of the neural network, suppressing the heteroscedastic uncertainty caused by inter-class label noise. Furthermore, because the discriminative ability of deep features is weakened by the subtle differences between expressions and the presence of feature noise, FENN employs a feature-noise mitigation module that extracts compact intra-class feature representations under feature noise while preserving the intrinsic inter-class relationships. We conduct extensive experiments to evaluate the effectiveness of FENN on both originally annotated images and synthetically noise-annotated images from the RAF-DB, AffectNet, and FERPlus in-the-wild facial expression datasets. The results show that FENN significantly outperforms state-of-the-art FER methods.
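The core idea of modeling logits with a multivariate normal distribution to absorb heteroscedastic label noise can be illustrated with a minimal Monte Carlo sketch. This is not the authors' implementation; the function name, the NumPy setting, and the use of a Cholesky factor for the covariance are assumptions made purely for illustration. Sampled logits are pushed through a softmax and the per-sample class probabilities are averaged before taking the negative log, so classes whose logits co-vary (e.g. morphologically similar expressions) share probability mass under noise:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def heteroscedastic_nll(mu, scale_tril, label, n_samples=100, rng=None):
    """Monte Carlo negative log-likelihood when logits are modeled as
    a multivariate normal N(mu, L L^T).

    mu:         (C,) mean logits for one sample
    scale_tril: (C, C) lower-triangular Cholesky factor L of the
                (learned) inter-class covariance
    label:      integer class index of the (possibly noisy) annotation
    """
    rng = np.random.default_rng(rng)
    n_classes = mu.shape[0]
    eps = rng.standard_normal((n_samples, n_classes))
    logits = mu + eps @ scale_tril.T       # samples from N(mu, L L^T)
    probs = softmax(logits)                # per-sample class probabilities
    # Average probabilities across samples, then take -log.
    return -np.log(probs[:, label].mean() + 1e-12)

# Hypothetical 3-class example: class 0 has the largest mean logit,
# so its NLL should be lower than that of a wrong label.
mu = np.array([2.0, 0.5, 0.1])
L = 0.1 * np.eye(3)                        # near-deterministic logits
loss_correct = heteroscedastic_nll(mu, L, label=0, rng=0)
loss_wrong = heteroscedastic_nll(mu, L, label=2, rng=0)
```

In a trainable version, `mu` and `scale_tril` would both be network outputs; a larger learned covariance between two classes lets the model express "these labels are often confused" instead of forcing the mean logits to fit noisy annotations.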
Page(s): 2033 - 2047
Date of Publication: 07 November 2022
