
WSCFER: Improving Facial Expression Representations by Weak Supervised Contrastive Learning


Abstract:

The major challenge of Facial Expression Recognition (FER) is to learn class-discriminative representations, and existing works mainly address it by designing various classification networks at the class level. However, learning representations at the class level is limited because class discrimination among different facial expressions is inconspicuous. Thus, in this paper, we propose a Weak Supervised Contrastive learning FER (WSCFER) method that improves facial expression representations by simultaneously learning instance-level representations, which are highly complementary to the general class-level representations. Specifically, our proposed WSCFER consists of three components: a major task for FER classification, an auxiliary task for Weak Supervised Contrastive (WSC) learning that pulls augmented samples of the same image together while pushing apart instance samples from different classes, and a Partial Consistency Loss (PCL) for optimizing the two embedding spaces at both the class level and the instance level. We compare WSC with several state-of-the-art contrastive methods and find that it efficiently learns instance-level representations while avoiding overemphasis on irrelevant parts, which is crucial for FER. WSCFER achieves superior performance on several in-the-wild databases and also shows promising potential for learning representations under noisy annotations.
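To make the weak supervised contrastive idea concrete, the following is a minimal PyTorch sketch (not the authors' code) of a loss that matches the abstract's description: for each anchor, the positive is the other augmented view of the same image, negatives are samples whose expression label differs from the anchor's, and same-class samples from other images are neither pulled nor pushed. The function name `wsc_loss`, the temperature value, and the two-view batch layout are assumptions for illustration.

import torch
import torch.nn.functional as F

def wsc_loss(z1, z2, labels, temperature=0.1):
    """Illustrative weak supervised contrastive loss.

    z1, z2: (N, D) embeddings of two augmentations of the same N images.
    labels: (N,) expression labels used only to select negatives.
    """
    # Stack both views and L2-normalize so dot products are cosine similarities.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, D)
    labels = torch.cat([labels, labels], dim=0)               # (2N,)
    sim = z @ z.t() / temperature                              # (2N, 2N)

    n = z1.size(0)
    idx = torch.arange(2 * n, device=z.device)
    pos_idx = (idx + n) % (2 * n)                              # the other view of each image

    # Negatives: only pairs whose class labels differ.
    diff_class = (labels.unsqueeze(0) != labels.unsqueeze(1)).float()
    pos_sim = sim[idx, pos_idx]                                # (2N,)

    # InfoNCE-style ratio: positive pair over positive pair + different-class samples.
    neg_exp = (sim.exp() * diff_class).sum(dim=1)
    loss = -(pos_sim - torch.log(pos_sim.exp() + neg_exp))
    return loss.mean()

In this sketch, same-class instances are simply masked out of the denominator, which is one way to realize "weak" supervision: class labels decide what counts as a negative, but only augmented views of the same image are treated as positives.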
Date of Conference: 01-05 October 2023
Date Added to IEEE Xplore: 13 December 2023
Conference Location: Detroit, MI, USA
