
Stable Knowledge Transfer for Contrastive Distillation



Abstract:

Contrastive representation learning has proven effective for knowledge distillation. However, its dynamically updated representations largely impede improvements in student performance. We propose appending a compact but effective projection layer to both the teacher and student models. With this additional layer, the retrained teacher is able to transfer stable and sufficient knowledge. To make the student model more robust, we formulate a mutual category contrastive representation in the student space so that the student can correctly discriminate between different categories. Extensive experiments on image classification across various teacher-student combinations demonstrate the superiority of our proposed approach.
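The abstract only sketches the approach at a high level. As a rough illustration of the general idea, the following is a minimal PyTorch sketch of a compact projection head appended to a backbone, together with an InfoNCE-style contrastive distillation loss that aligns student projections with frozen teacher projections. The head architecture, embedding dimension, temperature, and loss formulation are all assumptions made here for illustration, not the paper's actual design; in particular, the mutual category contrastive term described in the abstract is not reproduced.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionHead(nn.Module):
    """Compact projection layer appended to a backbone.
    Hypothetical structure: the paper only states the layer is
    'compact but effective'."""
    def __init__(self, in_dim: int, out_dim: int = 128):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(in_dim, in_dim),
            nn.ReLU(inplace=True),
            nn.Linear(in_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalize so dot products become cosine similarities.
        return F.normalize(self.proj(x), dim=1)

def contrastive_distillation_loss(z_s: torch.Tensor,
                                  z_t: torch.Tensor,
                                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: each student projection is pulled toward the
    teacher projection of the same input; other samples in the batch
    serve as negatives. (Assumed formulation, not from the paper.)"""
    logits = z_s @ z_t.t() / temperature           # (B, B) similarity matrix
    targets = torch.arange(z_s.size(0), device=z_s.device)
    return F.cross_entropy(logits, targets)

# Usage sketch: project penultimate features of teacher and student,
# then compute the distillation loss against the frozen teacher targets.
B, d_t, d_s = 64, 2048, 512
f_t = torch.randn(B, d_t)                          # teacher features (frozen)
f_s = torch.randn(B, d_s, requires_grad=True)      # student features
head_t, head_s = ProjectionHead(d_t), ProjectionHead(d_s)
with torch.no_grad():
    z_t = head_t(f_t)                              # stable teacher targets
loss = contrastive_distillation_loss(head_s(f_s), z_t)
loss.backward()
```

Freezing the teacher-side projection during student training is one plausible reading of the "stable knowledge" claim: the contrastive targets stop drifting as training proceeds.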
Date of Conference: 14-19 April 2024
Date Added to IEEE Xplore: 18 March 2024
Conference Location: Seoul, Korea, Republic of
