Abstract:
Contrastive representation learning has proven effective for knowledge distillation. However, its dynamically updated representations largely impede the improvement of student performance. We propose to append a compact but effective projection layer to both the teacher and student models. With this additional layer, the retrained teacher is able to transfer stable and sufficient knowledge. To make the student model more robust, a mutual category contrastive representation in the student space is formulated so that the student can correctly discriminate between different categories. Extensive experiments on image classification under various teacher-student combinations demonstrate the superiority of our proposed approach.
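The abstract does not give implementation details, but the core idea — a compact projection head appended to teacher and student, with a contrastive objective aligning their embeddings — can be sketched generically. The layer sizes, the two-linear-layer head, and the InfoNCE-style loss below are assumptions for illustration, not the paper's exact design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionHead(nn.Module):
    """Compact projection layer appended to a backbone
    (hypothetical architecture: linear-ReLU-linear, L2-normalized output)."""
    def __init__(self, in_dim: int, out_dim: int = 128):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(in_dim, in_dim),
            nn.ReLU(inplace=True),
            nn.Linear(in_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize so that dot products are cosine similarities.
        return F.normalize(self.proj(x), dim=-1)

def contrastive_kd_loss(z_s: torch.Tensor, z_t: torch.Tensor,
                        temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: each student embedding should match the teacher
    embedding of the same sample against the other samples in the batch."""
    logits = z_s @ z_t.t() / temperature       # (B, B) similarity matrix
    targets = torch.arange(z_s.size(0))        # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Usage sketch with random features standing in for backbone outputs.
torch.manual_seed(0)
feat_s = torch.randn(8, 64)    # student backbone features (dims assumed)
feat_t = torch.randn(8, 256)   # teacher backbone features (dims assumed)
head_s, head_t = ProjectionHead(64), ProjectionHead(256)
loss = contrastive_kd_loss(head_s(feat_s), head_t(feat_t))
```

In this sketch, training would minimize `loss` with respect to the student backbone and both heads; the abstract's "retrained teacher" suggests the teacher-side head is fitted in a separate pass before distillation.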
Published in: ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 14-19 April 2024
Date Added to IEEE Xplore: 18 March 2024