Collaborative Knowledge Distillation

Abstract:

Existing research on knowledge distillation has focused primarily on helping student networks acquire the complete knowledge imparted by teacher networks. However, recent studies have shown that highly accurate networks do not necessarily make good teachers, and that distillation performance is positively correlated with teacher prediction uncertainty. Motivated by this finding, this paper analyzes why the teacher network affects distillation performance, involves the student network actively in the distillation process, and lets the student assist the teacher in distilling knowledge suited to the student's learning. On this basis, a novel approach called Collaborative Knowledge Distillation (CKD) is introduced, founded on the principle of "Tailoring the Teaching to the Individual". On CIFAR-100, the proposed method improves student accuracy by an average of 3.42% over the baseline and by an average of 1.71% over the classical Knowledge Distillation (KD) method. On ImageNet, it improves the students' Top-1 accuracy by 2.04%.
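
For reference, the classical KD baseline that CKD is compared against trains the student on a weighted sum of a softened KL-divergence term against the teacher's predictions and the usual hard-label cross-entropy (Hinton et al., 2015). Below is a minimal PyTorch sketch of that baseline loss; the function name and hyperparameter values are illustrative and not taken from the paper:

    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels,
                temperature=4.0, alpha=0.9):
        # Hypothetical helper illustrating the classical KD loss,
        # not the CKD method proposed in this paper.

        # Soften both output distributions with the temperature.
        soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=1)

        # KL divergence between softened teacher and student predictions;
        # the T^2 factor keeps gradient magnitudes comparable across
        # temperatures.
        distill = F.kl_div(log_soft_student, soft_teacher,
                           reduction="batchmean") * temperature ** 2

        # Standard cross-entropy on the ground-truth labels.
        ce = F.cross_entropy(student_logits, labels)

        # Weighted combination of the soft and hard terms.
        return alpha * distill + (1.0 - alpha) * ce

CKD's contribution, per the abstract, is to make the student an active participant in shaping what the teacher distills, rather than having the student passively match the teacher's full output as in the sketch above.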
Page(s): 7601 - 7613
Date of Publication: 13 March 2024
