Distilling object detectors with mask-guided feature and relation-based knowledge
by Liang Zeng; Liyan Ma; Xiangfeng Luo; Yinsai Guo; Xue Chen
International Journal of Computational Science and Engineering (IJCSE), Vol. 27, No. 2, 2024

Abstract: Knowledge distillation (KD) is an effective technique for network compression and model accuracy enhancement in image classification, semantic segmentation, pre-trained language models, and so on. However, existing KD methods are specialised for image classification and cannot be applied effectively to object detection, owing to two limitations: the imbalance between foreground and background instances and the neglect of relation-based knowledge distillation. In this paper, we present a general mask-guided feature and relation-based knowledge distillation framework (MAR) consisting of two components, mask-guided distillation and relation-based distillation, to address these problems. Mask-guided distillation emphasises the student's learning of close-to-object features via multi-value masks, while relation-based distillation mimics the relational information between different feature pixels on the classification head. Extensive experiments show that our method achieves excellent AP improvements on both one-stage and two-stage detectors. Specifically, Faster R-CNN with a ResNet50 backbone achieves 40.6% mAP under the 1× schedule on the COCO dataset, which is 3.2% higher than the baseline and even surpasses the teacher detector.
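The two losses described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the exact mask values, feature layers, and loss weighting are assumptions. The sketch shows the general idea only — a multi-value mask that up-weights near-object feature pixels in a feature-imitation loss, and a relation loss that matches pairwise cosine similarities between feature pixels of student and teacher.

```python
import numpy as np

def mask_guided_feature_loss(f_student, f_teacher, mask):
    """Mask-weighted MSE between student and teacher feature maps.

    f_student, f_teacher: (C, H, W) feature maps.
    mask: (H, W) multi-value mask; larger values for pixels close to
    objects, smaller for background (hypothetical weighting scheme),
    so the student focuses on foreground-adjacent regions and the
    foreground/background imbalance is reduced.
    """
    diff = (f_student - f_teacher) ** 2
    return float((mask[None, :, :] * diff).sum() / mask.sum())

def relation_loss(f_student, f_teacher):
    """Match relational knowledge: pairwise similarities between pixels.

    Builds an (HW, HW) cosine-similarity matrix over feature pixels for
    each network and penalises the squared difference, so the student
    mimics the teacher's inter-pixel relations rather than raw values.
    """
    def pixel_relations(f):
        c, h, w = f.shape
        pixels = f.reshape(c, h * w).T                          # (HW, C)
        norms = np.linalg.norm(pixels, axis=1, keepdims=True) + 1e-8
        pixels = pixels / norms                                 # unit vectors
        return pixels @ pixels.T                                # (HW, HW)

    r_s = pixel_relations(f_student)
    r_t = pixel_relations(f_teacher)
    return float(((r_s - r_t) ** 2).mean())
```

In a training loop these terms would be added (with tuning weights) to the detector's usual classification and regression losses; both vanish when the student reproduces the teacher's features exactly.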

Online publication date: Mon, 11-Mar-2024
