Abstract:
Although methods such as lightweight structural design, model pruning (MP), and model quantization can reduce the difficulty of deploying deep-learning models for insulator defect (ID) detection, they significantly reduce detection accuracy. To address this issue, this article proposes a dynamic-focused knowledge distillation (DFKD) approach that constructs a knowledge transfer path from a large model to a lightweight small model. First, an important-sample focusing mechanism introduces dual focus weight factors and adaptive sample matching to encourage the student model to focus on high-quality difficult samples and to reduce the adverse effects of low-quality simple samples. Second, a temperature dynamic learning mechanism uses an adversarial training process to construct soft labels of appropriate difficulty at different stages of distillation training, which improves the student model's ability to learn and generalize higher-order knowledge. Finally, this article combines DFKD with MP to establish an insulator defect detection model, DFKD-MP-YOLO (you only look once), suitable for edge devices with different computing resources. Experiments show that the proposed DFKD method outperforms existing knowledge distillation (KD) methods in insulator defect detection. Moreover, compared with existing methods (e.g., BiFusion-YOLOv3, InsuDet, and ID-YOLO), DFKD-MP-YOLO not only has a lighter structure but also achieves higher accuracy and faster speed.
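For intuition, the sketch below shows temperature-scaled knowledge distillation combined with a focal-style per-sample weight, in the spirit of the focusing mechanism summarized above. It is not the paper's exact formulation: the function names, the gamma parameter, and the specific weighting scheme (teacher confidence multiplied by a focal term on student confidence) are illustrative assumptions.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # Standard temperature-scaled distillation: soften both class
    # distributions, then take the per-sample KL divergence.
    log_p_s = F.log_softmax(student_logits / temperature, dim=-1)
    p_t = F.softmax(teacher_logits / temperature, dim=-1)
    kl = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return kl * temperature ** 2

def focused_kd_loss(student_logits, teacher_logits, temperature=4.0, gamma=2.0):
    # Hypothetical focus weighting: emphasize samples where the teacher is
    # confident (high quality) but the student is not (difficult), and
    # suppress easy samples. This generic focal-style weight stands in for
    # the paper's dual focus weight factors, whose exact form is not given here.
    per_sample = kd_loss(student_logits, teacher_logits, temperature)
    teacher_conf = F.softmax(teacher_logits, dim=-1).max(dim=-1).values
    student_conf = F.softmax(student_logits, dim=-1).max(dim=-1).values
    weight = teacher_conf * (1.0 - student_conf) ** gamma
    return (weight * per_sample).mean()

# Usage: distill a teacher's class logits into a student.
student_logits = torch.randn(8, 20, requires_grad=True)  # batch of 8, 20 classes
teacher_logits = torch.randn(8, 20)
loss = focused_kd_loss(student_logits, teacher_logits)
loss.backward()

A dynamic-temperature variant in the spirit of the paper's second mechanism would treat the temperature as a quantity updated adversarially over the course of training rather than a fixed constant; that scheduling is omitted from this sketch.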
Published in: IEEE Transactions on Instrumentation and Measurement (Volume: 73)