Abstract:
Data are an essential component of the model-training pipeline and largely determine a model's performance. However, some tasks may lack sufficient data that meet their requirements. In this paper, we introduce a knowledge distillation-based approach that mitigates the drawbacks of data scarcity. Specifically, we propose a method that boosts a model's pixel-domain performance by exploiting compressed-domain knowledge through cross distillation between the two modalities. To evaluate our approach, we conduct experiments on two computer vision tasks: object detection and recognition. The results indicate that, via our approach, compressed-domain features can be leveraged for a pixel-domain task when data are scarce or not fully available due to privacy or copyright issues.
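The abstract does not reproduce the distillation objective itself, but the core mechanism it builds on is the standard soft-target knowledge distillation loss (Hinton et al.). The sketch below shows that generic loss in NumPy; the roles assigned to the teacher (compressed-domain model) and student (pixel-domain model) are an illustrative reading of the abstract, not the paper's exact cross-distillation formulation, and all function names here are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax, stabilized by subtracting the row max.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student distributions.

    In a cross-domain setup like the one described, the teacher logits
    would come from a model fed compressed-domain features and the
    student from the pixel-domain model (an assumed mapping, for
    illustration only).
    """
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student predictions
    # The T^2 factor (Hinton et al.) keeps gradient magnitudes comparable
    # across temperature settings.
    return float(T * T * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())
```

In practice this term is typically combined with the ordinary supervised loss on ground-truth labels, weighted by a mixing coefficient.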
Date of Conference: 19-22 May 2024
Date Added to IEEE Xplore: 02 July 2024