Abstract:
The Unsupervised Domain Adaptive Object Detection (DAOD) task can alleviate the domain shift problem between source and target domains, but it requires training models on labeled source and unlabeled target domain data jointly. However, due to data privacy protection, the source domain data is often inaccessible, which poses significant challenges for the DAOD task. Hence, the Source-Free Object Detection (SFOD) task has been developed, which aims to fine-tune a pre-trained source model using only unlabeled target domain data. Most existing SFOD methods are based on pseudo labeling within a student-teacher framework, where the teacher model is an Exponential Moving Average (EMA) of the student models at different time steps. However, these methods suffer from a knowledge bias problem caused by class imbalance, so a fixed EMA update rate is no longer suitable for all classes. For high-quality classes, a fast EMA rate accelerates knowledge updating and promotes model convergence, while for low-quality classes, a fast EMA rate accelerates the accumulation of knowledge bias and can lead to the collapse of those categories. To solve this problem, we propose a novel SFOD method called Slow-Fast Adaptation, which develops two teacher models, a slow teacher and a fast teacher, to jointly guide student training. The slow and fast teacher models provide richer supervision information and complement each other. Experiments on four benchmark datasets show that our method achieves state-of-the-art results and even outperforms DAOD methods in some cases, which demonstrates the effectiveness of our method on the SFOD task.
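The slow/fast teacher idea described above can be illustrated with a minimal sketch of the EMA parameter update. Everything here is an assumption for illustration: the abstract does not specify the momentum values, and real implementations would operate on model parameters (e.g., PyTorch tensors) rather than plain floats; the `ema_update` helper name is hypothetical.

```python
import copy

def ema_update(teacher, student, momentum):
    """In-place EMA: teacher <- momentum * teacher + (1 - momentum) * student."""
    for i, (t, s) in enumerate(zip(teacher, student)):
        teacher[i] = momentum * t + (1.0 - momentum) * s

# Toy "weights" as a flat parameter list (stand-in for model parameters).
student = [1.0, 2.0, 3.0]
slow_teacher = copy.deepcopy(student)
fast_teacher = copy.deepcopy(student)

# Hypothetical update rates: the slow teacher uses a higher momentum
# (changes slowly), the fast teacher a lower one (tracks the student closely).
SLOW_M, FAST_M = 0.999, 0.9

# After one (mock) student training step, both teachers are updated:
student = [2.0, 3.0, 4.0]
ema_update(slow_teacher, student, SLOW_M)
ema_update(fast_teacher, student, FAST_M)
# fast_teacher[0] is now 1.1, slow_teacher[0] only 1.001:
# the fast teacher absorbs new student knowledge much more quickly.
```

Under this sketch, the fast teacher would quickly propagate fresh knowledge for well-learned classes, while the slow teacher would damp the accumulation of biased pseudo labels for poorly-learned ones, and the two jointly supervise the student.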
Date of Conference: 15-19 July 2024
Date Added to IEEE Xplore: 30 September 2024