
Scale-Disentangled and Uncertainty-Guided Alignment for Domain-Adaptive Object Detection


Abstract:

Unsupervised domain-adaptive object detection methods aim to transfer knowledge from a label-sufficient domain to an unlabeled domain. Most existing works minimize domain disparity at different levels through adversarial learning. However, adversarial learning does not account for its differing influence on under-aligned and well-aligned samples, as it matches the distinct distributions with a uniform weight. To address this issue, we design a novel scale-disentangled and uncertainty-guided alignment for domain-adaptive object detection (SDUGA), consisting of three main components: (1) a disentangled scale coarse module, which decouples scale information from global image features and performs individual cross-domain alignment for each scale by training domain classifiers in an adversarial manner; (2) a disentangled scale fine module, which generalizes the disentangled scale alignment to instance-level adaptation, reinforcing cross-domain distribution alignment at the multi-scale local instance level; and (3) uncertainty-guided coarse-to-fine attention alignment, which adaptively adjusts the weights of different samples by generating an uncertainty-guided attention map, forcing the detector to focus more on aligning under-aligned samples while avoiding misalignment of well-aligned ones. Extensive experiments on three challenging domain-shift object detection scenarios demonstrate that SDUGA achieves superior performance compared with state-of-the-art methods.
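The uncertainty-guided weighting idea from the abstract can be illustrated with a minimal sketch. Assume a binary domain classifier outputs, per sample, the probability `p` of belonging to the source domain: a confident classifier (low entropy) signals an under-aligned sample that should receive a larger alignment weight, while `p` near 0.5 (high entropy) signals a well-aligned sample. The function names and the specific `1 + (1 - H/H_max)` mapping below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def domain_entropy(p, eps=1e-8):
    """Binary entropy H(p) of domain-classifier outputs p in (0, 1)."""
    p = np.clip(p, eps, 1.0 - eps)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))

def uncertainty_weights(p):
    """Per-sample alignment weights in [1, 2].

    Under-aligned samples (classifier confident, low entropy) get
    weights near 2; well-aligned samples (p ~ 0.5, maximum entropy)
    get weights near 1, so they are not over-aligned.
    """
    h = domain_entropy(p)
    h_max = np.log(2.0)  # maximum binary entropy, at p = 0.5
    return 1.0 + (1.0 - h / h_max)

# Example: a confused classifier (p = 0.5) vs. confident ones.
p = np.array([0.5, 0.9, 0.99])
w = uncertainty_weights(p)
```

In an adversarial pipeline, such weights would scale the per-sample domain-classification loss so the detector concentrates its alignment effort on under-aligned samples.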
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 25, Issue: 12, December 2024)
Page(s): 19507 - 19521
Date of Publication: 29 August 2024

