
Adaptive Refining-Aggregation-Separation Framework for Unsupervised Domain Adaptation Semantic Segmentation



Abstract:

Unsupervised domain adaptation has attracted widespread attention as a promising way to sidestep the labeling difficulties of semantic segmentation tasks. It trains a segmentation network for unlabeled real target images using easily obtained, labeled synthetic source images. To improve performance, clustering is used to obtain domain-invariant feature representations. However, most clustering-based methods indiscriminately cluster all features mapped by category from both domains, causing centroid shift and hindering the generation of discriminative features. We propose a novel clustering-based method built on an adaptive refining-aggregation-separation framework, which learns discriminative features by designing different adaptive schemes for different domains and features. The clustering does not require any tunable thresholds. To estimate more accurate domain-invariant centroids, we design different ways to guide the adaptive refinement of features from each domain. A critic is proposed to directly evaluate the confidence of target features, compensating for the absence of target labels. We introduce a domain-balanced aggregation loss and two adaptive separation losses, one based on distance and one on similarity, which combine with the refinement strategy to discriminate clustered features and improve segmentation performance. Experimental results on the GTA5 → Cityscapes and SYNTHIA → Cityscapes benchmarks show that our method outperforms existing state-of-the-art methods.
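The abstract names three coupled objectives (refinement, aggregation, separation) but the page carries no implementation details. The sketch below is a minimal, hypothetical PyTorch illustration of the general idea only: per-class centroids averaged with equal weight across the two domains (a domain-balanced aggregation), target pixels weighted by critic-style confidence scores in place of the missing labels, and a distance-based separation term pushing centroids apart. Every name, the fixed hinge margin, and the loss weighting are assumptions for illustration, not the paper's threshold-free formulation.

```python
import torch
import torch.nn.functional as F

def class_centroids(feats, labels, num_classes, weights=None):
    """Per-class (weighted) mean of (N, D) features; weights is an optional
    (N,) confidence vector, e.g. critic scores for target pixels."""
    if weights is None:
        weights = torch.ones(feats.size(0), device=feats.device)
    sums = torch.zeros(num_classes, feats.size(1), device=feats.device)
    counts = torch.zeros(num_classes, device=feats.device)
    sums.index_add_(0, labels, feats * weights.unsqueeze(1))
    counts.index_add_(0, labels, weights)
    valid = counts > 0                       # classes actually observed
    sums[valid] /= counts[valid].unsqueeze(1)
    return sums, valid

def aggregation_loss(src_feats, src_labels, tgt_feats, tgt_pseudo, tgt_conf,
                     num_classes):
    """Pull features toward centroids averaged with equal weight per domain.
    Target pixels have no labels, so pseudo-labels weighted by critic
    confidences (tgt_conf) stand in for them."""
    c_src, v_src = class_centroids(src_feats, src_labels, num_classes)
    c_tgt, v_tgt = class_centroids(tgt_feats, tgt_pseudo, num_classes, tgt_conf)
    shared = 0.5 * (c_src + c_tgt)           # domain-balanced centroid
    valid = v_src & v_tgt
    keep_s = valid[src_labels]               # drop classes missing a centroid
    keep_t = valid[tgt_pseudo]
    loss_s = F.mse_loss(src_feats[keep_s], shared[src_labels][keep_s])
    loss_t = F.mse_loss(tgt_feats[keep_t], shared[tgt_pseudo][keep_t])
    return loss_s + loss_t, shared, valid

def separation_loss(centroids, valid, margin=1.0):
    """Hinge penalty keeping centroids of different classes >= margin apart."""
    c = centroids[valid]
    dists = torch.cdist(c, c)                # pairwise Euclidean distances
    off_diag = ~torch.eye(c.size(0), dtype=torch.bool, device=c.device)
    return F.relu(margin - dists[off_diag]).mean()

# Toy usage with random embeddings (D = 16, 19 Cityscapes classes).
N, D, C = 512, 16, 19
src_f, tgt_f = torch.randn(N, D), torch.randn(N, D)
src_y = torch.randint(0, C, (N,))
tgt_y = torch.randint(0, C, (N,))            # pseudo-labels in practice
conf = torch.rand(N)                         # critic-style confidence scores
agg, shared, valid = aggregation_loss(src_f, src_y, tgt_f, tgt_y, conf, C)
sep = separation_loss(shared, valid)
total = agg + 0.1 * sep                      # hypothetical loss weighting
```

Note that the paper states its clustering requires no tunable thresholds and that its separation losses are adaptive; the fixed margin above is only a stand-in to keep the sketch self-contained.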
Page(s): 3822 - 3832
Date of Publication: 08 February 2023
