Unsupervised Prototype-Wise Contrastive Learning for Domain Adaptive Semantic Segmentation in Remote Sensing Image | IEEE Journals & Magazine | IEEE Xplore


Abstract:

Labeling data in the field of remote sensing is time-consuming and labor-intensive, making domain adaptation between different domains an urgently needed solution. To address the domain gap between diverse datasets in remote sensing, numerous methods tailored for domain adaptation in high-resolution remote-sensing imagery (RSI) have emerged. Most existing methods focus on reducing the domain gap at either the feature level or the pixel level, overlooking the connection between the two. To tackle this issue, we introduce a prototype-wise contrastive feature alignment (PCFA) paradigm aimed at bridging the representations at the feature and pixel levels. Through dynamic updating, we aggregate prototype information across different mini-batches and employ an optimal transport mechanism to let the prototype feature distribution guide the learning of target-domain features. We conduct extensive domain adaptation semantic segmentation (DASS) experiments on the ISPRS Vaihingen and Potsdam datasets, achieving an improvement of about 4%–5% in mean intersection over union (mIoU) over previous methods using the DeepLabV2 framework.
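The abstract names two ingredients without implementation details: class prototypes updated dynamically across mini-batches, and an optimal transport mechanism that matches target-domain features to the prototype distribution. A minimal NumPy sketch of how these two pieces could fit together is shown below; the exponential-moving-average update, the entropic (Sinkhorn) solver, and all function names and hyperparameters are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def update_prototypes(protos, feats, labels, momentum=0.9):
    """Hypothetical EMA update of per-class prototypes from one mini-batch.

    protos: (C, D) running prototypes; feats: (N, D) features;
    labels: (N,) class indices present in the batch.
    """
    protos = protos.copy()
    for c in np.unique(labels):
        batch_mean = feats[labels == c].mean(axis=0)
        protos[c] = momentum * protos[c] + (1 - momentum) * batch_mean
    return protos

def sinkhorn_plan(cost, eps=0.05, n_iters=200):
    """Entropy-regularized optimal transport between uniform marginals.

    cost: (N, C) pairwise cost between target features and prototypes.
    Returns a transport plan P with P.sum() == 1; column marginals are
    exact after the final v-update, row marginals converge with iterations.
    """
    n, m = cost.shape
    cost = cost / cost.max()          # normalize scale for numerical stability
    K = np.exp(-cost / eps)
    a, b = np.ones(n) / n, np.ones(m) / m
    v = np.ones(m) / m
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

# Illustrative usage: soft-assign unlabeled target features to prototypes.
rng = np.random.default_rng(0)
protos = rng.normal(size=(3, 8))                  # 3 classes, 8-dim features
target_feats = rng.normal(size=(16, 8))           # unlabeled target batch
cost = np.linalg.norm(target_feats[:, None] - protos[None], axis=-1)
plan = sinkhorn_plan(cost)
pseudo_labels = plan.argmax(axis=1)               # prototype guiding each feature
```

The transport plan distributes target features over prototypes under uniform marginal constraints, which is one common way to keep pseudo-assignments balanced across classes; whether the paper uses exactly this entropic formulation is not stated in the abstract.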
Article Sequence Number: 5626614
Date of Publication: 17 November 2023

