
PA-Former: Learning Prior-Aware Transformer for Remote Sensing Building Change Detection


Abstract:

Building change detection (BCD) is significant for urban planning and environmental protection. In view of the interclass similarity and intraclass difference of building changes in complex built-up areas, specialized solutions have been introduced in BCD. Mainstream methods include extracting building prior information in advance and enhancing long-range context information. These methods often require additional processing and ignore the construction of cross-temporal context information, resulting in deficiencies in CD performance and efficiency. Therefore, an end-to-end PA-Former for BCD is proposed in this letter, which combines prior extraction and contextual fusion by learning a prior-aware Transformer. Specifically, the PA-Former adopts a prior-feature extractor (PFE) to capture prior and deep features from the bitemporal images, in which a prior interpreter (PI) is integrated to obtain prior structural information of buildings. In addition, a prior-aware Transformer module (PATM) is designed to obtain contextual tokens with spatiotemporal information from the prior features and integrate them into the deep features. Extensive experiments are conducted against state-of-the-art methods for comparison. In particular, the PA-Former surpasses the baselines with an F1 of 88.79% on the BCDD dataset and 85.32% on the Google dataset.
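The following is a minimal sketch (not the authors' implementation) of the pipeline the abstract describes, with all module names, layer choices, and dimensions assumed for illustration: a prior-feature extractor (PFE) yields deep and prior features for each temporal image, and a prior-aware Transformer module (PATM) turns the bitemporal prior features into contextual tokens that are fused back into the deep features before a change-prediction head.

# Hypothetical sketch of the PA-Former pipeline described in the abstract.
# Module names, layer choices, and dimensions are assumptions for illustration.
import torch
import torch.nn as nn


class PriorFeatureExtractor(nn.Module):
    """Assumed PFE: a shared CNN producing a deep feature map and a
    prior (building-structure) feature map per temporal image."""
    def __init__(self, in_ch=3, dim=64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_ch, dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Stand-in for the prior interpreter (PI): derives structural priors.
        self.prior_head = nn.Conv2d(dim, dim, 1)

    def forward(self, x):
        deep = self.backbone(x)
        prior = self.prior_head(deep)
        return deep, prior


class PriorAwareTransformer(nn.Module):
    """Assumed PATM: tokenizes bitemporal prior features, models
    spatiotemporal context with a Transformer encoder, and injects the
    resulting tokens into the deep features via cross-attention."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, heads, batch_first=True),
            num_layers=2)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, deep1, deep2, prior1, prior2):
        b, c, h, w = deep1.shape
        flat = lambda t: t.flatten(2).transpose(1, 2)        # (B, HW, C)
        # Contextual tokens built from both temporal prior features.
        tokens = self.encoder(torch.cat([flat(prior1), flat(prior2)], dim=1))
        queries = torch.cat([flat(deep1), flat(deep2)], dim=1)
        fused, _ = self.cross_attn(queries, tokens, tokens)
        fused = fused + queries                               # residual fusion
        f1, f2 = fused.chunk(2, dim=1)
        unflat = lambda t: t.transpose(1, 2).reshape(b, c, h, w)
        return unflat(f1), unflat(f2)


class PAFormerSketch(nn.Module):
    """End-to-end sketch: bitemporal images -> change probability map."""
    def __init__(self, dim=64):
        super().__init__()
        self.pfe = PriorFeatureExtractor(dim=dim)
        self.patm = PriorAwareTransformer(dim=dim)
        self.head = nn.Conv2d(2 * dim, 1, 1)

    def forward(self, img1, img2):
        d1, p1 = self.pfe(img1)
        d2, p2 = self.pfe(img2)
        f1, f2 = self.patm(d1, d2, p1, p2)
        change = self.head(torch.cat([f1, f2], dim=1))
        return torch.sigmoid(change)   # low-resolution change map


if __name__ == "__main__":
    model = PAFormerSketch()
    t1, t2 = torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64)
    print(model(t1, t2).shape)  # torch.Size([1, 1, 16, 16])

In this sketch the change map is predicted at reduced resolution; a real BCD network would upsample it back to the input size and train with a pixel-wise change-mask loss.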
Published in: IEEE Geoscience and Remote Sensing Letters ( Volume: 19)
Article Sequence Number: 6515305
Date of Publication: 19 August 2022



