PCDASNet: Position-Constrained Differential Attention Siamese Network for Building Damage Assessment


Abstract:

Sudden natural and man-made disasters threaten human life and property, and real-time semantic segmentation of high-resolution remote sensing images is crucial for disaster damage assessment. In recent years, with the wide adoption of high spatial resolution (HSR) remote sensing imagery and deep learning (DL)-based semantic change detection, acquiring information on damaged areas has become increasingly convenient and accurate. However, existing methods remain black boxes: they lack interpretability, do not embed prior knowledge (such as building positioning information), and make little use of damage conditions in a building's surroundings, so the automatically learned feature representations still leave room for improvement. To address these problems, we propose the position-constrained differential attention Siamese network (PCDASNet). The main idea is to merge building extraction and disaster damage assessment into a cascaded framework, improving building damage recognition under the constraint of building positioning information. In particular, the proposed differential attention module (DAM) adaptively extracts change information corresponding to buildings and their surroundings from dual-temporal images, with interpretability and a theoretical guarantee, which enables prior positioning knowledge to be integrated into the network architecture. On the building damage dataset, the method achieves a test F1 score above 73%, exceeding the compared baseline methods, and it also outperforms several state-of-the-art methods in visual comparisons.
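The abstract does not specify how the DAM or the position constraint is implemented. As a rough illustration of the idea it describes, the following is a minimal sketch, assuming PyTorch, of a differential attention block over bitemporal Siamese features that can be gated by a building-position prior; the class name `DifferentialAttention`, its signature, and all layer choices are hypothetical, not the paper's code.

```python
import torch
import torch.nn as nn


class DifferentialAttention(nn.Module):
    """Hypothetical sketch of a differential attention module (DAM).

    Takes pre- and post-disaster feature maps from a Siamese encoder,
    computes their difference, and turns it into a spatial attention
    map that re-weights change-relevant regions. An optional building
    position mask (prior knowledge) further constrains the attention.
    All names and design details here are assumptions.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Squeeze the bitemporal difference into a single-channel map in [0, 1].
        self.attn = nn.Sequential(
            nn.Conv2d(channels, channels // 2, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 2, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, feat_pre, feat_post, building_mask=None):
        # Differential features highlight what changed between the two epochs.
        diff = torch.abs(feat_post - feat_pre)
        attn = self.attn(diff)  # shape (B, 1, H, W)
        if building_mask is not None:
            # Position constraint: focus attention on (and around) buildings.
            attn = attn * building_mask
        # Residual re-weighting of the post-disaster features.
        return feat_post * attn + feat_post


if __name__ == "__main__":
    dam = DifferentialAttention(channels=64)
    pre = torch.randn(2, 64, 128, 128)   # pre-disaster features
    post = torch.randn(2, 64, 128, 128)  # post-disaster features
    mask = torch.rand(2, 1, 128, 128)    # soft building-position prior
    out = dam(pre, post, mask)
    print(out.shape)  # torch.Size([2, 64, 128, 128])
```

In a cascaded framework such as the one the abstract outlines, the mask would come from a building-extraction branch run on the pre-disaster image, and the attended features would feed the damage-classification head.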
Article Sequence Number: 5622318
Date of Publication: 24 April 2024

