Abstract:
High-precision image matching techniques are required to fully utilize complementary information from optical and synthetic aperture radar (SAR) images. However, there are notable nonlinear radiometric differences (NRDs) between optical and SAR images because of the different imaging mechanisms of the two sensor types. Existing template matching methods based on the Siamese structure underutilize phase structure information, which is less susceptible to NRD, resulting in limited matching precision. To address this problem, this letter proposes an optical and SAR image-matching method based on phase structure convolutional features, which uses the log-Gabor filter (LGF) to extract the multiorientation phase structure information (MoPSI) of the image. It constructs a multiscale fusion SiamUNet-7 (MSF SiamUNet-7) network to extract the phase structure convolutional features, fully fusing the local texture information at a large scale and the global structure information at a small scale. The phase structure convolutional features of the optical and SAR images are used to generate the image pair similarity map through the mutual correlation layer, and the peak position in the similarity map is regarded as the best matching result. Experiments showed that, on the cropped Tiny-SEN1-2 dataset, the correct matching rate (CMR) at the threshold T ≤ 4 and the mean matching error (mME) of the proposed method were 92.24% and 1.348, respectively, which improved the CMR (T ≤ 4) by 4.51% and reduced the mME by 0.046 compared with the original SiamUNet-9 model. The proposed method can effectively overcome the large NRD between optical and SAR images and achieve high-precision matching.
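The final matching step described above — sliding a template's feature map over a search region's feature map, scoring each offset with a correlation layer, and taking the peak of the resulting similarity map as the match — can be illustrated with a minimal sketch. This is not the paper's MSF SiamUNet-7 pipeline; it assumes plain 2-D feature arrays and uses normalized cross-correlation in place of the learned mutual correlation layer, purely to show the similarity-map/peak-localization mechanics.

```python
import numpy as np

def similarity_map(search_feat: np.ndarray, template_feat: np.ndarray) -> np.ndarray:
    """Slide the template feature patch over the search feature map and
    score every offset with zero-mean normalized cross-correlation.

    Stands in for the mutual correlation layer in a Siamese matcher:
    the output is a similarity map whose peak marks the best alignment.
    """
    H, W = search_feat.shape
    h, w = template_feat.shape
    t = template_feat - template_feat.mean()
    t_norm = np.linalg.norm(t)
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = search_feat[i:i + h, j:j + w]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            out[i, j] = float((p * t).sum() / denom) if denom > 0 else 0.0
    return out

# Toy example: the template is cut from the search map at a known offset,
# so the similarity peak should recover that offset exactly.
rng = np.random.default_rng(0)
search = rng.normal(size=(32, 32))          # stand-in for SAR-branch features
template = search[10:18, 5:13].copy()       # stand-in for optical-branch features
sim = similarity_map(search, template)
peak = np.unravel_index(np.argmax(sim), sim.shape)  # best-match offset (row, col)
```

In the actual method, `search_feat` and `template_feat` would be the phase structure convolutional features produced by the two network branches, and the peak coordinates give the estimated translation between the optical and SAR patches.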
Published in: IEEE Geoscience and Remote Sensing Letters ( Volume: 20)