Abstract:
In synthetic aperture radar automatic target recognition (SAR-ATR), the limitations of the imaging environment and observation conditions make it challenging to acquire a substantial number of high-value target samples, resulting in a severe shortage of training data. This scarcity leads to poor performance and instability in few-shot SAR target recognition. To address these shortcomings, this article proposes meta-adaptive stochastic gradient descent (Mada-SGD), a novel inner-loop parameter update approach based on meta-adaptive hyperparameter learning. By exploiting the correlation between multiple update steps, Mada-SGD learns the weight distribution of the initialization parameters across previous and current update steps, akin to a memory mechanism. This approach enhances feature extraction and representation ability for few-shot SAR targets. In addition, an adaptive hyperparameter update strategy is introduced to simultaneously learn the initialization, weight factor, update factor, and update direction in the meta-learner. This effectively resolves parameter update issues in meta-learning models while improving fast adaptation to few-shot SAR targets. Experimental results on the specialized moving and stationary target acquisition and recognition few-shot learning (MSTAR-FSL) dataset demonstrate that Mada-SGD outperforms the latest few-shot SAR target recognition models, validating its advancement and superiority.
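The abstract describes an inner-loop update that mixes the meta-learned initialization with the current parameters using learned weight and update factors, with the update direction and scale also learned. The following minimal Python sketch illustrates that kind of update rule under stated assumptions: the function name, the scalar parameterization, and the exact combination formula are illustrative choices, not the paper's actual equations, and the per-step hyperparameters are passed in as fixed values rather than meta-learned in an outer loop.

```python
def mada_sgd_inner_loop(theta0, grad_fn, alphas, betas, gammas):
    """Illustrative inner-loop update in the style described by the abstract.

    At step k, the new parameters combine:
      - the initialization theta0 (a "memory" of the meta-learned start point,
        weighted by alpha_k),
      - the current parameters theta (weighted by beta_k),
      - a gradient step whose scale and sign (i.e., update direction)
        are given by gamma_k.

    All hyperparameter names (alphas, betas, gammas) are assumptions for
    illustration; in the described method they would be meta-learned.
    """
    theta = theta0
    for alpha, beta, gamma in zip(alphas, betas, gammas):
        grad = grad_fn(theta)
        # Weighted mix of initial and current parameters, plus a signed,
        # scaled gradient step.
        theta = alpha * theta0 + beta * theta - gamma * grad
    return theta


if __name__ == "__main__":
    # Toy quadratic task: loss = (theta - 1)^2, so grad = 2 * (theta - 1).
    grad_fn = lambda t: 2.0 * (t - 1.0)
    # With alpha = 0 and beta = 1 this reduces to plain SGD with step gamma.
    final = mada_sgd_inner_loop(0.0, grad_fn, [0.0] * 5, [1.0] * 5, [0.1] * 5)
    print(final)  # moves from 0.0 toward the optimum at 1.0
```

Setting all `alphas` to zero and all `betas` to one recovers ordinary SGD, which makes clear that the learned weight on `theta0` is the extra "memory" term the abstract refers to.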
Published in: IEEE Transactions on Geoscience and Remote Sensing ( Volume: 61)