
A Class-Incremental Approach With Self-Training and Prototype Augmentation for Specific Emitter Identification


Abstract:

Specific emitter identification (SEI) is a non-cryptographic authentication technique that provides an extra security layer for wireless devices and has promising applications. However, traditional SEI methods only work with a fixed, limited set of devices. In practical scenarios, new devices (i.e., new classes) appear continually. In this paper, an effective class-incremental learning (CIL) method is proposed for SEI, named class-incremental with self-training and prototype augmentation (CISP). The method adopts a teacher-student architecture. First, the teacher network, trained on the old-class data, guides the student network through knowledge distillation (KD) so that the student adapts to the new classes while retaining old-class knowledge. Second, to mitigate the bias toward the new classes, a weight aligning (WA) step balances the weights of the new-class and old-class classification layers in the student network. Finally, old-class samples are recalled from an unlabeled dataset by the student network and fed into the teacher network, from which feature prototypes of the old classes are constructed and augmented. This further eases the imbalance between the old and new classes and alleviates the problem of noisy pseudo-labels. Experiments on the real AIS-100 and ADS-B-100 datasets, with 20 initial classes and 20 classes per incremental step, show that the proposed method achieves average accuracies of 95.29% and 95.84%, respectively. The method effectively mitigates catastrophic forgetting and outperforms state-of-the-art incremental learning approaches that do not store old-class samples.
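
To make the three components of the abstract concrete, the following is a minimal PyTorch-style sketch of knowledge distillation, weight aligning, and prototype augmentation. It is an illustrative sketch under assumed shapes and hyper-parameters (temperature, noise radius, number of pseudo-features), not the authors' implementation.

```python
# Illustrative sketch of KD, weight aligning (WA), and prototype augmentation.
# All names, dimensions, and hyper-parameters are assumptions for clarity.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KD term: the student is pushed to reproduce the teacher's softened
    predictions on the old classes it was trained on."""
    old_dim = teacher_logits.size(1)  # teacher only covers the old classes
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_probs = F.log_softmax(student_logits[:, :old_dim] / temperature, dim=1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

def weight_align(classifier_weight, num_old):
    """WA step: rescale the new-class weight vectors so their average norm
    matches that of the old-class vectors, reducing the bias toward new classes."""
    old_norm = classifier_weight[:num_old].norm(dim=1).mean()
    new_norm = classifier_weight[num_old:].norm(dim=1).mean()
    gamma = old_norm / new_norm
    with torch.no_grad():
        classifier_weight[num_old:] *= gamma
    return classifier_weight

def augment_prototypes(prototypes, radius, n_samples=5):
    """Prototype augmentation: draw pseudo-features around each stored
    old-class prototype (Gaussian noise scaled by a per-class radius) to
    stand in for old-class samples that are no longer available."""
    noise = torch.randn(prototypes.size(0), n_samples, prototypes.size(1))
    return prototypes.unsqueeze(1) + radius.view(-1, 1, 1) * noise
```

In a training loop one would typically combine the distillation term with the cross-entropy loss on the new-class data and the augmented prototypes, then apply weight aligning after each incremental step; the exact weighting of these terms is not specified here.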
Page(s): 1714 - 1727
Date of Publication: 14 December 2023

