
Connection-Based Knowledge Transfer for Class Incremental Learning



Abstract:

We consider the problem of class incremental learning (CIL), where an agent aims to learn new classes continually without forgetting previous ones. As one of the mainstream paradigms of incremental learning, parameter isolation methods prevent forgetting by allocating different model parameters to each task, but knowledge transfer across tasks is difficult and usually overlooked. As a consequence, the discriminability between old and new classes is limited, especially when the training data of old classes is no longer accessible. In this paper, we propose a new data-free approach named Twin Contrastive Networks (TCN) for CIL that exploits the connections among tasks and network parameters. Specifically, we treat CIL as a sequence of one-class classification tasks and train a separate classifier to identify each class. To facilitate knowledge transfer and make full use of accumulated knowledge, a twin network structure is adopted to learn different feature representations for future use. When encountering new classes, the previous twin networks are utilized directly through a contrastive loss to improve the model's discriminability. TCN avoids catastrophic forgetting by fixing all learnt parameters and leverages the prior knowledge contained in earlier networks. Experiments on three widely used incremental learning benchmarks verify the effectiveness of TCN.
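The abstract gives the idea but no implementation details. As a rough illustration only, the PyTorch sketch below shows one plausible reading of the ingredients it names: a per-class twin network with two branches, a toy one-class objective, a contrastive term computed against earlier frozen networks, and parameter isolation by freezing each network once its class is learnt. All names here (TwinNetwork, contrastive_transfer_loss, the architecture sizes, and the specific losses) are assumptions for illustration, not the paper's actual method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinNetwork(nn.Module):
    """Hypothetical twin pair for one class: two branches intended to
    learn different feature representations (layout is an assumption)."""
    def __init__(self, in_dim=784, feat_dim=128):
        super().__init__()
        self.branch_a = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim))
        self.branch_b = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, feat_dim))

    def forward(self, x):
        return self.branch_a(x), self.branch_b(x)

    def freeze(self):
        # Parameter isolation: once learnt, this network is never updated again.
        for p in self.parameters():
            p.requires_grad = False

def contrastive_transfer_loss(new_feats, old_nets, x, margin=1.0):
    """Stand-in contrastive term: push the new class's features away from
    the features produced by the frozen networks of previous classes."""
    loss = x.new_zeros(())
    for net in old_nets:
        with torch.no_grad():
            old_a, _ = net(x)
        d = F.pairwise_distance(F.normalize(new_feats, dim=1), F.normalize(old_a, dim=1))
        loss = loss + F.relu(margin - d).mean()
    return loss / max(len(old_nets), 1)

# Data-free incremental loop: one one-class classifier per new class.
old_nets = []
for cls in range(3):  # toy stream of three classes with fake data
    net = TwinNetwork()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(5):
        x = torch.randn(32, 784)  # placeholder batch for the current class only
        feat_a, feat_b = net(x)
        # Toy one-class objective: pull both branches toward a shared centre.
        centre = feat_a.mean(0).detach()
        one_class = ((feat_a - centre) ** 2).mean() + ((feat_b - centre) ** 2).mean()
        loss = one_class + contrastive_transfer_loss(feat_a, old_nets, x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    net.freeze()
    old_nets.append(net)
```

Note how the sketch only ever sees data for the current class, and old classes contribute solely through their frozen networks' outputs, which is one way to read the abstract's "data-free" and "fixing all learnt parameters" claims.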
Date of Conference: 18-23 June 2023
Date Added to IEEE Xplore: 02 August 2023

Conference Location: Gold Coast, Australia
