
Mutually Promoted Hierarchical Learning for Incremental Implicitly-Refined Classification



Abstract:

Class incremental learning aims to learn a classification model from incrementally arriving training data. Existing methods tend to use a single-headed layout because no task identifier is available at test time. However, this layout is not suitable for Incremental Implicitly-Refined Classification (IIRC), an extension of class incremental learning in which classes can belong to two granularity levels, so each sample may carry both a coarse label and a fine label. Without modeling the hierarchical relations among classes, it is difficult to distinguish a subclass from its siblings while still assigning them to the same superclass and alleviating catastrophic forgetting. In this paper, we propose a new framework called Mutually Promoted Hierarchical Learning (MPHL) to solve IIRC. MPHL learns separate latent spaces for superclasses and subclasses to coordinate the representations of the two granularities, and it exploits the hierarchy so that each granularity facilitates the training of the other: during knowledge distillation, subclasses are treated as replay samples of their parent classes, and coarse labels are used to eliminate interference from non-sibling classes. To reduce the risk of predicting unwarranted fine-grained labels, we further propose a self-adaptive threshold strategy for detecting out-of-distribution samples. Extensive experiments on IIRC-CIFAR100 and IIRC-ImageNet show that MPHL achieves state-of-the-art results.
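Since the abstract describes the framework only at a high level, the following is a minimal PyTorch sketch, not the authors' released implementation, of the dual-granularity layout it mentions: a shared backbone with separate latent spaces and heads for superclasses and subclasses, a sibling mask that uses coarse labels to suppress non-sibling subclass logits, and a confidence threshold applied before emitting a fine-grained label. The network sizes, the toy hierarchy, and the fixed threshold value are illustrative assumptions; in particular, the paper's threshold is self-adaptive, while the one below is a fixed placeholder.

```python
# Minimal sketch (not the authors' code) of the dual-granularity layout
# described in the abstract: a shared backbone with separate latent spaces
# and heads for superclasses and subclasses, plus a sibling mask built from
# the hierarchy so coarse labels can suppress non-sibling subclass logits.
import torch
import torch.nn as nn

class DualGranularityNet(nn.Module):
    def __init__(self, feat_dim=512, num_super=10, num_sub=100):
        super().__init__()
        # Stand-in backbone; for IIRC-CIFAR100 / IIRC-ImageNet a ResNet-style
        # encoder would be typical. This toy encoder just flattens the image.
        self.backbone = nn.Sequential(nn.Flatten(), nn.LazyLinear(feat_dim), nn.ReLU())
        # Separate projections = separate latent spaces per granularity.
        self.super_proj = nn.Linear(feat_dim, feat_dim)
        self.sub_proj = nn.Linear(feat_dim, feat_dim)
        self.super_head = nn.Linear(feat_dim, num_super)
        self.sub_head = nn.Linear(feat_dim, num_sub)

    def forward(self, x):
        h = self.backbone(x)
        return self.super_head(self.super_proj(h)), self.sub_head(self.sub_proj(h))

def sibling_mask(super_labels, sub_to_super):
    """Boolean mask keeping only subclasses whose parent matches the coarse
    label (predicted or given), so non-sibling subclasses do not interfere."""
    parents = torch.as_tensor(sub_to_super)                   # (num_sub,)
    return parents.unsqueeze(0) == super_labels.unsqueeze(1)  # (B, num_sub)

def predict_with_threshold(sub_logits, threshold=0.5):
    """Emit a fine-grained label only when its confidence clears a threshold;
    -1 stands for 'no fine label'. The paper adapts this threshold."""
    probs = torch.sigmoid(sub_logits)
    conf, idx = probs.max(dim=1)
    return torch.where(conf > threshold, idx, torch.full_like(idx, -1))

# Usage sketch with a toy hierarchy: subclasses 0-9 -> super 0, 10-19 -> super 1, ...
model = DualGranularityNet(num_super=10, num_sub=100)
x = torch.randn(4, 3, 32, 32)
super_logits, sub_logits = model(x)
sub_to_super = [i // 10 for i in range(100)]
mask = sibling_mask(super_logits.argmax(1), sub_to_super)
masked_sub_logits = sub_logits.masked_fill(~mask, float('-inf'))
fine_pred = predict_with_threshold(masked_sub_logits)
```

In this sketch the coarse prediction gates the fine prediction, which mirrors the abstract's idea of using coarse labels to eliminate interference from non-siblings; the distillation-as-replay component and the self-adaptive threshold are not reproduced here.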
Date of Conference: 18-23 June 2023
Date Added to IEEE Xplore: 02 August 2023
Conference Location: Gold Coast, Australia
