Class Incremental Learning with Important and Diverse Memory

  • Conference paper
Image and Graphics (ICIG 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14358)


Abstract

Class incremental learning (CIL) has been attracting increasing attention in the computer vision and machine learning communities, where a well-known issue is catastrophic forgetting. A popular way to mitigate this issue is the replay-based strategy, which stores a small portion of past data and replays it when learning new tasks. However, selecting valuable samples from previous classes for replay remains an open problem in class incremental learning. In this paper, we propose a novel sample selection strategy that maintains effective samples from old classes to address catastrophic forgetting. Specifically, we employ the influence function to evaluate the impact of each sample on model performance, and then select important samples for replay. Since samples selected solely for importance can be redundant, we further develop a diversity strategy so that the chosen samples are not only important but also diverse. Extensive empirical validations on the CIFAR10 and CIFAR100 datasets demonstrate that our method outperforms the baselines, effectively alleviating catastrophic forgetting in class incremental learning.
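To make the selection idea concrete, below is a minimal, hypothetical Python/PyTorch sketch of importance-plus-diversity memory selection. It is not the paper's exact method: influence is approximated here by a first-order score (the inner product between a sample's gradient and the mean validation gradient, omitting the Hessian-inverse term of the full influence function), diversity is measured as distance in an arbitrary feature space, and all function names and the trade_off parameter are illustrative assumptions.

```python
# Hypothetical sketch: select replay samples that are both important
# (first-order influence proxy) and diverse (far apart in feature space).

import torch
import torch.nn.functional as F


def mean_val_grad(model, val_x, val_y):
    """Flattened gradient of the loss over a validation batch."""
    params = [p for p in model.parameters() if p.requires_grad]
    loss = F.cross_entropy(model(val_x), val_y)
    grads = torch.autograd.grad(loss, params)
    return torch.cat([g.flatten() for g in grads])


def importance_scores(model, xs, ys, val_grad):
    """First-order influence proxy per sample: <grad_i, val_grad>.
    (The full influence function would insert an inverse-Hessian term.)"""
    params = [p for p in model.parameters() if p.requires_grad]
    scores = []
    for x, y in zip(xs, ys):
        loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        flat = torch.cat([g.flatten() for g in grads])
        scores.append(torch.dot(flat, val_grad).item())
    return torch.tensor(scores)


def select_memory(features, scores, budget, trade_off=0.5):
    """Greedily pick `budget` samples, trading importance against diversity."""
    chosen = [int(scores.argmax())]
    while len(chosen) < budget:
        # Distance from each candidate to its nearest already-chosen sample;
        # far-away candidates add more diversity to the memory buffer.
        dists = torch.cdist(features, features[chosen]).min(dim=1).values
        gain = trade_off * scores + (1.0 - trade_off) * dists
        gain[chosen] = -float("inf")  # never re-select a chosen sample
        chosen.append(int(gain.argmax()))
    return chosen
```

Under these assumptions, one would compute `mean_val_grad` once per task, score the old-class pool with `importance_scores`, and pass penultimate-layer features to `select_memory`; the greedy trade-off between the importance score and nearest-neighbor distance is what keeps the buffer from filling with near-duplicates of a few highly influential samples.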



Acknowledgements

This work was supported by the NSFC under Grants 62122013 and U2001211. It was also supported by the Innovative Development Joint Fund Key Projects of Shandong NSF under Grant ZR2022LZH007.

Author information


Corresponding author

Correspondence to Changsheng Li.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Li, M., Yan, Z., Li, C. (2023). Class Incremental Learning with Important and Diverse Memory. In: Lu, H., et al. Image and Graphics. ICIG 2023. Lecture Notes in Computer Science, vol 14358. Springer, Cham. https://doi.org/10.1007/978-3-031-46314-3_13


  • DOI: https://doi.org/10.1007/978-3-031-46314-3_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46313-6

  • Online ISBN: 978-3-031-46314-3

  • eBook Packages: Computer Science, Computer Science (R0)
