
Knowledge Lock: Overcoming Catastrophic Forgetting in Federated Learning

  • Conference paper

Advances in Knowledge Discovery and Data Mining (PAKDD 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13280)

Abstract

Federated Learning (FL) aims to train machine learning models on decentralized data without direct data sharing. Nevertheless, the heterogeneity of data across FL participants significantly prevents federated models from achieving competitive performance. In this paper, we regard this issue as a consequence of knowledge forgetting: the local update process in FL may cause catastrophic forgetting of the knowledge learned from other participants. Motivated by recent advances in incremental learning techniques, we address this issue by overcoming the severe knowledge forgetting caused by data isolation. We propose a novel method called FedKL (Federated Learning with Knowledge Lock), in which knowledge distillation techniques are employed to maintain the previously learned knowledge. Our extensive experimental results demonstrate that FedKL achieves superior performance to prior methods, with over 3.4% and 3.5% accuracy improvements on CIFAR-10 and CIFAR-100 respectively, compared with the popular FL algorithm FedAvg. Furthermore, we also explore the benefit of introducing shared exemplars (a fraction of local data) to FedKL. In the experiments, we select and share 10 samples per class for FedKL and the baseline methods. As a result, FedKL obtains a 2.56% accuracy increase on CIFAR-10, in contrast to the marginal improvements of prior methods (less than 1.5%).
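As a rough illustration of the mechanism the abstract describes, the sketch below shows how a knowledge-distillation term can act as a "knowledge lock" during the local update: the client keeps a frozen copy of the received global model and penalizes divergence from its softened predictions while fitting its own data. This is a minimal sketch of the general idea only, not the authors' implementation; the function name fedkl_local_update and the hyper-parameters kd_weight and temperature are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def fedkl_local_update(model, global_model, loader, optimizer,
                           kd_weight=1.0, temperature=2.0):
        # Illustrative sketch (not the paper's code): the frozen global
        # model serves as a teacher whose soft predictions regularize the
        # local model, so knowledge learned from other clients is retained.
        global_model.eval()
        model.train()
        for x, y in loader:
            optimizer.zero_grad()
            logits = model(x)
            with torch.no_grad():
                teacher_logits = global_model(x)
            # Supervised loss on the client's own (possibly non-IID) data.
            ce = F.cross_entropy(logits, y)
            # Hinton-style distillation loss against the global model's
            # temperature-softened outputs (the knowledge-retention term).
            kd = F.kl_div(
                F.log_softmax(logits / temperature, dim=1),
                F.softmax(teacher_logits / temperature, dim=1),
                reduction="batchmean",
            ) * temperature ** 2
            (ce + kd_weight * kd).backward()
            optimizer.step()

After local training, the updated parameters would be sent back to the server for the usual weighted averaging; only the local objective changes.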

Supported by the Science and Technology Innovation 2030-Key Project under Grant 2021ZD0201404.

Notes

  1. \(\text {Sigmoid}(x)=1/(1+e^{-x})\).

  2. http://yann.lecun.com/exdb/mnist/.

  3. \(\varTheta \) denotes the parameters of the model and \(\varTheta ^t\) denotes the parameters at the t-th round.
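For context on note 3, the \(\varTheta ^t\) notation matches the standard FedAvg round (McMahan et al., reference 20): the server broadcasts \(\varTheta ^t\), each client \(k\) with \(n_k\) local samples computes an update, and the server averages the results weighted by data size. The following is a textbook restatement under that assumption, not a formula reproduced from this paper:

\[ \varTheta _k^{t+1} = \varTheta ^{t} - \eta \,\nabla _{\varTheta }\, \mathcal {L}_k(\varTheta ^{t}), \qquad \varTheta ^{t+1} = \sum _{k=1}^{K} \frac{n_k}{n}\, \varTheta _k^{t+1}, \quad n = \sum _{k=1}^{K} n_k . \]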

References

  1. Arivazhagan, M.G., Aggarwal, V., Singh, A.K., Choudhary, S.: Federated learning with personalization layers. arXiv preprint arXiv:1912.00818 (2019)

  2. Duan, M., et al.: Astraea: self-balancing federated learning for improving classification accuracy of mobile deep learning applications. In: 2019 IEEE 37th International Conference on Computer Design (ICCD), pp. 246–254. IEEE (2019)

  3. Fallah, A., Mokhtari, A., Ozdaglar, A.: Personalized federated learning with theoretical guarantees: a model-agnostic meta-learning approach. Adv. Neural Inf. Process. Syst. 33, 3557–3568 (2020)

  4. Ghosh, A., Chung, J., Yin, D., Ramchandran, K.: An efficient framework for clustered federated learning. Adv. Neural Inf. Process. Syst. 33, 19586–19597 (2020)

  5. Ghosh, A., Hong, J., Yin, D., Ramchandran, K.: Robust federated learning in a heterogeneous environment. arXiv preprint arXiv:1906.06629 (2019)

  6. Hanzely, F., Richtárik, P.: Federated learning of a mixture of global and local models. arXiv preprint arXiv:2002.05516 (2020)

  7. He, K., Zhang, X., Ren, S., Sun, J.: Deep residual learning for image recognition. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778 (2016)

  8. Hinton, G., Vinyals, O., Dean, J.: Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531 (2015)

  9. Huang, Y., et al.: Personalized cross-silo federated learning on Non-IID data (2021)

  10. Ji, S., Pan, S., Long, G., Li, X., Jiang, J., Huang, Z.: Learning private neural language modeling with attentive aggregation. In: 2019 International Joint Conference on Neural Networks (IJCNN), pp. 1–8. IEEE (2019)

  11. Karimireddy, S.P., Kale, S., Mohri, M., Reddi, S., Stich, S., Suresh, A.T.: SCAFFOLD: stochastic controlled averaging for federated learning. In: International Conference on Machine Learning, pp. 5132–5143. PMLR (2020)

  12. Kirkpatrick, J., et al.: Overcoming catastrophic forgetting in neural networks. Proc. Natl. Acad. Sci. 114(13), 3521–3526 (2017)

  13. Kopparapu, K., Lin, E.: FedFMC: sequential efficient federated learning on Non-IID data. arXiv preprint arXiv:2006.10937 (2020)

  14. Krizhevsky, A., Hinton, G., et al.: Learning multiple layers of features from tiny images (2009)

  15. Li, T., Sahu, A.K., Zaheer, M., Sanjabi, M., Talwalkar, A., Smith, V.: Federated optimization in heterogeneous networks. arXiv preprint arXiv:1812.06127 (2018)

  16. Li, Z., Hoiem, D.: Learning without forgetting. IEEE Trans. Pattern Anal. Mach. Intell. 40(12), 2935–2947 (2017)

  17. Liang, P.P., et al.: Think locally, act globally: federated learning with local and global representations. arXiv preprint arXiv:2001.01523 (2020)

  18. Lin, T., Kong, L., Stich, S.U., Jaggi, M.: Ensemble distillation for robust model fusion in federated learning. arXiv preprint arXiv:2006.07242 (2020)

  19. Liu, Y., Kang, Y., Xing, C., Chen, T., Yang, Q.: A secure federated transfer learning framework. IEEE Intell. Syst. 35(4), 70–82 (2020)

  20. McMahan, B., Moore, E., Ramage, D., Hampson, S., y Arcas, B.A.: Communication-efficient learning of deep networks from decentralized data. In: Artificial Intelligence and Statistics, pp. 1273–1282. PMLR (2017)

  21. Peng, X., Huang, Z., Zhu, Y., Saenko, K.: Federated adversarial domain adaptation. In: International Conference on Learning Representations (2019)

  22. Rebuffi, S.A., Kolesnikov, A., Sperl, G., Lampert, C.H.: iCaRL: incremental classifier and representation learning. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2001–2010 (2017)

  23. Reddi, S.J., et al.: Adaptive federated optimization. In: International Conference on Learning Representations (2020)

  24. Sattler, F., Müller, K.R., Samek, W.: Clustered federated learning: model-agnostic distributed multitask optimization under privacy constraints. IEEE Trans. Neural Netw. Learn. Syst. 32(8), 3710–3722 (2021)

  25. Shin, M., Hwang, C., Kim, J., Park, J., Bennis, M., Kim, S.L.: XOR mixup: privacy-preserving data augmentation for one-shot federated learning. arXiv preprint arXiv:2006.05148 (2020)

  26. Shoham, N., et al.: Overcoming forgetting in federated learning on Non-IID data. arXiv preprint arXiv:1910.07796 (2019)

  27. Simonyan, K., Zisserman, A.: Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556 (2014)

  28. Smith, V., Chiang, C.K., Sanjabi, M., Talwalkar, A.: Federated multi-task learning. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, pp. 4427–4437 (2017)

  29. Tuor, T., Wang, S., Ko, B.J., Liu, C., Leung, K.K.: Overcoming noisy and irrelevant data in federated learning. In: 2020 25th International Conference on Pattern Recognition (ICPR), pp. 5020–5027. IEEE (2021)

  30. Wang, S., et al.: Adaptive federated learning in resource constrained edge computing systems. IEEE J. Sel. Areas Commun. 37(6), 1205–1221 (2019)

  31. Yang, Q., Liu, Y., Cheng, Y., Kang, Y., Chen, T., Yu, H.: Federated learning. Synth. Lect. Artif. Intell. Mach. Learn. 13(3), 1–207 (2019)

  32. Yoon, J., Jeong, W., Lee, G., Yang, E., Hwang, S.J.: Federated continual learning with weighted inter-client transfer. In: International Conference on Machine Learning, pp. 12073–12086. PMLR (2021)

  33. Yoshida, N., Nishio, T., Morikura, M., Yamamoto, K., Yonetani, R.: Hybrid-FL: cooperative learning mechanism using Non-IID data in wireless networks. arXiv preprint arXiv:1905.07210 (2019)

  34. Zhao, Y., Li, M., Lai, L., Suda, N., Civin, D., Chandra, V.: Federated learning with Non-IID data. arXiv preprint arXiv:1806.00582 (2018)

  35. Zhu, H., Xu, J., Liu, S., Jin, Y.: Federated learning on Non-IID data: a survey. arXiv preprint arXiv:2106.06843 (2021)

Author information

Correspondence to Xiu Li.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Wei, G., Li, X. (2022). Knowledge Lock: Overcoming Catastrophic Forgetting in Federated Learning. In: Gama, J., Li, T., Yu, Y., Chen, E., Zheng, Y., Teng, F. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2022. Lecture Notes in Computer Science (LNAI), vol 13280. Springer, Cham. https://doi.org/10.1007/978-3-031-05933-9_47

  • DOI: https://doi.org/10.1007/978-3-031-05933-9_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05932-2

  • Online ISBN: 978-3-031-05933-9

  • eBook Packages: Computer Science, Computer Science (R0)
