FedSOKD-TFA: Federated Learning with Stage-Optimal Knowledge Distillation and Three-Factor Aggregation

  • Conference paper
  • In: Pattern Recognition (ICPR 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15302)

Abstract

Federated learning is a model training paradigm that protects user data and privacy, making it a practical solution for multi-user collaborative training. However, because data are heterogeneous across clients, the local models optimize in different directions, which degrades training quality and causes accuracy fluctuations during training. To address this problem, we introduce a stage-optimal strategy and propose a stage-optimal knowledge distillation method: each client retains its optimal local model and guides subsequent training through knowledge distillation, reducing the loss of previously learned knowledge. Additionally, we propose a new aggregation method that considers both static and dynamic factors. We evaluate the approach on the CIFAR-10 and CIFAR-100 datasets, where it significantly improves performance, achieving a maximum accuracy gain of 13.07% over the FedPer baseline and attaining state-of-the-art performance. The code is available at https://github.com/FedSOKD-TFA/FedSOKD-TFA.
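
As a concrete illustration of the two ideas sketched in the abstract, the snippet below shows a minimal PyTorch-style version of (a) local training distilled from a retained stage-optimal model and (b) server-side weighted aggregation. It is a sketch based only on the abstract, not the authors' implementation (which is in the linked repository); the function names, `kd_weight`, `temperature`, and the particular choice of aggregation weights are illustrative assumptions.

```python
# Minimal sketch, assuming a PyTorch setup. `kd_weight`, `temperature`, and the
# choice of aggregation weights are illustrative assumptions, not the paper's values.
import copy
import torch
import torch.nn.functional as F


def local_train_with_stage_optimal_kd(model, best_model, loader, optimizer,
                                       kd_weight=0.5, temperature=2.0):
    """One local round; the retained stage-optimal model acts as a distillation teacher."""
    model.train()
    for x, y in loader:
        optimizer.zero_grad()
        logits = model(x)
        loss = F.cross_entropy(logits, y)
        if best_model is not None:
            with torch.no_grad():
                teacher_logits = best_model(x)
            # Soft-label KL term keeps the student close to the best model seen so far.
            kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                          F.softmax(teacher_logits / temperature, dim=1),
                          reduction="batchmean") * temperature ** 2
            loss = (1.0 - kd_weight) * loss + kd_weight * kd
        loss.backward()
        optimizer.step()
    return model


def update_stage_optimal(best_model, best_acc, model, acc):
    """Retain a frozen copy of the best local model observed so far."""
    if acc > best_acc:
        return copy.deepcopy(model).eval(), acc
    return best_model, best_acc


def aggregate(client_states, weights):
    """Server-side weighted averaging; `weights` would combine the static and
    dynamic factors mentioned in the abstract (e.g. data size, recent accuracy)."""
    total = float(sum(weights))
    weights = [w / total for w in weights]
    return {k: sum(w * s[k].float() for w, s in zip(weights, client_states))
            for k in client_states[0]}
```

In this sketch the best local model seen so far is frozen and reused as a teacher in later rounds, and `aggregate` performs plain weighted parameter averaging; the specific static and dynamic factors used by the paper's three-factor aggregation are defined in the full text.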

References

  1. Xu, J., Glicksberg, B.S., Su, C., Walker, P., Bian, J., Wang, F.: Federated learning for healthcare informatics. J. Healthc. Inf. Res. 5, 1–19 (2021)

  2. Zhuang, W., et al.: Performance optimization of federated person re-identification via benchmark analysis. In: Proceedings of the 28th ACM International Conference on Multimedia, pp. 955–963 (2020)

  3. Li, Z., et al.: Data heterogeneity-robust federated learning via group client selection in industrial IoT. IEEE Internet Things J. 9(18), 17844–17857 (2022)

  4. Su, X., Zhou, Y., Cui, L., Liu, J.: On model transmission strategies in federated learning with lossy communications. IEEE Trans. Parallel Distrib. Syst. 34(4), 1173–1185 (2023)

  5. Zhao, Y., Li, M., Lai, L., Suda, N., Civin, D., Chandra, V.: Federated learning with non-IID data. arXiv preprint arXiv:1806.00582 (2018)

  6. Li, T., Sahu, A.K., Zaheer, M., Sanjabi, M., Talwalkar, A., Smith, V.: Federated optimization in heterogeneous networks. Proc. Mach. Learn. Syst. 2, 429–450 (2020)

  7. Karimireddy, S.P., Kale, S., Mohri, M., Reddi, S., Stich, S., Suresh, A.T.: Scaffold: stochastic controlled averaging for federated learning. In: International Conference on Machine Learning, pp. 5132–5143. PMLR (2020)

  8. Li, Q., He, B., Song, D.: Model-contrastive federated learning. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 10713–10722 (2021)

  9. Yoon, T., Shin, S., Hwang, S.J., Yang, E.: FedMix: approximation of mixup under mean augmented federated learning. arXiv preprint arXiv:2107.00233 (2021)

  10. Zhang, H., Hou, Q., Wu, T., Cheng, S., Liu, J.: Data augmentation based federated learning. IEEE Internet Things J., 1 (2023). https://doi.org/10.1109/JIOT.2023.3303889

  11. Wu, Y., et al.: FedCG: leverage conditional GAN for protecting privacy and maintaining competitive performance in federated learning. In: Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence (IJCAI 2022), pp. 2309–2315 (2022). https://doi.org/10.24963/ijcai.2022/321

  12. Zhang, L., Shen, L., Ding, L., Tao, D., Duan, L.Y.: Fine-tuning global model via data-free knowledge distillation for non-IID federated learning. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 10174–10183 (2022)

  13. Jiang, D., Shan, C., Zhang, Z.: Federated learning algorithm based on knowledge distillation. In: 2020 International Conference on Artificial Intelligence and Computer Engineering (ICAICE), pp. 163–167. IEEE (2020)

  14. Wang, H., Li, Y., Xu, W., Li, R., Zhan, Y., Zeng, Z.: DaFKD: domain-aware federated knowledge distillation. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 20412–20421 (2023)

  15. Han, S., et al.: FedX: unsupervised federated learning with cross knowledge distillation. In: Avidan, S., Brostow, G., Cissé, M., Farinella, G.M., Hassner, T. (eds.) ECCV 2022. LNCS, vol. 13690, pp. 691–707. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-20056-4_40

  16. McMahan, B., Moore, E., Ramage, D., Hampson, S., y Arcas, B.A.: Communication-efficient learning of deep networks from decentralized data. In: Artificial Intelligence and Statistics, pp. 1273–1282. PMLR (2017)

  17. Ye, R., Xu, M., Wang, J., Xu, C., Chen, S., Wang, Y.: FedDisco: federated learning with discrepancy-aware collaboration. In: Proceedings of the 40th International Conference on Machine Learning, ICML 2023. JMLR (2023)

  18. Tan, J., Zhou, Y., Liu, G., Wang, J.H., Yu, S.: pFedSim: similarity-aware model aggregation towards personalized federated learning. arXiv preprint arXiv:2305.15706 (2023)

  19. Ye, R., Ni, Z., Wu, F., Chen, S., Wang, Y.: Personalized federated learning with inferred collaboration graphs. In: International Conference on Machine Learning, pp. 39801–39817. PMLR (2023)

  20. Arivazhagan, M.G., Aggarwal, V., Singh, A.K., Choudhary, S.: Federated learning with personalization layers. arXiv preprint arXiv:1912.00818 (2019)

  21. Dinh, C.T., Tran, N., Nguyen, J.: Personalized federated learning with Moreau envelopes. Adv. Neural Inf. Process. Syst. 33, 21394–21405 (2020)

  22. Li, X., Jiang, M., Zhang, X., Kamp, M., Dou, Q.: FedBN: federated learning on non-IID features via local batch normalization. In: International Conference on Learning Representations (2021). https://openreview.net/forum?id=6YEQUn0QICG

  23. Li, Z., Lin, T., Shang, X., Wu, C.: Revisiting weighted aggregation in federated learning with neural networks. In: Proceedings of the 40th International Conference on Machine Learning, ICML 2023. JMLR (2023)

  24. Liu, Y., et al.: FedVision: an online visual object detection platform powered by federated learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 13172–13179 (2020)

  25. Liu, Q., Chen, C., Qin, J., Dou, Q., Heng, P.A.: FedDG: federated domain generalization on medical image segmentation via episodic learning in continuous frequency space. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 1013–1023 (2021)

  26. Lin, B.Y., et al.: FedNLP: benchmarking federated learning methods for natural language processing tasks. arXiv preprint arXiv:2104.08815 (2021)

Acknowledgements

Wenjuan Gong acknowledges support from the Natural Science Foundation of Shandong Province under Grant ZR2023MF041. Jordi Gonzàlez acknowledges support from the Spanish Ministry of Economy and Competitiveness (MINECO) and the European Regional Development Fund (ERDF) under Project No. PID2020-120611RBI00/AEI/10.13039/501100011033.

Author information

Corresponding author

Correspondence to Wenjuan Gong.

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Liu, J., Gong, W., Shi, T., Li, K., Jin, Y., Gonzàlez, J. (2025). FedSOKD-TFA: Federated Learning with Stage-Optimal Knowledge Distillation and Three-Factor Aggregation. In: Antonacopoulos, A., Chaudhuri, S., Chellappa, R., Liu, CL., Bhattacharya, S., Pal, U. (eds) Pattern Recognition. ICPR 2024. Lecture Notes in Computer Science, vol 15302. Springer, Cham. https://doi.org/10.1007/978-3-031-78166-7_2

  • DOI: https://doi.org/10.1007/978-3-031-78166-7_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-78165-0

  • Online ISBN: 978-3-031-78166-7

  • eBook Packages: Computer Science, Computer Science (R0)
