
Clustering-based Curriculum Construction for Sample-Balanced Federated Learning

  • Conference paper, Artificial Intelligence (CICAI 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13606)

Abstract

Federated learning is a distributed machine learning scheme that provides a privacy-preserving solution for training on decentralized data. A key challenge is the heterogeneity of data distributions across parties. Existing methods focus only on the training rules of local models rather than on the data itself. In this paper, we reveal the fact that improving the performance of local models can bring performance gains to the global model. Motivated by this finding, this paper proposes a clustering-based curriculum construction method to rank the complexity of instances, and develops a federated curriculum learning algorithm (FedAC). Specifically, FedAC assigns different weights to training samples of different complexity, which makes it possible to take full advantage of the valuable knowledge in noisy, uneven-quality data. Experiments on two datasets, covering performance comparison, ablation studies, and case studies, verify that FedAC improves the performance of state-of-the-art federated learning methods.
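As a rough illustration of the core idea described above (clustering a party's local data and weighting each training sample by an estimated complexity), the sketch below scores samples by their distance to the nearest k-means centroid and converts the score to a curriculum weight. The distance-to-centroid complexity proxy, the exponential weighting, and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means; returns cluster centers and per-sample labels."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def curriculum_weights(X, k=2, temperature=1.0):
    """Rank sample complexity by distance to the assigned cluster centroid
    and map it to a weight in (0, 1]: samples close to a centroid ("easy")
    get weights near 1, distant ("hard") samples get smaller weights."""
    centers, labels = kmeans(X, k)
    dist = np.linalg.norm(X - centers[labels], axis=1)
    dist = (dist - dist.min()) / (np.ptp(dist) + 1e-12)  # scale to [0, 1]
    return np.exp(-dist / temperature)

# Toy local dataset: two tight clusters plus a few scattered outliers.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),
               rng.normal(5.0, 0.1, size=(20, 2)),
               rng.normal(2.5, 2.0, size=(5, 2))])
w = curriculum_weights(X, k=2)
print(w.shape)  # one curriculum weight per sample
```

In a local training loop, such weights could scale each sample's loss so that clean, prototypical samples dominate early rounds while noisy or atypical ones contribute less.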



Acknowledgments

This work is supported in part by the Excellent Youth Scholars Program of Shandong Province (Grant no. 2022HWYQ-048) and the Oversea Innovation Team Project of the “20 Regulations for New Universities" funding program of Jinan (Grant no. 2021GXRC073).

Author information

Correspondence to Lei Meng.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Qi, Z., Wang, Y., Chen, Z., Wang, R., Meng, X., Meng, L. (2022). Clustering-based Curriculum Construction for Sample-Balanced Federated Learning. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science, vol. 13606. Springer, Cham. https://doi.org/10.1007/978-3-031-20503-3_13

  • DOI: https://doi.org/10.1007/978-3-031-20503-3_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20502-6

  • Online ISBN: 978-3-031-20503-3

  • eBook Packages: Computer Science; Computer Science (R0)
