Abstract
Federated learning is a distributed machine learning paradigm that preserves data privacy. A key challenge in federated learning is the heterogeneity of data distributions across parties. Existing methods focus only on the training rules of local models rather than on the data itself. In this paper, we reveal that improving the performance of local models can bring performance gains to the global model. Motivated by this finding, we propose a clustering-based curriculum construction method that ranks training instances by complexity, and develop a federated curriculum learning algorithm (FedAC). Specifically, FedAC assigns different weights to training samples of different complexity, enabling it to take full advantage of the valuable knowledge in noisy, uneven-quality data. Experiments on two datasets, covering performance comparisons, ablation studies, and case studies, verify that FedAC improves on the performance of state-of-the-art federated learning methods.
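The core idea above, ranking sample complexity via clustering and then weighting samples accordingly during local training, can be sketched as follows. This is a minimal illustration, not the paper's actual FedAC implementation: it assumes complexity can be proxied by each sample's distance to its nearest k-means centroid, and the function names (`complexity_scores`, `curriculum_weights`) and the exponential weighting scheme are hypothetical choices for this sketch.

```python
import numpy as np

def complexity_scores(X, n_clusters=3, n_iter=20, seed=0):
    """Proxy for clustering-based complexity ranking: run a small k-means
    and score each sample by its distance to the nearest centroid.
    Samples far from every cluster center are treated as hard/noisy."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)].copy()
    for _ in range(n_iter):
        # distances: shape (n_samples, n_clusters)
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        for k in range(n_clusters):
            if (assign == k).any():
                centers[k] = X[assign == k].mean(axis=0)
    return np.linalg.norm(X - centers[assign], axis=1)

def curriculum_weights(scores, temperature=1.0):
    """Assign larger weights to low-complexity (easy) samples, so that
    clean samples dominate the weighted local loss. Weights sum to 1."""
    w = np.exp(-scores / temperature)
    return w / w.sum()

# In a FedAvg-style round, each client would scale its per-sample losses
# by these weights before computing the local update.
```

The exponential weighting is one of several plausible schedules; a staged curriculum that gradually admits harder samples over communication rounds would serve the same purpose.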
Acknowledgments
This work is supported in part by the Excellent Youth Scholars Program of Shandong Province (Grant no. 2022HWYQ-048) and the Oversea Innovation Team Project of the "20 Regulations for New Universities" funding program of Jinan (Grant no. 2021GXRC073).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Qi, Z., Wang, Y., Chen, Z., Wang, R., Meng, X., Meng, L. (2022). Clustering-based Curriculum Construction for Sample-Balanced Federated Learning. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds.) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science, vol. 13606. Springer, Cham. https://doi.org/10.1007/978-3-031-20503-3_13
DOI: https://doi.org/10.1007/978-3-031-20503-3_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20502-6
Online ISBN: 978-3-031-20503-3