Abstract
Federated Learning (FL) is a privacy-preserving framework for performing machine learning tasks on distributed data. A key challenge is the heterogeneous data distribution among clients, which causes client drift and yields an oscillatory, low-accuracy global model. Although much work has been proposed to mitigate client drift, we find drawbacks in the two common approaches, feature alignment and classifier tuning. For the former, a large bias in the classifiers persists in the local models and degrades global model performance. For the latter, it is hard to obtain global features suitable for introducing external knowledge into local models. To address these drawbacks, we propose a privacy-preserving and effective method, named FCA, that tackles client drift in Non-IID federated learning by aligning the models' components. Specifically, FCA increases the similarity among the local models' components, i.e., feature extractors and classifiers, by utilizing estimated global feature representations. Experimental results demonstrate that FCA achieves better performance with fewer communication rounds: compared with the vanilla baseline, it improves performance by 0.4% to 7.5% on three popular datasets under four different Non-IID scenarios.
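To make the idea of component alignment concrete, the following is a minimal, illustrative sketch of a local client update that combines the standard task loss with two alignment terms, one pulling local features toward server-estimated global class representations and one tuning the local classifier on those representations. The network shapes, loss weights, and the use of per-class prototypes are assumptions for illustration only, not the paper's exact formulation.

```python
# Hypothetical sketch of a component-aligned local update (PyTorch).
# Shapes, weights, and prototype usage are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalModel(nn.Module):
    """Toy model split into a feature extractor and a classifier head."""
    def __init__(self, in_dim=32, feat_dim=16, num_classes=10):
        super().__init__()
        self.extractor = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        z = self.extractor(x)
        return z, self.classifier(z)

def local_step(model, x, y, global_protos, optimizer, lam_feat=1.0, lam_cls=1.0):
    """One local update: task loss plus feature- and classifier-alignment terms.

    global_protos: (num_classes, feat_dim) estimated global feature
    representations broadcast by the server (assumed given here).
    """
    z, logits = model(x)
    # 1) Standard supervised loss on the client's local data.
    loss_task = F.cross_entropy(logits, y)
    # 2) Feature alignment: pull local features toward the global
    #    representation of their ground-truth class.
    loss_feat = F.mse_loss(z, global_protos[y])
    # 3) Classifier alignment: the local head should classify the global
    #    representations of all classes correctly, reducing classifier bias.
    all_labels = torch.arange(global_protos.size(0))
    loss_cls = F.cross_entropy(model.classifier(global_protos), all_labels)
    loss = loss_task + lam_feat * loss_feat + lam_cls * loss_cls
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random tensors standing in for one client's batch.
model = LocalModel()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
protos = torch.randn(10, 16)  # server-estimated global class representations
local_step(model, x, y, protos, opt)
```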
Acknowledgments
This work is supported by the National Natural Science Foundation of China (No. 62206238), the Natural Science Foundation of Jiangsu Province (No. BK20220562), the Natural Science Research Project of Universities in Jiangsu Province (No. 22KJB520010), and the China Postdoctoral Science Foundation (No. 2023M732985).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Xue, B., Zhang, J., Chen, B., Li, W. (2024). Tackling Non-IID for Federated Learning with Components Alignment. In: Kim, D.D., Chen, C. (eds) Machine Learning for Cyber Security. ML4CS 2023. Lecture Notes in Computer Science, vol 14541. Springer, Singapore. https://doi.org/10.1007/978-981-97-2458-1_9
DOI: https://doi.org/10.1007/978-981-97-2458-1_9
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-2457-4
Online ISBN: 978-981-97-2458-1