
Towards Heterogeneous Federated Learning: Analysis, Solutions, and Future Directions

  • Conference paper
Artificial Intelligence Security and Privacy (AIS&P 2023)

Abstract

With the rapid growth of edge devices such as smartphones, wearables, and mobile networks, effectively utilizing the large amounts of private data stored on these devices has become a challenging problem. Federated learning has emerged as a promising solution: it allows multiple devices to train machine learning models collaboratively while keeping the data decentralized and compliant with local privacy policies. However, heterogeneity in data distributions, model structures, network environments, and devices makes this collaboration difficult to realize. In this paper, we review heterogeneous federated learning (HFL) approaches and classify them along four dimensions: data heterogeneity, device heterogeneity, communication heterogeneity, and model heterogeneity. We summarize the advantages and disadvantages of each approach and discuss solutions to their limitations in detail. We also introduce commonly used methods for evaluating the performance of federated learning and suggest future directions for HFL frameworks.
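The collaborative training described above is typically built on FedAvg-style aggregation (McMahan et al., 2017), the baseline that HFL methods extend: each client trains on its private data locally, and the server averages the resulting weights in proportion to local dataset sizes, so raw data never leaves a device. A minimal sketch of that round structure, using an illustrative linear-regression task and hypothetical helper names (`local_update`, `fedavg`) not taken from the paper:

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One client's local training: least-squares SGD on a linear model."""
    w = weights.copy()
    for _ in range(epochs):
        preds = data @ w
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def fedavg(global_w, client_datasets, lr=0.1):
    """Server round: clients train locally; updates are averaged,
    weighted by local dataset size. Only weights are exchanged."""
    updates, sizes = [], []
    for X, y in client_datasets:
        updates.append(local_update(global_w, X, y, lr))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Two clients holding non-IID slices of the same linear task y = 2x
rng = np.random.default_rng(0)
X1 = rng.normal(1.0, 0.1, (20, 1))
X2 = rng.normal(-1.0, 0.1, (20, 1))
clients = [(X1, 2 * X1[:, 0]), (X2, 2 * X2[:, 0])]
w = np.zeros(1)
for _ in range(50):
    w = fedavg(w, clients)
print(round(float(w[0]), 2))  # converges toward 2.0
```

The non-IID client split (inputs centered at +1 vs. -1) is exactly the kind of data heterogeneity the survey's first category addresses; with skewed or conflicting local objectives, this plain average can drift, which motivates the corrective methods the paper reviews.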

This work is supported in part by the National Natural Science Foundation of China under Grant 62372125, in part by the Guangdong Natural Science Funds for Distinguished Young Scholar under Grant 2023B1515020041.




Author information


Correspondence to Zhili Zhou.


Appendix

Table 1. Heterogeneous data methods
Table 2. Heterogeneous model methods
Table 3. Heterogeneous communication methods
Table 4. Heterogeneous device methods

The four tables above summarize the solutions to data heterogeneity, model heterogeneity, communication heterogeneity, and device heterogeneity in federated learning, and analyze the main contributions and limitations of each approach. We hope these discussions contribute to the high-quality development of the heterogeneous federated learning community.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Lin, Y., Long, Y., Zhou, Z., Pang, Y., Yang, C. (2024). Towards Heterogeneous Federated Learning: Analysis, Solutions, and Future Directions. In: Vaidya, J., Gabbouj, M., Li, J. (eds) Artificial Intelligence Security and Privacy. AIS&P 2023. Lecture Notes in Computer Science, vol 14509. Springer, Singapore. https://doi.org/10.1007/978-981-99-9785-5_13


  • DOI: https://doi.org/10.1007/978-981-99-9785-5_13

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-9784-8

  • Online ISBN: 978-981-99-9785-5

  • eBook Packages: Computer Science, Computer Science (R0)
