
HFSL: heterogeneity split federated learning based on client computing capabilities

Published in The Journal of Supercomputing.

Abstract

With the rapid growth of the Internet of Things (IoT) and smart devices, edge computing has emerged as a critical technology for processing massive amounts of data while protecting user privacy. Split federated learning, an emerging distributed learning framework, enables model training without data leaving local devices, effectively preventing data leakage and misuse. However, the disparity in the computational capabilities of edge devices forces the model to be partitioned according to the least capable client, so a significant portion of the computational load is offloaded to the more capable server-side infrastructure, incurring substantial training overhead. This work proposes a novel split federated learning method for heterogeneous endpoints to address these challenges. The method handles heterogeneous training across clients by adding auxiliary layers, improves the accuracy of heterogeneous split-model training through self-distillation, and leverages the global model from the previous round to mitigate accuracy degradation during federated aggregation. On the CIFAR-10 dataset, our HFSL2 method improves accuracy over the existing SL, SFLV1, and SFLV2 methods by 3.81%, 13.94%, and 6.19%, respectively. Further experiments on the HAM10000, FashionMNIST, and MNIST datasets show that the algorithm effectively enhances aggregation accuracy across clients with heterogeneous computing capabilities.
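The two mechanisms named in the abstract can be sketched in code. The following is an illustrative sketch only, not the authors' implementation: a self-distillation loss in which a client's auxiliary classifier learns from both the hard labels and the deeper server-side model's soft targets, and an aggregation step that blends the new federated average with the previous round's global model. All names and hyperparameters (`alpha`, `tau`, `beta`, the loss weighting) are hypothetical choices for the sketch.

```python
# Hedged sketch of the ideas described in the abstract (hypothetical names/values).
import numpy as np

def softmax(z, tau=1.0):
    """Temperature-scaled softmax; tau > 1 softens the distribution."""
    z = np.asarray(z, dtype=float) / tau
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distill_loss(aux_logits, server_logits, label, alpha=0.5, tau=2.0):
    """Cross-entropy on the hard label plus KL divergence to the deeper
    (server-side) model's soft targets, as in standard self-distillation."""
    p_aux = softmax(aux_logits)
    ce = -np.log(p_aux[label] + 1e-12)                        # hard-label term
    q = softmax(server_logits, tau)                           # teacher soft targets
    p = softmax(aux_logits, tau)                              # student distribution
    kl = np.sum(q * (np.log(q + 1e-12) - np.log(p + 1e-12)))  # distillation term
    return (1 - alpha) * ce + alpha * (tau ** 2) * kl

def aggregate(client_weights, sizes, prev_global, beta=0.1):
    """Sample-size-weighted average of client models (FedAvg-style), then a
    convex blend with the previous round's global model to stabilize accuracy."""
    sizes = np.asarray(sizes, dtype=float)
    avg = sum(w * (n / sizes.sum()) for w, n in zip(client_weights, sizes))
    return (1 - beta) * avg + beta * prev_global
```

With `beta = 0` the aggregation reduces to plain FedAvg; a small positive `beta` anchors each round to the previous global model, which is one plausible reading of how the prior round mitigates aggregation degradation under heterogeneous clients.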


Data Availability

No datasets were generated or analysed during the current study.


Acknowledgments

This work was supported in part by the National Natural Science Foundation of China under Grant 62202156, Grant 62072170, Grant 62473146, Grant 62072056 and Grant 62472168, in part by the Key Project of Hunan Provincial Natural Science Foundation under Grant 2024JJ3017, Grant 2024AQ2028, and Grant 2023GK2001, in part by the Hunan Provincial Natural Science Foundation of China under Grant 2024JJ6220, and in part by the Research Foundation of Education Bureau of Hunan Province, China under Grant 23B0487.

Author information

Authors and Affiliations

Authors

Contributions

A, B, C, D, E, F, and G conceived and planned the experiments. A, B, C, and D carried out the experiments. A, E, and F planned and carried out the simulations. E, F, and G contributed to sample preparation. A, B, and C contributed to the interpretation of the results. A and B took the lead in writing the manuscript. All authors provided critical feedback and helped shape the research, analysis, and manuscript.

Corresponding authors

Correspondence to Yuxiang Chen or Kuan-Ching Li.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wu, N., Zhao, W., Chen, Y. et al. HFSL: heterogeneity split federated learning based on client computing capabilities. J Supercomput 81, 196 (2025). https://doi.org/10.1007/s11227-024-06632-6
