Abstract
The aim of heterogeneous federated learning (HFL) is to address data heterogeneity, computational resource disparity, and model generalizability and security in federated learning (FL). To facilitate collaborative training across devices and enhance the predictive performance of models, a heterogeneous federated learning algorithm based on contribution-weighted aggregation (HFedCWA) is proposed in this paper. First, weights are assigned on the basis of the distribution differences and quantities of heterogeneous device data, and a contribution-based weighted aggregation method is introduced to dynamically adjust these weights and balance data heterogeneity. Second, personalized strategies based on regularization are formulated for heterogeneous devices with different weights, enabling each device to participate in the overall task in an optimal manner. Differential privacy is also applied during FL training to further enhance the security of the system. Finally, experiments are conducted under various data heterogeneity scenarios on the MNIST and CIFAR-10 datasets, and the results demonstrate that HFedCWA effectively improves the model's generalization ability and adaptability to heterogeneous data, thereby enhancing the overall efficiency and performance of the HFL system.
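As a concrete illustration of the aggregation step described in the abstract, the following is a minimal Python sketch. It assumes that each client's contribution is scored from its sample count and the divergence between its label distribution and a global reference distribution, and that differential privacy is approximated by Gaussian noise added to the aggregated parameters; the function names, the KL-based score, and the noise scale are illustrative assumptions, not the exact HFedCWA formulation.

```python
# Minimal sketch of contribution-weighted aggregation (illustrative only).
# The contribution score, KL-divergence measure, and Gaussian noise below are
# assumptions for exposition, not the exact formulation of HFedCWA.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete label distributions."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def contribution_weights(label_dists, sample_counts, global_dist, alpha=1.0):
    """Weight each client by data quantity and closeness to the global label distribution."""
    counts = np.asarray(sample_counts, dtype=float)
    divs = np.array([kl_divergence(d, global_dist) for d in label_dists])
    scores = counts / counts.sum() * np.exp(-alpha * divs)  # penalize more heterogeneous clients
    return scores / scores.sum()

def aggregate(client_params, weights, dp_sigma=0.0):
    """Weighted average of client parameter vectors, with optional Gaussian noise for DP."""
    stacked = np.stack(client_params)                      # shape: (num_clients, num_params)
    global_params = np.average(stacked, axis=0, weights=weights)
    if dp_sigma > 0:
        global_params += np.random.normal(0.0, dp_sigma, size=global_params.shape)
    return global_params

# Toy example: three clients with different label distributions and data sizes.
dists = [np.array([0.5, 0.5]), np.array([0.9, 0.1]), np.array([0.4, 0.6])]
counts = [1000, 200, 500]
global_dist = np.array([0.55, 0.45])
w = contribution_weights(dists, counts, global_dist)
params = [np.random.randn(4) for _ in dists]
print("weights:", np.round(w, 3))
print("aggregated:", np.round(aggregate(params, w, dp_sigma=0.01), 3))
```

In the full algorithm, each device would additionally optimize a regularized local objective tied to its assigned weight (the personalization step), which this sketch omits.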
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Acknowledgements
We would like to express our gratitude to everyone who provided support and guidance during the research and writing of this paper.
Funding
This work was supported by the National Natural Science Foundation of China (No. 61971347), the Scientific Research Program of Shaanxi Province (No. 2022SF-353), the Project of the Xi’an Science and Technology Planning Foundation (No. 24ZDCYISGG0020), and the Natural Science Project of Shaanxi Provincial Department of Education (No. 23JK0562).
Author information
Contributions
Jiawei Du is responsible for the methodology, software, visualization, and writing of the original draft; Huaijun Wang is responsible for project administration, data curation, and supervision; Junhuai Li is responsible for funding acquisition, resources, and manuscript review and editing; Kan Wang is responsible for the conceptualization and validation of the experiments; and Rong Fei is responsible for the formal analysis and investigation.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Ethics approval and consent to participate
Not applicable
Consent for publication
Not applicable
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Du, J., Wang, H., Li, J. et al. HFedCWA: heterogeneous federated learning algorithm based on contribution-weighted aggregation. Appl Intell 55, 186 (2025). https://doi.org/10.1007/s10489-024-06123-4
DOI: https://doi.org/10.1007/s10489-024-06123-4