Abstract
Federated learning is an emerging technology that can effectively safeguard personal information. In contrast to traditional centralized learning, federated learning trains a global model without requiring clients to share their raw data. However, updating the global model consumes substantial client communication resources, which hinders the wide application of this technology. To reduce communication overhead without seriously degrading the accuracy of the global model, we optimize the structure of the global model under the federated learning framework using the multi-objective evolutionary algorithm based on decomposition (MOEA/D). The structure of the global model is represented with a highly scalable encoding scheme, which improves the efficiency of evolving the neural network. For comparison, we apply the non-dominated sorting genetic algorithm II (NSGA-II) to the same problem under the same conditions and evaluate the effectiveness of both algorithms according to the Pareto solutions they obtain. We show that MOEA/D converges better when multilayer neural networks and convolutional neural networks are used as the global model. Overall, MOEA/D can further strengthen the structure optimization of the federated learning model and reduce communication costs.
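The paper's implementation is not reproduced here, but a minimal sketch can illustrate the idea the abstract describes: the structure of the global model is encoded as a bit string indicating which connections are kept, and MOEA/D with a Tchebycheff decomposition searches for a Pareto set that trades off global-model error against per-round communication cost. The connectivity encoding, the surrogate error objective, and all identifiers below are illustrative assumptions; in the actual method the error objective would be obtained by federated training and evaluation of the decoded model.

```python
# Minimal sketch (not the paper's code): a bit-string encodes which connections
# of the global model are kept; the two objectives are a toy surrogate for
# global-model error and the communication cost (proportional to the number of
# parameters uploaded per round). MOEA/D with a Tchebycheff decomposition
# searches for the trade-off front. All names and constants are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N_CONN = 64          # length of the connectivity bit-string (assumed)
N_SUB = 20           # number of scalar sub-problems in MOEA/D
T_NEIGH = 5          # neighbourhood size
GENERATIONS = 50

def objectives(mask):
    """Return (error surrogate, communication cost) for a connectivity mask."""
    kept = mask.sum()
    comm_cost = kept / N_CONN                         # fewer connections -> cheaper uploads
    error = 1.0 / (1.0 + kept) + 0.01 * rng.random()  # toy stand-in for test error
    return np.array([error, comm_cost])

# Evenly spread weight vectors and their neighbourhoods (standard MOEA/D setup).
weights = np.stack([np.linspace(0, 1, N_SUB), 1 - np.linspace(0, 1, N_SUB)], axis=1)
dists = np.linalg.norm(weights[:, None] - weights[None, :], axis=2)
neighbours = np.argsort(dists, axis=1)[:, :T_NEIGH]

pop = rng.integers(0, 2, size=(N_SUB, N_CONN))
F = np.array([objectives(ind) for ind in pop])
z = F.min(axis=0)                                     # reference (ideal) point

def tchebycheff(f, w):
    return np.max(w * np.abs(f - z))

for _ in range(GENERATIONS):
    for i in range(N_SUB):
        # Variation: uniform crossover of two neighbours plus bit-flip mutation.
        a, b = rng.choice(neighbours[i], 2, replace=False)
        child = np.where(rng.random(N_CONN) < 0.5, pop[a], pop[b])
        flip = rng.random(N_CONN) < 1.0 / N_CONN
        child = np.where(flip, 1 - child, child)
        f_child = objectives(child)
        z = np.minimum(z, f_child)
        # Replace neighbouring solutions that the child improves on.
        for j in neighbours[i]:
            if tchebycheff(f_child, weights[j]) <= tchebycheff(F[j], weights[j]):
                pop[j], F[j] = child, f_child

print("Final population objectives (error surrogate, communication cost):")
print(np.unique(np.round(F, 3), axis=0))
```

In the paper's setting, each candidate structure would be decoded into the global model, trained for a few federated rounds, and scored by test error and the number of parameters exchanged with clients; the toy surrogate above only stands in for that expensive evaluation.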
Acknowledgements
This work was supported by the National Natural Science Foundation of China (Grant nos. 61972456, 62172298), the Natural Science Foundation of Tianjin (No. 20JCYBJC00140), and the Key Laboratory of Universal Wireless Communications (BUPT), Ministry of Education, P. R. China (KFKT-2020101).
Cite this article
Chai, Zy., Yang, Cd. & Li, Yl. Communication efficiency optimization in federated learning based on multi-objective evolutionary algorithm. Evol. Intel. 16, 1033–1044 (2023). https://doi.org/10.1007/s12065-022-00718-x