ABSTRACT
Federated learning (FL) trains models in a distributed fashion: devices compute local models (e.g., convolutional neural networks) that are then aggregated centrally at the edge or cloud. Such distributed training demands significant computational resources (i.e., CPU cycles/sec) that Internet of Things (IoT) sensors can rarely provide. To address this challenge, split FL (SFL) was recently proposed, in which part of the model is computed at the devices and the remainder at edge/cloud servers. Although SFL relaxes the devices' computing constraints, it still suffers from fairness issues and slow convergence. To enable FL without these drawbacks, we propose a novel hierarchical SFL (HSFL) architecture that combines SFL with a hierarchical fashion of learning. To avoid a single point of failure and fairness issues, HSFL is truly distributed in nature (i.e., it uses distributed aggregations). We also define a cost function that is minimized with respect to relative local accuracy, transmit power, resource allocation, and device association. Because the resulting problem is non-convex, we propose a solution based on block successive upper bound minimization (BSUM). Finally, numerical results are presented.
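The two ingredients of HSFL described above can be illustrated with a minimal sketch: each device trains only its device-side sub-model, and aggregation happens hierarchically (first at edge servers, then across edges) rather than at a single central server. All names, groupings, and parameter values below are illustrative assumptions, not the paper's notation or algorithm.

```python
# Hedged sketch of hierarchical aggregation in HSFL.
# Devices hold only the device-side sub-model (e.g., the first CNN layers);
# the server-side part is trained at edge/cloud servers and is omitted here.

def fedavg(models):
    """Average a list of parameter vectors (FedAvg-style aggregation)."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

# Hypothetical device-side parameter vectors, grouped under two edge servers.
edge_groups = {
    "edge_1": [[1.0, 2.0], [3.0, 4.0]],   # devices associated with edge server 1
    "edge_2": [[5.0, 6.0], [7.0, 8.0]],   # devices associated with edge server 2
}

# Stage 1: each edge server aggregates only its own devices' sub-models,
# so no single central aggregator is a point of failure.
edge_models = {e: fedavg(ms) for e, ms in edge_groups.items()}

# Stage 2: a higher-level aggregation combines the edge-level models.
global_model = fedavg(list(edge_models.values()))

print(edge_models)   # {'edge_1': [2.0, 3.0], 'edge_2': [6.0, 7.0]}
print(global_model)  # [4.0, 5.0]
```

The two-stage averaging is what distinguishes the hierarchical scheme from vanilla SFL, where every device-side update would travel to one central aggregator.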