Abstract:
Split federated learning (SFL) allows clients with limited resources to engage in distributed machine learning, yet it grapples with issues of energy usage and training efficiency. We propose low-carbon hierarchical multiple SFL (HMSFL) to address these issues and advance sustainable computing. HMSFL partitions the client model into several segments, enabling local aggregation among clients. This process improves energy efficiency and reduces the carbon footprint. We formulate the training cost minimization problem and solve it using a generalized task allocation algorithm. Evaluation across real-world tasks demonstrates that HMSFL achieves a 36% reduction in training time and a 33% decrease in energy consumption compared to baseline methods, showcasing its potential for sustainable distributed machine learning.
Published in: IEEE Internet of Things Journal (Volume: 11, Issue: 24, 15 December 2024)
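To make the split-and-locally-aggregate idea concrete, the toy sketch below partitions each client's parameter vector into segments and averages each segment across clients, FedAvg-style. This is an illustrative assumption about the mechanism, not the paper's actual HMSFL algorithm; the function names and the flat-vector model representation are hypothetical.

```python
import numpy as np

def split_model(params, num_segments):
    # Partition a flat parameter vector into contiguous segments,
    # loosely mirroring HMSFL's client-model split (illustrative only).
    return np.array_split(params, num_segments)

def local_aggregate(client_segments):
    # Average one segment across clients (FedAvg-style local aggregation).
    return np.mean(np.stack(client_segments), axis=0)

# Two clients, each with a 6-parameter model split into 3 segments.
clients = [np.arange(6, dtype=float), np.arange(6, dtype=float) + 6.0]
split = [split_model(p, 3) for p in clients]

# Aggregate each segment locally across the two clients,
# then reassemble the full averaged model.
aggregated = [local_aggregate([split[c][s] for c in range(2)])
              for s in range(3)]
merged = np.concatenate(aggregated)
print(merged)  # element-wise mean of the two client vectors
```

In a real hierarchical scheme, segment-level averages would feed upward to an edge or cloud aggregator instead of being merged immediately; this sketch only shows the client-side split and local averaging step.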