Abstract:
Federated learning (FL), adopted to address data security concerns, incurs significant computation and communication costs, which lowers overall training efficiency. This research proposes a new FL framework, Lightweight FL, that resolves this problem by enhancing the framework's fundamental processes. First, a local network incorporating several lightweight training techniques is designed to lower the cost of local model training through small-scale convolution computation. Second, unstructured pruning and fine-tuning of the local model are performed on this basis to reduce computation costs by reducing network complexity. Third, an optimal selection strategy is applied during the model pruning and model aggregation processes, and the model with the best performance is chosen as the benchmark model for the next round of learning. This strategy reduces communication costs and improves the learning efficiency of the framework. Verification on bearing, gearbox, and bogie datasets shows that the framework effectively decreases learning costs while still ensuring good model performance, offering a workable option for future FL deployments on low-performance edge devices.
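The abstract does not include code; as one illustrative reading of the pruning and selection steps it describes, the Python sketch below applies unstructured magnitude pruning to a local model at several sparsity levels and keeps the best-performing candidate as the benchmark. All names (prune_local_model, select_benchmark), the pruning amounts, and the use of PyTorch's torch.nn.utils.prune are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): unstructured L1 magnitude pruning
# of a local model, followed by keeping the best-performing candidate as
# the benchmark for the next learning round. Model, data loader, and the
# pruning amounts are hypothetical placeholders.
import copy
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def prune_local_model(model: nn.Module, amount: float) -> nn.Module:
    """Apply unstructured L1 magnitude pruning to every conv/linear layer."""
    pruned = copy.deepcopy(model)
    for module in pruned.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            prune.l1_unstructured(module, name="weight", amount=amount)
            prune.remove(module, "weight")  # bake the zeroed weights in
    return pruned

@torch.no_grad()
def accuracy(model: nn.Module, loader) -> float:
    """Plain top-1 accuracy on a validation loader."""
    model.eval()
    correct, total = 0, 0
    for x, y in loader:
        pred = model(x).argmax(dim=1)
        correct += (pred == y).sum().item()
        total += y.numel()
    return correct / total

def select_benchmark(model: nn.Module, val_loader, amounts=(0.2, 0.4, 0.6)):
    """Evaluate the unpruned model and several pruned variants, and return
    the candidate with the best validation accuracy (the 'optimal selection'
    idea in the abstract); fine-tuning of candidates is omitted here."""
    candidates = [model] + [prune_local_model(model, a) for a in amounts]
    scores = [accuracy(m, val_loader) for m in candidates]
    best = max(range(len(candidates)), key=lambda i: scores[i])
    return candidates[best], scores[best]
```

In this reading, each client would fine-tune its pruned candidates before evaluation, and the server would apply the same keep-the-best rule across aggregated models; the paper itself should be consulted for the exact procedure.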
Published in: IEEE Transactions on Instrumentation and Measurement (Volume: 73)