FedIBD: a federated learning framework in asynchronous mode for imbalanced data

Published in Applied Intelligence.

Abstract

With the development of edge computing and Internet of Things (IoT), the computing power of edge devices continues to increase, and the data obtained is more specific and private. Methods based on Federated Learning (FL) can help utilize the data that exists widely on edge devices in a privacy-preserving way and train a shareable global model collaboratively. However, the imbalanced data from edge devices pose a huge challenge to FL, as data features extracted from uneven, biased, and incomplete samples complicate the model aggregation process required to achieve well-performing models. To support FL on imbalanced data, a new asynchronous FL framework, named FedIBD: Federated learning framework in Asynchronous mode for Imbalanced Data, is proposed. FedIBD not only considers the temporal inconsistency in asynchronous learning but also measures the informative differences in imbalanced data to support FL in asynchronous and heterogeneous environments. Compared with the existing synchronous and asynchronous FL methods, FedIBD can achieve significantly better performance in terms of accuracy, communication time and cost on imbalanced data.
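FedIBD's exact aggregation rule is defined in the paper body, which is not reproduced here. As a rough illustration of the general idea the abstract describes, the sketch below shows staleness-weighted asynchronous aggregation in the style of asynchronous federated optimization: the server mixes in each client update as it arrives, down-weighting stale updates to handle temporal inconsistency. All function names, the decay form, and the constants are hypothetical, not taken from FedIBD.

```python
import numpy as np

def staleness_weight(staleness, alpha=0.6):
    """Hypothetical polynomial decay: the staler a client's update
    (more server rounds elapsed since it pulled the global model),
    the smaller its mixing weight."""
    return alpha * (1.0 + staleness) ** -0.5

def async_aggregate(global_model, client_model, staleness, alpha=0.6):
    """Mix one client's model into the global model as soon as it
    arrives, instead of waiting for every client as synchronous
    FedAvg does."""
    w = staleness_weight(staleness, alpha)
    return (1.0 - w) * global_model + w * client_model

# Toy run: a 3-parameter "model"; two clients report at different delays.
g = np.zeros(3)
g = async_aggregate(g, np.array([1.0, 1.0, 1.0]), staleness=0)  # fresh update
g = async_aggregate(g, np.array([4.0, 4.0, 4.0]), staleness=5)  # stale update
print(g)
```

A method aimed at imbalanced data would additionally weight updates by how informative each client's (possibly skewed) local distribution is; that per-client term is what FedIBD adds on top of plain staleness decay.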



Data Availability

We use the publicly available MNIST, FMNIST, CIFAR-10, CIFAR-100, and SMD datasets. The datasets analysed during the current study are available at:

  • MNIST: http://yann.lecun.com/exdb/mnist/
  • FMNIST: https://github.com/zalandoresearch/fashion-mnist
  • CIFAR-10 and CIFAR-100: https://www.cs.toronto.edu/~kriz/cifar.html
  • SMD: https://github.com/NetManAIOps/OmniAnomaly/tree/master/ServerMachineDataset

Code Availability

The code is not publicly available due to privacy restrictions.



Acknowledgements

This work was supported in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2023A1515012895, in part by the National Key Research and Development Program of China under Grant 2023YFB4301900, in part by the Department of Science and Technology of Guangdong Province (Project No. 2021QN02S161), and in part by the National Natural Science Foundation of China (62002398).

Funding

The authors declare they have no financial interests.

Author information

Authors and Affiliations

Authors

Contributions

YH and HL are the main contributors; they wrote the paper and ran the experiments. Co-authors ZG, WW, RL and LY proposed the ideas, joined the discussion, and helped polish the paper.

Corresponding author

Correspondence to Linlin You.

Ethics declarations

Ethics Approval

Not Applicable.

Consent to Participate

Not Applicable.

Consent for Publication

Not Applicable.

Conflicts of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Hou, Y., Li, H., Guo, Z. et al. FedIBD: a federated learning framework in asynchronous mode for imbalanced data. Appl Intell 55, 122 (2025). https://doi.org/10.1007/s10489-024-06032-6
