
FeDZIO: Decentralized Federated Knowledge Distillation on Edge Devices

  • Conference paper
Image Analysis and Processing - ICIAP 2023 Workshops (ICIAP 2023)

Abstract

In recent years, the proliferation of edge devices and distributed sensors has fueled the need to train sophisticated deep learning models directly on resource-constrained nodes, in order to guarantee data locality and prevent the transmission of private information to centralized training infrastructures. However, executing large-scale models on edge devices poses significant challenges due to limited computational power, memory, and energy budgets. Federated Learning (FL) has emerged as a promising approach to partially address these issues, enabling decentralized model training across multiple devices without the need to exchange local data. At the same time, Knowledge Distillation (KD) has demonstrated its efficacy in compressing complex models by transferring knowledge from a larger teacher model to a smaller student model.
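The abstract does not give the paper's exact objective, but the standard Hinton-style distillation loss that KD methods of this kind build on can be sketched as follows (the function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not details from the paper):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Classic KD objective: alpha * CE(student, hard labels)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * ce + (1.0 - alpha) * T**2 * kl))
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains, which is what makes the soft targets a regularizer rather than a constraint.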

This paper presents a novel framework combining Federated Learning with Knowledge Distillation, specifically tailored to accelerating training on edge devices. The proposed approach leverages the collaborative learning capabilities of federated learning to perform knowledge distillation in a privacy-preserving and efficient manner. Instead of relying on a central server for aggregation, edge devices with localized data collaboratively exchange knowledge with each other, transmitting only minimal amounts of information without compromising data privacy or model performance. The distributed nature of this approach allows edge devices to leverage collective intelligence while avoiding the need to share raw data across the network.
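The abstract does not spell out the exchange protocol, but one common way to realize serverless knowledge exchange is for each device to share only its logits on a common reference batch and distill against the peer consensus. The sketch below illustrates that idea under stated assumptions; the function names and the plain logit-averaging scheme are hypothetical, not the paper's exact method:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def peer_consensus_soft_targets(local_logits, peer_logits_list, T=2.0):
    """Average the logits this device and its peers computed on a shared
    reference batch, then soften them with temperature T. Only these small
    logit tensors cross the network -- never raw training data or full
    model weights -- which is what keeps the exchange lightweight."""
    consensus = np.stack([local_logits] + list(peer_logits_list)).mean(axis=0)
    return softmax(consensus, T)
```

Each device would then train its local student against these consensus soft targets with an ordinary distillation loss, so the ensemble of peers collectively plays the teacher role that a central server would otherwise fill.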

We conduct extensive experiments on diverse edge device scenarios using state-of-the-art deep learning architectures. The results demonstrate that our approach achieves substantial model compression while maintaining competitive performance compared to traditional knowledge distillation methods. Additionally, the federated nature of our approach ensures scalability and robustness, even in dynamic edge device environments.



Author information

Correspondence to Luca Palazzo.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Palazzo, L., Pennisi, M., Bellitto, G., Kavasidis, I. (2024). FeDZIO: Decentralized Federated Knowledge Distillation on Edge Devices. In: Foresti, G.L., Fusiello, A., Hancock, E. (eds) Image Analysis and Processing - ICIAP 2023 Workshops. ICIAP 2023. Lecture Notes in Computer Science, vol 14366. Springer, Cham. https://doi.org/10.1007/978-3-031-51026-7_18


  • DOI: https://doi.org/10.1007/978-3-031-51026-7_18


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-51025-0

  • Online ISBN: 978-3-031-51026-7

  • eBook Packages: Computer Science, Computer Science (R0)
