FedTD: Efficiently Share Telemedicine Data with Federated Distillation Learning

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13655)

Abstract

With the development of the Internet of Things and medical technology, patients increasingly use wearable telemedicine devices to transmit health data to hospitals. The need to share such data for public health has become more urgent under the COVID-19 pandemic, yet existing security protection technologies struggle to address the growing security risks and challenges of telemedicine. Federated learning (FL) addresses this difficulty by letting companies and institutions collaborate without exposing user data: a global server iteratively aggregates model parameters from local servers instead of having users' data uploaded directly to a cloud server. We propose FedTD, a federated distillation learning model that allows local hospital servers and the global server to use different model architectures. Unlike traditional federated learning, FedTD incorporates knowledge distillation to cope with the non-independent and identically distributed (non-IID) nature of patient medical data, providing a secure way for hospitals to share patients' medical information. We evaluate our approach on the COVID-19 Radiography and COVID-Chestxray datasets, improving model performance and reducing communication costs. Extensive experiments show that FedTD significantly outperforms the state of the art.
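
To make the mechanism sketched in the abstract concrete, the following minimal PyTorch sketch illustrates one common form of federated distillation: each hospital trains its own (possibly architecturally different) model on private data, shares only its soft predictions (logits) on a shared public proxy set, and then distills from the averaged predictions. This is an illustrative sketch under assumed names (clients, private_loaders, public_loader, federated_distillation_round), not the authors' FedTD implementation.

```python
# Minimal federated distillation sketch (illustrative, not the FedTD algorithm).
# Assumptions: client models may use different architectures but share the same
# number of output classes, and public_loader iterates the shared proxy set in
# a fixed (non-shuffled) order so logits can be aligned across clients.

import torch
import torch.nn.functional as F


def local_train(model, loader, epochs=1, lr=1e-3):
    """Ordinary supervised training on a hospital's private data."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()


@torch.no_grad()
def predict_logits(model, public_loader):
    """Soft predictions on the shared proxy set; only these leave the hospital."""
    model.eval()
    return torch.cat([model(x) for x, _ in public_loader])


def distill(model, public_loader, teacher_logits, T=3.0, epochs=1, lr=1e-3):
    """Pull a client model toward the averaged (consensus) logits via KL loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        offset = 0
        for x, _ in public_loader:
            target = teacher_logits[offset:offset + x.size(0)]
            offset += x.size(0)
            loss = F.kl_div(F.log_softmax(model(x) / T, dim=1),
                            F.softmax(target / T, dim=1),
                            reduction="batchmean") * T * T
            opt.zero_grad()
            loss.backward()
            opt.step()


def federated_distillation_round(clients, private_loaders, public_loader):
    """One communication round: local training, logit exchange, distillation."""
    for model, loader in zip(clients, private_loaders):
        local_train(model, loader)
    # The "global" knowledge is simply the average of all clients' logits.
    consensus = torch.stack(
        [predict_logits(m, public_loader) for m in clients]).mean(dim=0)
    for model in clients:
        distill(model, public_loader, consensus)
```

Because only logits on the proxy set are exchanged, per-round communication scales with the proxy set size rather than the model size, which is the usual argument for why distillation-based federated learning can reduce communication costs and tolerate heterogeneous, non-IID clients.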

Author information

Corresponding author

Correspondence to Wei Ou.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, N., Wang, N., Ou, W., Han, W. (2023). FedTD: Efficiently Share Telemedicine Data with Federated Distillation Learning. In: Xu, Y., Yan, H., Teng, H., Cai, J., Li, J. (eds) Machine Learning for Cyber Security. ML4CS 2022. Lecture Notes in Computer Science, vol 13655. Springer, Cham. https://doi.org/10.1007/978-3-031-20096-0_37

  • DOI: https://doi.org/10.1007/978-3-031-20096-0_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20095-3

  • Online ISBN: 978-3-031-20096-0

  • eBook Packages: Computer Science, Computer Science (R0)
