DNet: An Efficient Privacy-Preserving Distributed Learning Framework for Healthcare Systems

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12582)

Abstract

Medical data held in silos by institutions makes it challenging to predict new trends and gain insights, since sharing individual data leaks user privacy and is restricted by law. The Federated Learning framework [11] addresses this problem by facilitating on-device training while preserving privacy. However, the presence of a central server has inherent problems, including a single point of failure and the need to trust that server. Moreover, the data may still be prone to inference attacks. This paper presents a Distributed Net algorithm, called DNet, to address these issues; removing the server, however, poses its own set of challenges in terms of high communication latency, performance, and efficiency. Four different networks are discussed and compared for computation, latency, and precision. Empirical analysis has been performed over the Chest X-ray Images and COVID-19 datasets. The theoretical analysis proves our claim that the algorithm has lower communication latency and provides an upper bound on it.
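
To make the decentralized setting concrete, the sketch below illustrates one round of peer-to-peer model averaging with no central aggregator: each peer trains on its own private shard and then averages parameters with its graph neighbours, so raw records never leave a node. This is a minimal illustration of the general idea rather than the paper's DNet algorithm; the logistic-regression model, the fully connected three-peer topology, and the uniform neighbour averaging are assumptions made only for this example.

import numpy as np

def local_sgd_step(weights, X, y, lr=0.1):
    # One gradient step of logistic regression on a peer's private data shard.
    preds = 1.0 / (1.0 + np.exp(-X @ weights))
    grad = X.T @ (preds - y) / len(y)
    return weights - lr * grad

def peer_to_peer_round(peer_weights, peer_data, adjacency):
    # Each peer trains locally, then averages parameters with its neighbours;
    # only model parameters travel between peers, raw data never leaves a node.
    updated = [local_sgd_step(w, X, y) for w, (X, y) in zip(peer_weights, peer_data)]
    mixed = []
    for i, w in enumerate(updated):
        neighbourhood = [updated[j] for j in adjacency[i]] + [w]
        mixed.append(np.mean(neighbourhood, axis=0))
    return mixed

# Toy run: three peers on a fully connected graph with synthetic data.
rng = np.random.default_rng(0)
d = 5
peer_data = [(rng.normal(size=(20, d)), rng.integers(0, 2, size=20)) for _ in range(3)]
weights = [np.zeros(d) for _ in range(3)]
topology = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
for _ in range(10):
    weights = peer_to_peer_round(weights, peer_data, topology)

In a full framework, the plain averaging step would typically be combined with secure aggregation or differentially private updates to resist the inference attacks the abstract mentions.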


Notes

  1. AI, W.: Federated AI Technology Enabler (2019). https://www.fedai.org/cn/
  2. Google: TensorFlow Federated (2019). https://www.tensorflow.org/federate
  3. OpenMined: PySyft-TensorFlow (2019). https://github.com/OpenMined/PySyft-TensorFlow
  4. https://www.kaggle.com/paultimothymooney/chest-xray-pneumonia
  5. https://www.kaggle.com/tarandeep97/covid19-normal-posteroanteriorpa-xrays

References

  1. Agarwal, N., Suresh, A.T., Yu, F.X.X., Kumar, S., McMahan, B.: cpSGD: communication-efficient and differentially-private distributed SGD. In: Advances in Neural Information Processing Systems, pp. 7564–7575 (2018)

  2. Bonawitz, K., Salehi, F., Konečnỳ, J., McMahan, B., Gruteser, M.: Federated learning with autotuned communication-efficient secure aggregation. arXiv preprint arXiv:1912.00131 (2019)

  3. Brisimi, T.S., Chen, R., Mela, T., Olshevsky, A., Paschalidis, I.C., Shi, W.: Federated learning of predictive models from federated electronic health records. Int. J. Med. Inform. 112, 59–67 (2018)

  4. Chen, Y., Qin, X., Wang, J., Yu, C., Gao, W.: Fedhealth: a federated transfer learning framework for wearable healthcare. IEEE Intell. Syst. 35, 83–93 (2020)

  5. Deist, T.M., et al.: Infrastructure and distributed learning methodology for privacy-preserving multi-centric rapid learning health care: euroCAT. Clin. Transl. Radiat. Oncol. 4, 24–31 (2017)

  6. Konečnỳ, J., McMahan, H.B., Yu, F.X., Richtárik, P., Suresh, A.T., Bacon, D.: Federated learning: strategies for improving communication efficiency. arXiv preprint arXiv:1610.05492 (2016)

  7. Kuo, T.T., Kim, J., Gabriel, R.A.: Privacy-preserving model learning on a blockchain network-of-networks. J. Am. Med. Inform. Assoc. 27, 343–354 (2020)

  8. Lalitha, A., Kilinc, O.C., Javidi, T., Koushanfar, F.: Peer-to-peer federated learning on graphs (2019)

  9. Liu, D., Miller, T., Sayeed, R., Mandl, K.D.: FADL: federated-autonomous deep learning for distributed electronic health record. arXiv preprint arXiv:1811.11400 (2018)

  10. Lu, S., Zhang, Y., Wang, Y., Mack, C.: Learn electronic health records by fully decentralized federated learning (2019)

  11. McMahan, H.B., Moore, E., Ramage, D., Arcas, B.A.: Federated learning of deep networks using model averaging. CoRR abs/1602.05629 (2016). http://arxiv.org/abs/1602.05629

  12. Ramanan, P., Nakayama, K.: BAFFLE: blockchain based aggregator free federated learning (2019)

  13. Ramaswamy, S., Mathews, R., Rao, K., Beaufays, F.: Federated learning for emoji prediction in a mobile keyboard. arXiv preprint arXiv:1906.04329 (2019)

  14. Shamir, O., Srebro, N., Zhang, T.: Communication efficient distributed optimization using an approximate newton-type method. CoRR abs/1312.7853 (2013). http://arxiv.org/abs/1312.7853

  15. Shokri, R., Shmatikov, V.: Privacy-preserving deep learning. In: Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communications Security, pp. 1310–1321 (2015)

  16. Stephen, O., Sain, M., Maduh, U.J., Jeong, D.U.: An efficient deep learning approach to pneumonia classification in healthcare. J. Healthc. Eng. 2019 (2019)

  17. Wang, H., Qu, Z., Guo, S., Gao, X., Li, R., Ye, B.: Intermittent pulling with local compensation for communication-efficient federated learning. arXiv preprint arXiv:2001.08277 (2020)

  18. Xu, J., Wang, F.: Federated learning for healthcare informatics. arXiv preprint arXiv:1911.06270 (2019)

Author information

Corresponding author: Parth Parag Kulkarni.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Kulkarni, P.P., Kasyap, H., Tripathy, S. (2021). DNet: An Efficient Privacy-Preserving Distributed Learning Framework for Healthcare Systems. In: Goswami, D., Hoang, T.A. (eds) Distributed Computing and Internet Technology. ICDCIT 2021. Lecture Notes in Computer Science, vol 12582. Springer, Cham. https://doi.org/10.1007/978-3-030-65621-8_9

  • DOI: https://doi.org/10.1007/978-3-030-65621-8_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-65620-1

  • Online ISBN: 978-3-030-65621-8

  • eBook Packages: Computer Science, Computer Science (R0)
