
Towards Efficient and Privacy-Preserving Federated Deep Learning


Abstract:

Deep learning has been applied in many areas, such as computer vision, natural language processing, and emotion analysis. Unlike traditional deep learning, which collects users' data centrally, federated deep learning has participants train networks on their private datasets and share only the training results, and hence offers better efficiency and stronger security. However, it still raises privacy issues, since adversaries can infer users' private information from local outputs such as gradients. While private federated deep learning has been an active research topic, the latest findings remain inadequate in terms of security, accuracy, and efficiency. In this paper, we propose an efficient and privacy-preserving federated deep learning protocol based on stochastic gradient descent that integrates additively homomorphic encryption with differential privacy. Specifically, users add noise to each local gradient before encrypting it, to obtain optimal performance and security. Moreover, our scheme remains secure in the honest-but-curious server setting even if the cloud server colludes with multiple users. In addition, our scheme supports federated learning for large-scale user scenarios, and extensive experiments demonstrate that it attains high efficiency and high accuracy compared with the non-private model.
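The mechanism described in the abstract, where each user perturbs its local gradients with differential-privacy noise and then encrypts them under an additively homomorphic scheme so the server aggregates only ciphertexts, can be sketched as follows. This is a toy illustration under stated assumptions, not the paper's actual protocol: it uses a deliberately tiny pure-Python Paillier keypair, Gaussian noise with a hypothetical scale `sigma`, and a fixed-point encoding with an offset for negative gradients, all chosen here for demonstration.

```python
import math
import random

# --- Toy Paillier cryptosystem (additively homomorphic). For illustration
# only: a real deployment would use 2048-bit keys from a vetted library.

def is_prime(n):
    """Trial division; fine for the tiny demo primes used below."""
    if n < 2:
        return False
    return all(n % d for d in range(2, math.isqrt(n) + 1))

def next_prime(n):
    while not is_prime(n):
        n += 1
    return n

p, q = next_prime(10_000), next_prime(20_000)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid decryption helper for g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# --- One federated round: each user adds Gaussian noise to its gradient
# (differential privacy) *before* encrypting, as in the abstract; the server
# multiplies ciphertexts, which adds the underlying plaintexts.

SCALE = 100            # fixed-point encoding: two decimal places
OFFSET = n // 2        # shifts negative gradients into [0, n)

def encode(g, sigma=0.05):           # sigma is an assumed noise scale
    noisy = g + random.gauss(0.0, sigma)
    return (round(noisy * SCALE) + OFFSET) % n

local_gradients = [0.12, -0.40, 0.25]    # hypothetical per-user gradients
ciphertexts = [encrypt(encode(g)) for g in local_gradients]

agg = 1
for c in ciphertexts:                # homomorphic sum, computed by the server
    agg = (agg * c) % n2

s = (decrypt(agg) - len(local_gradients) * OFFSET) % n
if s > n // 2:                       # undo modular wraparound of negative sums
    s -= n
total = s / SCALE
print("aggregated noisy gradient sum:", total)
```

The server never sees an individual gradient in the clear: it only multiplies ciphertexts, and even the decrypted aggregate carries the users' differential-privacy noise.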
Date of Conference: 20-24 May 2019
Date Added to IEEE Xplore: 15 July 2019
Conference Location: Shanghai, China
