Secure Deep Neural Network Models Publishing Against Membership Inference Attacks Via Training Task Parallelism


Abstract:

Vast data and computing resources are commonly needed to train deep neural networks, making the cost unaffordable for individual users. Motivated by the increasing demands of deep learning applications, sharing well-trained models has become popular. The owner of a pre-trained model can share it by publishing the model directly or by providing a prediction interface. Either way, individual users can benefit from deep learning without much cost, and computing resources are saved. However, recent studies of machine learning security have identified severe threats to these model publishing approaches. This article focuses on the privacy leakage issue of publishing well-trained deep neural network models. To tackle this problem, we propose a series of secure model publishing solutions based on training task parallelism. Specifically, we show how to estimate private model parameters through parallel model training and generate new model parameters in a privacy-preserving manner to replace the original ones for publishing. Based on data parallelism and parameter-generating techniques, we design two further solutions that concentrate on model quality and parameter privacy, respectively. Through privacy leakage analysis and experimental attack evaluation, we conclude that deep neural network models published with our solutions can provide on-demand model quality guarantees and resist membership inference attacks.
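The parameter-replacement idea can be illustrated with a toy sketch. This is an illustrative assumption, not the paper's actual algorithm: several models are trained in parallel on disjoint data shards, and aggregated parameters are published in place of any single model's parameters, so that the released weights are not tied to one training run over the private data.

```python
import numpy as np

def train_shard(X, y, lr=0.1, epochs=200):
    # Least-squares regression via gradient descent on one data shard,
    # standing in for one parallel training task.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def publish_parameters(X, y, n_shards=4, rng=None):
    # Train one model per disjoint shard (conceptually in parallel),
    # then publish the averaged parameters instead of any single
    # shard model's parameters.
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.permutation(len(y))
    shards = np.array_split(idx, n_shards)
    weights = [train_shard(X[s], y[s]) for s in shards]
    return np.mean(weights, axis=0)

# Synthetic data with a known linear relation: y = 2*x0 - 1*x1.
rng = np.random.default_rng(42)
X = rng.normal(size=(400, 2))
y = X @ np.array([2.0, -1.0])
w_pub = publish_parameters(X, y, n_shards=4, rng=rng)
```

Here the aggregate still approximates the true relation, while no shard model's exact parameters are exposed; the paper's solutions add privacy-preserving parameter generation on top of this kind of parallel-training estimate.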
Published in: IEEE Transactions on Parallel and Distributed Systems ( Volume: 33, Issue: 11, 01 November 2022)
Page(s): 3079 - 3091
Date of Publication: 22 November 2021


