Abstract:
Secure two-party neural network (2P-NN) inference allows a server holding a neural network model and a client holding inputs to perform neural network inference without revealing their private data to each other. However, state-of-the-art 2P-NN inference still suffers from large computation and communication overhead, especially when applied to ImageNet-scale deep neural networks. In this work, we design and build Panther, a lightweight and efficient secure 2P-NN inference system that evaluates 2P-NN inference efficiently while safeguarding the privacy of both the server and the client. At the core of Panther are new protocols for 2P-NN inference. First, we propose a customized homomorphic encryption scheme that reduces the burdensome polynomial multiplications in the homomorphic encryption arithmetic circuits of the linear protocols. Second, we present a more efficient and communication-concise design for the millionaires' protocol, which enables non-linear protocols with lower communication cost. Our evaluations over three popular deep neural networks of varying scale show that Panther outperforms state-of-the-art 2P-NN inference systems in terms of end-to-end runtime and communication overhead. Panther achieves state-of-the-art performance, with up to a 24.95× speedup for linear protocols and a 6.40× speedup for non-linear protocols in the WAN setting compared to prior art.
Published in: IEEE Transactions on Information Forensics and Security (Volume: 20)