Abstract:
Privacy-preserving machine learning (PPML) schemes aim to protect client-side data privacy in two-party secure computing tasks such as private deep neural network (DNN) inference. While fully homomorphic encryption (FHE) can provide provable security for client data privacy, efficiently verifying that such a homomorphic DNN inference protocol is honestly executed on the server remains challenging. In this work, we propose THE-V, a novel DNN inference framework that combines FHE and a Trusted Execution Environment (TEE) to achieve data privacy, verifiable execution, and efficient computation all at once. We first point out that, while the trivial solution of executing FHE entirely within the TEE ensures both private and verifiable computing, the limited resources within the TEE become a severe computational bottleneck. To resolve this dilemma, we devise a new strategy for securely outsourcing computation-heavy TEE tasks to untrusted environments. Through rigorous experiments, we show that we achieve verifiable and private DNN inference with up to 15× speedup over the state-of-the-art solution.
Date of Conference: 28 October 2023 - 02 November 2023
Date Added to IEEE Xplore: 30 November 2023