
Efficient and Privacy-Preserving Integrity Verification for Federated Learning with TEEs


Abstract:

Federated Learning (FL), a promising distributed machine learning approach that allows collaborative model training without sharing raw data, has gained prominence as a key application in zero-trust edge computing. However, the decentralized nature of FL poses challenges in ensuring the integrity of the training process, as malicious participants can undermine the global model’s accuracy and reliability. In this work, we propose a hardware-assisted federated learning framework that leverages trusted execution environments (TEEs) to allow the model owner to verify the integrity of the training process. To further improve performance, we introduce a secure and efficient partial offloading scheme that allows the TEE to outsource computationally intensive linear operations to the co-located GPU. Our framework achieves over 13× acceleration compared with existing sampling-based TEE-retraining solutions, facilitating the paradigm of zero-trust federated learning.
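For intuition on TEE-to-GPU offloading, the sketch below shows one common way an enclave could check GPU-computed linear operations: a Freivalds-style probabilistic test on outsourced matrix products. This is an illustrative assumption, not the paper's actual protocol, and all names (e.g., verify_offloaded_matmul) are hypothetical.

```python
# Illustrative sketch (not the paper's scheme): a Freivalds-style probabilistic
# check a TEE could run to verify a matrix product computed by an untrusted GPU.
import numpy as np

def verify_offloaded_matmul(A, B, C_gpu, trials=16):
    """Accept C_gpu as A @ B with high probability, using only O(n^2) work per trial."""
    n = B.shape[1]
    for _ in range(trials):
        # Random 0/1 challenge vector; each trial catches a wrong result with prob >= 1/2.
        r = np.random.randint(0, 2, size=(n, 1)).astype(A.dtype)
        # Compare A(Br) against C_gpu r; a mismatch reveals a corrupted GPU result.
        if not np.allclose(A @ (B @ r), C_gpu @ r):
            return False
    return True

# Example: an honest result passes, a tampered one is rejected
# (with overwhelming probability, about 1 - 2^-16 here).
A = np.random.randn(256, 128)
B = np.random.randn(128, 64)
C_ok = A @ B
C_bad = C_ok.copy()
C_bad[0, 0] += 1.0
assert verify_offloaded_matmul(A, B, C_ok)
assert not verify_offloaded_matmul(A, B, C_bad)
```

The appeal of such a check is asymmetry: the GPU performs the full O(n^3) multiplication, while the enclave spends only O(n^2) per trial to verify it, which is why offloading the linear operations can pay off despite the verification cost.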
Date of Conference: 28 October 2024 - 01 November 2024
Date Added to IEEE Xplore: 06 December 2024
Conference Location: Washington, DC, USA
