Using Highly Compressed Gradients in Federated Learning for Data Reconstruction Attacks | IEEE Journals & Magazine | IEEE Xplore


Abstract:

Federated learning (FL) preserves data privacy by exchanging gradients instead of local training data. However, these private data can still be reconstructed from the exchanged gradients. Deep leakage from gradients (DLG) is a classical reconstruction attack that optimizes dummy data toward the real data by making the corresponding dummy and real gradients as similar as possible. Nevertheless, DLG fails with highly compressed gradients, which are crucial for communication-efficient FL. In this study, we propose an effective data reconstruction attack against highly compressed gradients, called the highly compressed gradient leakage attack (HCGLA). HCGLA is characterized by three key techniques: 1) Because DLG's optimization objective is ill-suited to compression scenarios, we redesign a plausible objective function that drives the compressed dummy gradients toward the compressed real gradients. 2) Instead of simply initializing dummy data with random noise, as in DLG, we design a novel dummy data initialization method, Init-Generation, to compensate for the information loss caused by gradient compression. 3) To further enhance reconstruction quality, we train an ad hoc denoising model following a strategy of "first optimizing, next filtering, and then reoptimizing". Extensive experiments on various benchmark data sets and mainstream models show that HCGLA remains an effective reconstruction attack even at a gradient compression ratio of 0.1%, whereas state-of-the-art attacks only support a 70% ratio, a 700-fold improvement.
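The redesigned objective in technique 1 can be sketched as follows. This is a minimal NumPy illustration, not the paper's code: it assumes top-k sparsification as the compression operator and uses hypothetical names (`topk_compress`, `hcgla_objective`). The key point is that the attacker matches the *compressed* dummy gradient against the compressed real gradient it actually observes, rather than matching raw gradients as DLG does.

```python
import numpy as np

def topk_compress(grad, ratio):
    """Top-k sparsification: keep only the largest-magnitude entries
    (a fraction `ratio` of them) and zero out the rest."""
    flat = grad.ravel().copy()
    k = max(1, int(np.ceil(ratio * flat.size)))
    keep = np.argpartition(np.abs(flat), -k)[-k:]
    mask = np.zeros(flat.size, dtype=bool)
    mask[keep] = True
    flat[~mask] = 0.0
    return flat.reshape(grad.shape)

def hcgla_objective(dummy_grad, observed_grad, ratio):
    """Squared distance between the *compressed* dummy gradient and the
    compressed real gradient the attacker observes (DLG would instead
    compare the uncompressed dummy gradient to the sparse observation)."""
    return float(np.sum((topk_compress(dummy_grad, ratio) - observed_grad) ** 2))

rng = np.random.default_rng(0)
real_grad = rng.standard_normal(100)
observed = topk_compress(real_grad, 0.1)  # only 10% of entries survive

# A dummy gradient equal to the real one yields zero loss after compression;
# an unrelated dummy gradient yields a strictly positive loss.
loss_match = hcgla_objective(real_grad, observed, 0.1)
loss_off = hcgla_objective(rng.standard_normal(100), observed, 0.1)
```

In a full attack, this loss would be minimized over the dummy *data* (the gradient being a function of the data through the model), but the sketch isolates the change of objective that makes optimization meaningful under heavy compression.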
Page(s): 818 - 830
Date of Publication: 29 December 2022



