Abstract:
With increasing concerns about privacy leakage from gradients, various attack mechanisms have emerged to recover private data from gradients, challenging the primary advantage of privacy protection in federated learning. However, we cast doubt upon the real impact of these gradient leakage attacks on production federated learning systems. By removing several impractical assumptions made in the literature, we find that these attacks pose only a limited threat to the privacy of raw data. In this paper, through a comprehensive evaluation of existing gradient leakage attacks in a federated learning system under practical assumptions, we systematically analyze their effectiveness across a wide range of configurations. We first present the key priors required to make an attack possible or stronger, such as a narrow distribution of initial model weights and inversion at early stages of training. We then propose a new lightweight defense mechanism that provides sufficient and self-adaptive protection against time-varying levels of privacy leakage risk throughout the federated learning process. Our proposed defense, called OUTPOST, selectively adds Gaussian noise to gradients at each update iteration according to the Fisher information matrix, where the level of noise is determined by the privacy leakage risk, quantified by the spread of model weights at each layer. To limit computational overhead and training performance degradation, OUTPOST only performs perturbation with iteration-based decay. Our experimental results demonstrate that OUTPOST achieves a much better tradeoff than the state of the art with respect to convergence performance, computational overhead, and protection against gradient leakage attacks.
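The core idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes that per-layer leakage risk is proxied by the standard deviation of that layer's weights, and that the perturbation strength decays geometrically with the iteration count; the function name `outpost_perturb` and the parameters `base_scale` and `decay` are hypothetical.

```python
import numpy as np

def outpost_perturb(grads, weights, iteration, base_scale=1e-3, decay=0.9):
    """Hypothetical sketch of OUTPOST-style gradient perturbation.

    grads, weights: lists of per-layer numpy arrays (same shapes).
    The noise added to each layer's gradient scales with the spread
    (std) of that layer's weights, attenuated by iteration-based decay.
    """
    decay_factor = decay ** iteration  # perturb less as training matures
    noisy = []
    for g, w in zip(grads, weights):
        risk = np.std(w)               # weight spread proxies leakage risk
        sigma = base_scale * risk * decay_factor
        noisy.append(g + np.random.normal(0.0, sigma, size=g.shape))
    return noisy
```

Early in training, when the weight distribution is narrow and attacks are strongest, the added noise is relatively larger; as training progresses, the decay factor reduces the perturbation, limiting its impact on convergence.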
Published in: IEEE/ACM Transactions on Networking ( Volume: 32, Issue: 4, August 2024)