
Differentially Private Federated Learning With an Adaptive Noise Mechanism


Abstract:

Federated Learning (FL) enables multiple distributed clients to collaboratively train a model on their own datasets. To mitigate the potential privacy threats in FL, researchers have proposed the DP-FL strategy, which uses differential privacy (DP) to add carefully calibrated noise to the exchanged parameters and thereby conceal private information. DP-FL guarantees the privacy of FL at the cost of model performance degradation. To balance the trade-off between model accuracy and privacy, we propose a differentially private federated learning scheme with an adaptive noise mechanism. This is challenging because the distributed nature of FL makes it difficult to estimate sensitivity appropriately, where sensitivity is the DP quantity that determines the scale of the added noise. To resolve this, we design a generic method for sensitivity estimation based on local and global historical information, and we instantiate it on four commonly used optimizers to verify its effectiveness. Experiments on MNIST, FMNIST, and CIFAR-10 show that our proposed scheme achieves higher accuracy while maintaining a high level of privacy protection compared to prior works.
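To make the idea concrete, the sketch below shows one way a client could clip its model update to an adaptively estimated sensitivity and add Gaussian noise calibrated for (epsilon, delta)-DP. The blending rule `estimate_sensitivity` (weighting recent local and global update norms) is a hypothetical placeholder, not the paper's exact estimator; the noise calibration is the standard Gaussian mechanism.

```python
import numpy as np

def estimate_sensitivity(local_norms, global_norms, beta=0.5):
    """Hypothetical sensitivity estimate blending local and global
    historical update norms (the paper's exact rule is not reproduced here)."""
    local_hist = np.mean(local_norms) if len(local_norms) else 1.0
    global_hist = np.mean(global_norms) if len(global_norms) else 1.0
    return beta * local_hist + (1.0 - beta) * global_hist

def gaussian_noise_std(sensitivity, epsilon, delta):
    """Standard Gaussian-mechanism calibration for (epsilon, delta)-DP."""
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def privatize_update(update, sensitivity, epsilon, delta, rng):
    """Clip the client update to the estimated sensitivity, then add noise."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, sensitivity / (norm + 1e-12))
    sigma = gaussian_noise_std(sensitivity, epsilon, delta)
    return clipped + rng.normal(0.0, sigma, size=update.shape)

# Example: one client round with an adaptively estimated sensitivity.
rng = np.random.default_rng(0)
local_norms, global_norms = [0.8, 0.7], [1.1, 0.9]  # hypothetical history
update = rng.normal(size=10)                        # stand-in for a model delta
S = estimate_sensitivity(local_norms, global_norms)
noisy_update = privatize_update(update, S, epsilon=1.0, delta=1e-5, rng=rng)
```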
Page(s): 74 - 87
Date of Publication: 25 September 2023
