Authors:
Kunlong Liu
and
Trinabh Gupta
Affiliation:
University of California, Santa Barbara, U.S.A.
Keyword(s):
Federated Learning, Secure Multi-Party Computation, Homomorphic Encryption.
Abstract:
Federated learning for training models over mobile devices is gaining popularity. Current systems for this task exhibit significant trade-offs between model accuracy, privacy guarantee, and device efficiency. For instance, Oort (OSDI 2021) provides excellent accuracy and efficiency but requires a trusted central server. On the other hand, Orchard (OSDI 2020) provides good accuracy and the differential privacy guarantee without a trusted server, but creates high overhead for the devices. This paper describes Aero, a new federated learning system that significantly improves this trade-off. Aero guarantees good accuracy, differential privacy without a trusted server, and low device overhead. The key idea of Aero is to tune system architecture and design to a specific federated learning algorithm. This tuning requires novel optimizations and techniques, including a new protocol to securely aggregate gradient updates from devices. An evaluation of Aero demonstrates that it provides comparable accuracy to plain federated learning (without differential privacy), and it improves efficiency (CPU and network) over Orchard by a factor of 10^5.
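The abstract mentions a protocol to securely aggregate gradient updates without revealing any individual device's contribution. As an illustration only (this is not Aero's actual protocol), a minimal sketch of the classic pairwise-masking approach to secure aggregation: each pair of devices agrees on a shared random mask, one adds it and the other subtracts it, so the masks cancel in the sum and the server learns only the aggregate. All names and the seeding scheme below are hypothetical.

```python
import random

MOD = 2**32  # fixed modulus so masks cancel exactly under modular addition


def pairwise_masks(device_ids, seed_base=0):
    """Each pair (i, j) with i < j shares a random mask.

    In a real protocol the shared mask would come from a key agreement
    (e.g. Diffie-Hellman); here a shared seed stands in for it.
    """
    masks = {}
    for i in device_ids:
        for j in device_ids:
            if i < j:
                rng = random.Random((seed_base, i, j))
                masks[(i, j)] = rng.randrange(MOD)
    return masks


def masked_update(device_id, gradient, masks):
    """Blind one device's (scalar) gradient update with its pairwise masks."""
    value = gradient % MOD
    for (i, j), m in masks.items():
        if i == device_id:
            value = (value + m) % MOD  # lower-id device adds the mask
        elif j == device_id:
            value = (value - m) % MOD  # higher-id device subtracts it
    return value


devices = [1, 2, 3]
grads = {1: 10, 2: 20, 3: 30}
masks = pairwise_masks(devices)
blinded = [masked_update(d, grads[d], masks) for d in devices]
total = sum(blinded) % MOD  # masks cancel; server sees only the sum (60)
```

Each blinded value is individually indistinguishable from random, yet the modular sum of all three recovers the true aggregate gradient; this cancellation property is the core idea behind secure aggregation schemes in federated learning.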