Fair Federated Learning with Multi-Objective Hyperparameter Optimization

Published: 21 August 2024

Abstract

Federated learning (FL) is an attractive paradigm for privacy-aware distributed machine learning: clients collaboratively learn a global model without sharing their data. Many strategies have recently been proposed to improve the generality of the global model and thus the effectiveness of FL. However, existing strategies either ignore fairness among clients or sacrifice performance for fairness; they cannot keep the gap among clients as small as possible without degrading federated performance. To address this issue, we propose ParetoFed, a new local-information aggregation method that achieves better federated performance with a smaller gap among clients. Specifically, we use a multi-objective hyperparameter optimization (HPO) algorithm to obtain global models that are both fair and effective. We then send the Pareto-optimal global models to each client, allowing it to choose the most suitable one as the base for optimizing its local model. ParetoFed not only makes the global models fairer but also makes the selection of local models more personalized, which further improves federated performance. Extensive experiments show that ParetoFed outperforms existing FL methods in terms of fairness and even achieves better federated performance, demonstrating the significance of our method.
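The paper's algorithm is not reproduced on this page, but the selection step the abstract describes (keeping only Pareto-optimal global models, then letting each client choose one) can be illustrated with a minimal sketch. The objective pair used here (mean accuracy, to maximize; performance gap among clients, to minimize), the `dominates`/`pareto_front` helpers, and the numeric scores are all illustrative assumptions, not the paper's actual objectives or code.

```python
from typing import List, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """True if candidate a Pareto-dominates b, where the first objective
    (mean accuracy) is maximized and the second (client gap) is minimized."""
    acc_a, gap_a = a
    acc_b, gap_b = b
    no_worse = acc_a >= acc_b and gap_a <= gap_b
    strictly_better = acc_a > acc_b or gap_a < gap_b
    return no_worse and strictly_better

def pareto_front(candidates: List[Tuple[float, float]]) -> List[int]:
    """Indices of candidates not dominated by any other candidate."""
    return [i for i, c in enumerate(candidates)
            if not any(dominates(o, c)
                       for j, o in enumerate(candidates) if j != i)]

# Hypothetical (accuracy, gap) scores for four candidate global models.
models = [(0.82, 0.10), (0.85, 0.15), (0.80, 0.08), (0.84, 0.15)]
print(pareto_front(models))  # model 3 is dominated by model 1
```

In the setting the abstract sketches, each client would then pick, from the returned front, whichever model performs best on its own local validation data and use it as the base for local optimization.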


Cited By

  • (2025) Consensus-Driven Hyperparameter Optimization for Accelerated Model Convergence in Decentralized Federated Learning. Internet of Things 30, 101476. DOI: 10.1016/j.iot.2024.101476. Online publication date: Mar 2025.


Published In

ACM Transactions on Knowledge Discovery from Data, Volume 18, Issue 8
September 2024, 700 pages
EISSN: 1556-472X
DOI: 10.1145/3613713

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 21 August 2024
Online AM: 12 July 2024
Accepted: 27 June 2024
Revised: 14 April 2024
Received: 30 October 2022
Published in TKDD Volume 18, Issue 8


Author Tags

  1. Fair federated learning
  2. hyperparameter optimization
  3. multi-objective optimization

Qualifiers

  • Research-article

Funding Sources

  • NSFC

