DOI: 10.1145/3652583.3657999
Research Article

Federated Multi-Task Learning on Non-IID Data Silos: An Experimental Study

Published: 07 June 2024

Abstract

The innovative Federated Multi-Task Learning (FMTL) approach consolidates the benefits of Federated Learning (FL) and Multi-Task Learning (MTL), enabling collaborative model training on multi-task learning datasets. However, a comprehensive evaluation method that integrates the unique features of both FL and MTL is currently absent from the field. This paper fills this void by introducing a novel framework, FMTL-Bench, for systematic evaluation of the FMTL paradigm. The benchmark covers various aspects at the data, model, and optimization algorithm levels, and comprises seven sets of comparative experiments that encapsulate a wide array of non-independent and identically distributed (Non-IID) data partitioning scenarios. We propose a systematic process for comparing baselines across diverse indicators and conduct a case study on communication expenditure, time, and energy consumption. Through our exhaustive experiments, we aim to provide valuable insights into the strengths and limitations of existing baseline methods, contributing to the ongoing discourse on the optimal application of FMTL in practical scenarios. The source code can be found at https://github.com/youngfish42/FMTL-Benchmark.
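
For illustration, since the Non-IID partitioning scenarios are only named in the abstract, the sketch below shows one common way label-skew data silos are simulated in federated learning experiments: Dirichlet sampling over class labels. This is a minimal sketch under that assumption, not the FMTL-Bench API or the paper's exact partitioning schemes; the function name dirichlet_partition and its num_clients/alpha parameters are hypothetical.

# Illustrative sketch only (not the FMTL-Bench API): Dirichlet-based label-skew
# partitioning, a common way to simulate non-IID client data silos.
import numpy as np

def dirichlet_partition(labels, num_clients=10, alpha=0.5, seed=0):
    # Split sample indices across clients with per-class proportions drawn from Dir(alpha).
    # Smaller alpha yields more skewed (i.e., more strongly non-IID) client distributions.
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        cls_idx = rng.permutation(np.where(labels == cls)[0])
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(cls_idx)).astype(int)
        for client_id, chunk in enumerate(np.split(cls_idx, cuts)):
            client_indices[client_id].extend(chunk.tolist())
    return [np.array(idx) for idx in client_indices]

# Toy usage: 10,000 samples over 10 classes split across 10 simulated data silos.
toy_labels = np.random.randint(0, 10, size=10_000)
silos = dirichlet_partition(toy_labels, num_clients=10, alpha=0.5)
print([len(s) for s in silos])

Varying alpha (e.g., 0.1 versus 10) sweeps from highly skewed to nearly IID silos, which is how such partitioning scenarios are typically parameterized in non-IID studies.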


        Published In

        ICMR '24: Proceedings of the 2024 International Conference on Multimedia Retrieval
        May 2024
        1379 pages
        ISBN: 9798400706196
        DOI: 10.1145/3652583

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Author Tags

        1. dense prediction
        2. federated learning
        3. multi-task learning

        Qualifiers

        • Research-article

        Funding Sources

        • NSFC
        • Shanghai Municipal Science and Technology Major Project

        Conference

        ICMR '24

        Acceptance Rates

        Overall Acceptance Rate: 254 of 830 submissions, 31%
