Federated mutual learning: a collaborative machine learning method for heterogeneous data, models, and objectives

  • Research Article
  • Published in: Frontiers of Information Technology & Electronic Engineering

Abstract

Federated learning (FL) is a deep learning technique that enables clients to collaboratively train a shared model while keeping their data decentralized. However, FL faces several unique challenges, especially heterogeneity: differences in data distributions, computational capabilities, and deployment scenarios among clients call for customized models and objectives. Existing methods such as FedAvg may not effectively accommodate the specific needs of each client. To address the challenges arising from heterogeneity in FL, we first give an overview of the heterogeneities in data, model, and objective (DMO). We then propose a novel framework called federated mutual learning (FML), which enables each client to train a personalized model that accounts for data heterogeneity (DH). A “meme model” serves as an intermediary between the personalized and global models to address model heterogeneity (MH), and we introduce a knowledge distillation technique, deep mutual learning (DML), to transfer knowledge between these two models on local data. To overcome objective heterogeneity (OH), only certain parts of the global model are shared, while the personalized model remains task-specific and is enhanced through mutual learning with the meme model. We evaluate the performance of FML in addressing DMO heterogeneities through experiments and compare it with other commonly used FL methods in similar scenarios. The results demonstrate that FML outperforms these methods and effectively addresses the DMO challenges encountered in the FL setting.
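The core local step described above — mutual learning between a client's personalized model and the meme model on local data — can be sketched as follows. This is a minimal PyTorch illustration under stated assumptions, not the authors' implementation: the model architecture, the loss weights `alpha`/`beta`, and the function names are illustrative choices, and the DML loss is the standard cross-entropy plus KL divergence toward the peer model's predictions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    """Toy classifier; in FML each client may use a different architecture."""
    def __init__(self, hidden):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(784, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 10),
        )

    def forward(self, x):
        return self.net(x)

def dml_loss(logits, peer_logits, y, weight):
    """Deep mutual learning loss: supervised cross-entropy plus KL divergence
    toward the peer model's (detached) softmax predictions."""
    ce = F.cross_entropy(logits, y)
    kl = F.kl_div(F.log_softmax(logits, dim=1),
                  F.softmax(peer_logits.detach(), dim=1),
                  reduction="batchmean")
    return weight * ce + (1.0 - weight) * kl

def local_update(personal, meme, loader, alpha=0.5, beta=0.5, lr=0.1):
    """One local epoch of mutual learning between the personalized model and
    the meme model. Only the meme model would later be sent to the server for
    aggregation (e.g., FedAvg) into the global model."""
    opt_p = torch.optim.SGD(personal.parameters(), lr=lr)
    opt_m = torch.optim.SGD(meme.parameters(), lr=lr)
    for x, y in loader:
        out_p = personal(x)
        out_m = meme(x)
        loss_p = dml_loss(out_p, out_m, y, alpha)  # personalized learns from meme
        loss_m = dml_loss(out_m, out_p, y, beta)   # meme learns from personalized
        opt_p.zero_grad()
        opt_m.zero_grad()
        loss_p.backward()
        loss_m.backward()
        opt_p.step()
        opt_m.step()
    return loss_p.item(), loss_m.item()
```

Because the distillation terms act only on output probabilities, the two models need not share an architecture, which is how the meme model bridges model heterogeneity: each client can keep an arbitrary personalized network while exchanging a common meme-model structure with the server.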


Data availability

The data that support the findings of this study are openly available in public repositories. The MNIST dataset used in this study is publicly available and can be downloaded from the MNIST website (http://yann.lecun.com/exdb/mnist/). The CIFAR-10/100 datasets used in this study are also publicly available and can be downloaded from the CIFAR website (https://www.cs.toronto.edu/~kriz/cifar.html).


Author information


Contributions

All authors contributed to the study conception and design. Tao SHEN, Fengda ZHANG, and Chao WU proposed the motivation of the study. Tao SHEN, Jie ZHANG, and Xinkang JIA designed the method. Tao SHEN, Jie ZHANG, and Zheqi LV performed the experiments. Tao SHEN drafted the paper. All authors commented on previous versions of the paper. Kun KUANG, Chao WU, and Fei WU revised the paper. All authors read and approved the final paper.

Corresponding authors

Correspondence to Chao Wu (吴超) or Fei Wu (吴飞).

Ethics declarations

Fei WU is an editorial board member of Frontiers of Information Technology & Electronic Engineering. Tao SHEN, Jie ZHANG, Xinkang JIA, Fengda ZHANG, Zheqi LV, Kun KUANG, Chao WU, and Fei WU declare that they have no conflict of interest.

Additional information

Project supported by the National Natural Science Foundation of China (Nos. U20A20387, 62006207, and 62037001), the Young Elite Scientists Sponsorship Program by China Association for Science and Technology (No. 2021QNRC001), the Zhejiang Provincial Natural Science Foundation, China (No. LQ21F020020), the Project by Shanghai AI Laboratory, China (No. P22KS00111), the Program of Zhejiang Province Science and Technology (No. 2022C01044), the StarryNight Science Fund of Zhejiang University Shanghai Institute for Advanced Study, China (No. SN-ZJU-SIAS-0010), and the Fundamental Research Funds for the Central Universities, China (Nos. 226-2022-00142 and 226-2022-00051).

About this article

Cite this article

Shen, T., Zhang, J., Jia, X. et al. Federated mutual learning: a collaborative machine learning method for heterogeneous data, models, and objectives. Front Inform Technol Electron Eng 24, 1390–1402 (2023). https://doi.org/10.1631/FITEE.2300098

