FedSEMA: similarity-aware for representation consistency in federated contrastive learning

Abstract

Contrastive learning has emerged as a promising approach to the non-independent and identically distributed (non-IID) problem in federated learning. Existing methods rest on the limiting assumption that all clients are available in every communication round. In cross-device settings, however, resource-constrained clients participate in training only intermittently, so local model updates arrive delayed and representation consistency in the federated contrastive loss degrades. In this paper, we analyse why the federated contrastive loss outperforms traditional methods on the non-IID problem: it not only corrects the local objective towards the global objective but also debiases the local updates. To address the resulting representation inconsistency, we propose Federated Similarity-aware Exponential Moving Average update (FedSEMA), which incorporates a similarity-aware function into the EMA update process. First, FedSEMA adaptively exploits pairwise collaborations between clients to generate personalized knowledge through the similarity-aware EMA update. Second, FedSEMA uses this personalized knowledge to update delayed local models, maintaining the representation consistency that representation learning requires. Extensive experiments on multiple datasets under different non-IID settings demonstrate that FedSEMA is an effective and robust remedy for representation inconsistency.
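The core mechanism described above couples a similarity measure with an exponential moving average over model parameters. The Python sketch below illustrates one way such a similarity-aware EMA update could look for a single delayed client; the cosine-similarity measure, softmax weighting, momentum value, and flattened-parameter representation are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): a similarity-aware EMA update
# for a delayed client model, assuming flattened parameter vectors.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened parameter vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def similarity_aware_ema(delayed_params, peer_params_list, base_momentum=0.9):
    """Update a delayed client's parameters with an EMA whose mixing weights
    are derived from its similarity to each peer model (illustrative only)."""
    sims = np.array([cosine_similarity(delayed_params, p) for p in peer_params_list])
    weights = np.exp(sims) / np.exp(sims).sum()          # softmax over similarities
    personalized = sum(w * p for w, p in zip(weights, peer_params_list))
    # EMA: keep a fraction of the delayed model, mix in the personalized knowledge
    return base_momentum * delayed_params + (1.0 - base_momentum) * personalized

# Toy usage: three peers, one delayed client, 10-dimensional parameter vectors
rng = np.random.default_rng(0)
delayed = rng.normal(size=10)
peers = [rng.normal(size=10) for _ in range(3)]
updated = similarity_aware_ema(delayed, peers)
print(updated.shape)  # (10,)
```

In this toy version, peers whose parameters are more similar to the delayed model receive larger weights, so the personalized knowledge mixed into the EMA stays consistent with the delayed client's representation.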


Data Availability

The MNIST dataset is available at: http://yann.lecun.com/exdb/mnist/. The Fashion-MNIST dataset is available at: https://github.com/zalandoresearch/fashion-mnist. The CIFAR-10 dataset is available at: https://www.cs.toronto.edu/~kriz/cifar.html.
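For convenience, the sketch below shows one way to fetch these three datasets with torchvision; the paper does not specify its data-loading code, so this loader is an assumption.

```python
# Hypothetical convenience loader (torchvision assumed; not from the paper):
# downloads the three evaluation datasets referenced above.
from torchvision import datasets, transforms

def load_datasets(root="./data"):
    to_tensor = transforms.ToTensor()
    mnist = datasets.MNIST(root, train=True, download=True, transform=to_tensor)
    fmnist = datasets.FashionMNIST(root, train=True, download=True, transform=to_tensor)
    cifar10 = datasets.CIFAR10(root, train=True, download=True, transform=to_tensor)
    return mnist, fmnist, cifar10

if __name__ == "__main__":
    mnist, fmnist, cifar10 = load_datasets()
    print(len(mnist), len(fmnist), len(cifar10))  # 60000 60000 50000
```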


Funding

This work was supported in part by the National Key Research and Development Project under grant 2019YFB1706101.

Author information


Contributions

Yanbing Zhou: Methodology, Conceptualization, Writing - Original draft preparation. Yingbo Wu: Project administration, Supervision. Jiyang Zhou: Writing - Review & Editing. Xin Zheng: Data Curation, Formal Analysis.

Corresponding author

Correspondence to Yingbo Wu.

Ethics declarations

Competing interests

The authors have no competing interests to declare that are relevant to the content of this article.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhou, Y., Wu, Y., Zhou, J. et al. FedSEMA: similarity-aware for representation consistency in federated contrastive learning. Appl Intell 54, 301–316 (2024). https://doi.org/10.1007/s10489-023-05193-0
