
FedECCR: Federated Learning Method with Encoding Comparison and Classification Rectification

  • Conference paper
Collaborative Computing: Networking, Applications and Worksharing (CollaborateCom 2023)

Abstract

Federated learning is a distributed training paradigm in which dispersed clients jointly construct a global model under the coordination of a central server, integrating multi-party data through privacy-preserving techniques. In practical applications, however, data distributions are often highly skewed across clients, causing the optimization directions of the client models to diverge; the resulting model bias reduces the accuracy of the global model. Existing methods either compute and transmit a large amount of auxiliary information to correct the optimization direction of the client models, or only coarsely constrain client-model deviation end to end, ignoring targeted treatment of the model's internal structure, so their improvements are limited. To address these problems, we propose FedECCR, a federated optimization algorithm based on encoding contrast and classification correction. The algorithm divides the model into an encoder and a classifier: the encoder is trained with prototype-based contrastive learning, and the classifier receives an unbiased classification correction. This approach notably improves the accuracy of the global model while keeping communication costs low. Experiments on multiple data sets validate the method: FedECCR improves global classification accuracy by approximately 1% to 6% over FedAvg, FedProx, and MOON.
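The prototype-contrastive idea mentioned in the abstract can be illustrated with a small sketch. This is not the authors' implementation: the function names, the InfoNCE-style loss with cosine similarity, and the temperature `tau` are illustrative assumptions about how an encoder might be pulled toward per-class prototype embeddings.

```python
import numpy as np

def class_prototypes(embeddings, labels, num_classes):
    # One "prototype" per class: the mean embedding of that class's samples.
    protos = np.zeros((num_classes, embeddings.shape[1]))
    for c in range(num_classes):
        protos[c] = embeddings[labels == c].mean(axis=0)
    return protos

def prototype_contrastive_loss(z, y, protos, tau=0.5):
    # InfoNCE-style objective: pull each embedding toward its own class
    # prototype and away from the other prototypes, using cosine
    # similarity scaled by a temperature tau.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    p = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    logits = (z @ p.T) / tau                      # shape (batch, num_classes)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(y)), y].mean()

# Toy batch: two well-separated classes in 2-D.
z = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
y = np.array([0, 0, 1, 1])
protos = class_prototypes(z, y, num_classes=2)
good = prototype_contrastive_loss(z, y, protos)
bad = prototype_contrastive_loss(z, 1 - y, protos)  # mislabelled batch
```

In a federated setting, the prototypes would typically be aggregated across clients by the server so that every client's encoder is pulled toward the same shared class anchors; the sketch above only shows the local loss computation.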


References

  1. Wang, X., Gao, H., Huang, K.: Artificial intelligence in collaborative computing. Mobile Netw. Appl. 26, 2389–2391 (2021). https://doi.org/10.1007/s11036-021-01829-y

  2. Yang, J., Zheng, J., Zhang, Z., Chen, Q.I., Wong, D.S., Li, Y.: Security of federated learning for cloud-edge intelligence collaborative computing. Int. J. Intell. Syst., 9290–9308 (2022). https://doi.org/10.1002/int.22992

  3. McMahan, H.B., Moore, E., Ramage, D., Hampson, S., Arcas, B.: Communication-efficient learning of deep networks from decentralized data. arXiv preprint (2016)

  4. Hard, A., et al.: Federated learning for mobile keyboard prediction. arXiv preprint (2018)

  5. Geyer, R.C., Klein, T., Nabi, M.: Differentially private federated learning: a client level perspective. arXiv preprint (2017)

  6. Tan, Y., Long, G., Liu, L., Zhou, T., Jiang, J.: FedProto: federated prototype learning over heterogeneous devices. arXiv preprint (2021)

  7. Reynolds, D.A.: Gaussian mixture models (2009)

  8. Yan, Y., Zhu, L.: A simple data augmentation for feature distribution skewed federated learning (2023)

  9. Zhao, Y., Li, M., Lai, L., Suda, N., Civin, D., Chandra, V.: Federated learning with non-IID data. arXiv preprint (2018)

  10. Tuor, T., Wang, S., Ko, B., Liu, C., Leung, K.K.: Overcoming noisy and irrelevant data in federated learning. arXiv preprint (2020)

  11. Yoshida, N., Nishio, T., Morikura, M., Yamamoto, K., Yonetani, R.: Hybrid-FL: cooperative learning mechanism using non-IID data in wireless networks (2019)

  12. Wicaksana, J., et al.: FedMix: mixed supervised federated learning for medical image segmentation (2022)

  13. Seol, M., Kim, T.: Performance enhancement in federated learning by reducing class imbalance of non-IID data. Sensors, 1152 (2023)

  14. Shin, M., Hwang, C., Kim, J., Park, J., Bennis, M., Kim, S.-L.: XOR mixup: privacy-preserving data augmentation for one-shot federated learning. arXiv preprint (2020)

  15. Jeong, E., Oh, S., Park, J., Kim, H., Bennis, M., Kim, S.-L.: Multi-hop federated private data augmentation with sample compression. arXiv preprint (2019)

  16. Karimireddy, S., Kale, S., Mohri, M., Reddi, S.J., Stich, S.U., Suresh, A.: SCAFFOLD: stochastic controlled averaging for federated learning. In: International Conference on Machine Learning (2020)

  17. Gao, L., Fu, H., Li, L., Chen, Y., Xu, M., Xu, C.-Z.: FedDC: federated learning with non-IID data via local drift decoupling and correction

  18. Liu, Y., Sun, Y., Ding, Z., Shen, L., Liu, B., Tao, D.: Enhance local consistency in federated learning: a multi-step inertial momentum approach (2023)

  19. Li, B., Schmidt, M.N., Alstrøm, T.S., Stich, S.U.: Partial variance reduction improves non-convex federated learning on heterogeneous data (2022)

  20. Li, T., Sahu, A., Zaheer, M., Sanjabi, M., Talwalkar, A., Smith, V.: Federated optimization in heterogeneous networks. arXiv preprint (2018)

  21. Shoham, N., et al.: Overcoming forgetting in federated learning on non-IID data. arXiv preprint (2019)

  22. Yao, X., Sun, L.: Continual local training for better initialization of federated models. In: 2020 IEEE International Conference on Image Processing (ICIP) (2020). https://doi.org/10.1109/icip40778.2020.9190968

  23. Li, H., Krishnan, A., Wu, J., Kolouri, S., Pilly, P.K., Braverman, V.: Lifelong learning with sketched structural regularization. arXiv preprint (2021)

  24. Kirkpatrick, J., et al.: Overcoming catastrophic forgetting in neural networks. Proc. Natl. Acad. Sci., 3521–3526 (2017). https://doi.org/10.1073/pnas.1611835114

  25. Li, Q., He, B., Song, D.: Model-contrastive federated learning. In: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2021). https://doi.org/10.1109/cvpr46437.2021.01057

  26. Chen, T., Kornblith, S., Norouzi, M., Hinton, G.E.: A simple framework for contrastive learning of visual representations. arXiv preprint (2020)

  27. Vanschoren, J.: Meta-learning: a survey. arXiv preprint (2018)

  28. Zhang, Y., Yang, Q.: An overview of multi-task learning. Natl. Sci. Rev., 30–43 (2018). https://doi.org/10.1093/nsr/nwx105

  29. Yang, L., Huang, J., Lin, W., Cao, J.: Personalized federated learning on non-IID data via group-based meta-learning. ACM Trans. Knowl. Discov. Data, 1–20 (2023). https://doi.org/10.1145/3558005

  30. He, C., Ceyani, E., Balasubramanian, K., Annavaram, M., Avestimehr, A.S.: SpreadGNN: serverless multi-task federated learning for graph neural networks. arXiv preprint (2021)

  31. Mu, X., et al.: FedProc: prototypical contrastive federated learning on non-IID data. arXiv preprint (2021)

  32. Miller, J.W., Harrison, M.T.: Mixture models with a prior on the number of components. arXiv preprint (2015)

  33. Hsu, H., Qi, H., Brown, M.: Measuring the effects of non-identical data distribution for federated visual classification. arXiv preprint (2019)


Acknowledgement

This work is supported by the Key Research and Development Program of Zhejiang Province 2023C03194; the National Natural Science Foundation of China under Grant No. 62072146; and the Natural Science Foundation of Zhejiang Province under Grant No. LQ23F020015.

Author information

Correspondence to Mingyao Zhou.


Copyright information

© 2024 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper


Cite this paper

Zeng, Y. et al. (2024). FedECCR: Federated Learning Method with Encoding Comparison and Classification Rectification. In: Gao, H., Wang, X., Voros, N. (eds) Collaborative Computing: Networking, Applications and Worksharing. CollaborateCom 2023. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 563. Springer, Cham. https://doi.org/10.1007/978-3-031-54531-3_4


  • DOI: https://doi.org/10.1007/978-3-031-54531-3_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-54530-6

  • Online ISBN: 978-3-031-54531-3

  • eBook Packages: Computer Science, Computer Science (R0)
