
Optimized Federated Learning on Class-Biased Distributed Data Sources

  • Conference paper
  • In: Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2021)

Abstract

Due to privacy protection requirements, conventional machine learning approaches, which upload all data to a central location, have become less feasible. Federated learning, a privacy-preserving distributed machine learning paradigm, has been proposed as a solution that complies with these requirements. It enables multiple clients to collaboratively learn a shared global model by exchanging model parameters instead of local private data. However, compared with centralized approaches, federated learning suffers from performance degradation when trained on non-independently and identically distributed (non-i.i.d.) data across the participants. Meanwhile, the class imbalance problem is frequently encountered in practice and leads to poor predictions on minority classes. In this work, we propose FedBGVS to alleviate class bias by employing a balanced global validation set: the model aggregation algorithm is refined using the Balanced Global Validation Score (BGVS). We evaluate our method in experiments conducted on the classical benchmark datasets MNIST, SVHN and CIFAR-10 as well as on a public clinical dataset, ISIC-2019. The empirical results demonstrate that our proposed methods outperform state-of-the-art federated learning algorithms under label distribution skew and class imbalance.
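The aggregation idea sketched in the abstract can be illustrated in a few lines: each client model is scored on a class-balanced validation set held at the server, and the (normalized) scores weight the parameter average. This is a minimal sketch under stated assumptions; the macro-accuracy form of the score, the normalization, and the function names below are illustrative choices, not the paper's exact BGVS definition.

```python
import numpy as np

def balanced_validation_score(preds, labels, num_classes):
    """Macro-averaged per-class accuracy on a class-balanced validation set
    (an illustrative stand-in for the BGVS)."""
    per_class = []
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            per_class.append((preds[mask] == c).mean())
    return float(np.mean(per_class))

def score_weighted_aggregate(client_params, scores):
    """Aggregate client parameter lists, weighting each client by its
    normalized validation score instead of plain FedAvg sample counts."""
    w = np.asarray(scores, dtype=float)
    w = w / w.sum()
    # zip(*client_params) groups the same layer across all clients.
    return [sum(wi * p for wi, p in zip(w, layer_group))
            for layer_group in zip(*client_params)]
```

Clients whose local class distribution matches the balanced validation set score higher and thus contribute more to the global model, which is how a balanced validation set can counteract label distribution skew during aggregation.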

Y. Mou and J. Geng contributed equally to this work.



Author information


Correspondence to Yongli Mou.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Mou, Y., Geng, J., Welten, S., Rong, C., Decker, S., Beyan, O. (2021). Optimized Federated Learning on Class-Biased Distributed Data Sources. In: Kamp, M., et al. Machine Learning and Principles and Practice of Knowledge Discovery in Databases. ECML PKDD 2021. Communications in Computer and Information Science, vol 1524. Springer, Cham. https://doi.org/10.1007/978-3-030-93736-2_13

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-93736-2_13

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-93735-5

  • Online ISBN: 978-3-030-93736-2

  • eBook Packages: Computer Science, Computer Science (R0)
