FedPHP: Federated Personalization with Inherited Private Models

  • Conference paper
  • In: Machine Learning and Knowledge Discovery in Databases. Research Track (ECML PKDD 2021)

Abstract

Federated Learning (FL) generates a single global model via collaborating distributed clients without leaking private data. However, the statistical heterogeneity of non-iid data across clients poses a fundamental challenge to each client's model personalization process. Our key observation is that the global model newly downloaded from the server may perform poorly on local clients, while it can improve after adequate personalization steps. Inspired by this, we advocate that the hard-won personalized model in each communication round should be rationally exploited, whereas standard FL methods directly overwrite the previous personalized models. Specifically, we propose a novel concept named “inHerited Private Model” (HPM) for each local client as a temporal ensembling of its historical personalized models and exploit it to supervise the personalization process in the next global round. We explore various types of knowledge transfer to facilitate the personalization process. We provide both theoretical analysis and extensive experimental studies to verify the superiority of our algorithm.
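
A minimal PyTorch-style sketch of the idea described in the abstract follows: each client keeps an inHerited Private Model (HPM) as a moving average ("temporal ensembling") of its historical personalized models, and the HPM supervises the next round's personalization through a knowledge-transfer term. The EMA update rule, the KL-based distillation loss, and the names update_hpm, personalize, mu, and lambda_kt are illustrative assumptions made for this sketch, not necessarily the paper's exact formulation.

import copy

import torch
import torch.nn.functional as F


@torch.no_grad()
def update_hpm(hpm, personalized, mu=0.9):
    # Assumed EMA-style "temporal ensembling": blend the newly personalized
    # weights into the client's inHerited Private Model, H <- mu*H + (1-mu)*W.
    for p_h, p_w in zip(hpm.parameters(), personalized.parameters()):
        p_h.mul_(mu).add_(p_w, alpha=1.0 - mu)


def personalize(global_model, hpm, loader, epochs=1, lr=0.01, lambda_kt=0.1):
    # One round of local personalization supervised by the client's HPM (sketch).
    model = copy.deepcopy(global_model)  # start from the freshly downloaded global model
    hpm.eval()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            logits = model(x)
            with torch.no_grad():
                teacher = hpm(x)  # the HPM acts as a private "teacher"
            # Task loss plus a distillation-style knowledge-transfer term
            # (one possible instantiation of the knowledge transfer explored in the paper).
            loss = F.cross_entropy(logits, y) + lambda_kt * F.kl_div(
                F.log_softmax(logits, dim=1),
                F.softmax(teacher, dim=1),
                reduction="batchmean",
            )
            opt.zero_grad()
            loss.backward()
            opt.step()
    update_hpm(hpm, model)  # inherit the hard-won personalized model into the HPM
    return model

In a full federated loop, a routine like this would presumably run on each selected client after it downloads the server's aggregated model, with the HPM kept strictly local (private) and only the personalized weights contributing to server-side aggregation.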

Notes

  1. https://pytorch.org/docs/stable/torchvision/models.html.

  2. https://www.tensorflow.org/tutorials/images/cnn.

Acknowledgments

This research was partially supported by the National Natural Science Foundation of China (Grant Nos. 61773198, 61632004, and 61921006), the NSFC-NRF Joint Research Project under Grant 61861146001, and the Collaborative Innovation Center of Novel Software Technology and Industrialization. We thank the Huawei Noah’s Ark Lab NetMIND Research Team for funding this research. Professor De-Chuan Zhan is the corresponding author.

Author information

Corresponding author

Correspondence to De-Chuan Zhan.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Li, X.C., Zhan, D.C., Shao, Y., Li, B., Song, S. (2021). FedPHP: Federated Personalization with Inherited Private Models. In: Oliver, N., Pérez-Cruz, F., Kramer, S., Read, J., Lozano, J.A. (eds) Machine Learning and Knowledge Discovery in Databases. Research Track. ECML PKDD 2021. Lecture Notes in Computer Science, vol 12975. Springer, Cham. https://doi.org/10.1007/978-3-030-86486-6_36

  • DOI: https://doi.org/10.1007/978-3-030-86486-6_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86485-9

  • Online ISBN: 978-3-030-86486-6

  • eBook Packages: Computer Science, Computer Science (R0)
