
Clustered Federated Learning with Inference Hash Codes Based Local Sensitive Hashing

Conference paper in: Information Security and Cryptology (Inscrypt 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14527)


Abstract

Federated Learning (FL) is a distributed paradigm that enables clients to train a global model collaboratively while protecting client privacy. During FL training, statistical heterogeneity across clients can degrade the overall performance of the global model and its generalization ability on each client, making it difficult for training to converge. This paper proposes an efficient clustered FL (cFL) method called FedCC, which clusters clients based on their inference results on a public dataset. Because inference results may leak a client's data distribution, we use Locality Sensitive Hashing (LSH) to transform them into Inference Hash Codes (IHCs), which are irreversible but still support similarity calculations. The server compares the similarity of IHCs between clients and performs dynamic clustering with the DBSCAN algorithm. FedCC also provides an elegant method for clients to quickly select the appropriate cluster model without downloading all cluster models. We evaluated FedCC on four commonly used datasets and compared it against seven baselines. Experimental results show that FedCC converges faster than the baselines and achieves accuracy 1.66% higher than the state-of-the-art baseline. Finally, we validated the robustness of FedCC against Byzantine attacks, in which malicious clients upload negative gradients to reduce model accuracy and prevent convergence.
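The pipeline the abstract describes — clients run inference on a shared public dataset, LSH turns the results into irreversible hash codes, and the server clusters clients by code similarity with DBSCAN — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the SimHash-style random-projection LSH, the function names, and the distance threshold are all assumptions made for the example.

```python
import numpy as np

def inference_hash_code(logits, n_bits=64, seed=0):
    """SimHash-style LSH: project the flattened inference results onto
    random hyperplanes and keep only the sign bits, so the server never
    sees the raw, distribution-revealing logits."""
    rng = np.random.default_rng(seed)  # shared seed: all clients use the same hyperplanes
    planes = rng.standard_normal((logits.size, n_bits))
    return (logits.ravel() @ planes >= 0).astype(np.uint8)

def ihc_similarity(a, b):
    """Fraction of matching bits, i.e. 1 - normalized Hamming distance."""
    return float(np.mean(a == b))

def dbscan(codes, eps=0.1, min_pts=2):
    """Minimal DBSCAN over hash codes, using Hamming distance
    (1 - similarity). Returns one cluster label per client; -1 = noise."""
    n = len(codes)
    dist = np.array([[1.0 - ihc_similarity(codes[i], codes[j])
                      for j in range(n)] for i in range(n)])
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        neighbors = list(np.flatnonzero(dist[i] <= eps))
        if len(neighbors) < min_pts:
            continue  # not a core point; stays noise unless absorbed later
        labels[i] = cluster
        queue = neighbors
        while queue:  # expand the cluster through density-reachable points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster
                nb = list(np.flatnonzero(dist[j] <= eps))
                if len(nb) >= min_pts:
                    queue.extend(nb)
        cluster += 1
    return labels
```

Because every client derives the hyperplanes from the same seed, codes computed locally are directly comparable on the server: two clients whose inference results point in nearly the same direction produce nearly identical bit strings, while dissimilar clients disagree on roughly half the bits.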



Acknowledgements

We thank the anonymous reviewers for their helpful feedback. This work is supported by the National Key Research and Development Program of China (Grant No. 2021YFB0301100).

Author information

Correspondence to Ximeng Liu.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Tan, Z., Liu, X., Che, Y., Wang, Y. (2024). Clustered Federated Learning with Inference Hash Codes Based Local Sensitive Hashing. In: Ge, C., Yung, M. (eds) Information Security and Cryptology. Inscrypt 2023. Lecture Notes in Computer Science, vol 14527. Springer, Singapore. https://doi.org/10.1007/978-981-97-0945-8_5


  • DOI: https://doi.org/10.1007/978-981-97-0945-8_5


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-0944-1

  • Online ISBN: 978-981-97-0945-8

  • eBook Packages: Computer Science (R0)
