
Rule-Based Collaborative Learning with Heterogeneous Local Learning Models

  • Conference paper
  • Published in: Advances in Knowledge Discovery and Data Mining (PAKDD 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13280)


Abstract

Collaborative learning schemes such as federated learning make it possible to train a global prediction model in a distributed way without sharing the training data. However, most existing schemes adopt deep learning models and require all local models to have the same architecture as the global model, which makes them unsuitable for applications running on resource- and bandwidth-constrained devices. In this paper, we present CloREF, a novel rule-based collaborative learning framework that allows participating devices to use different local learning models. A rule extraction method is first proposed to bridge the heterogeneity of local learning models by approximating their decision boundaries. A novel rule fusion and selection mechanism is then designed, based on evolutionary optimization, to integrate the knowledge learned by all local models. Experimental results on a number of synthesized and real-world datasets demonstrate that the rules generated by our rule extraction method mimic the behaviors of various learning models with high fidelity (>0.95 in most tests), and that CloREF achieves comparable, and sometimes better, AUC than the best-performing model trained centrally.
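The fidelity score quoted in the abstract measures how often the extracted rules agree with the local model they approximate, on unlabeled samples, without reference to ground-truth labels. The paper's own extraction algorithm is not reproduced on this page; the sketch below is only a minimal illustration of the fidelity notion, with a toy "black-box" model and a hand-written rule set standing in for a real local model and CloREF's extracted rules (all names here are ours, not the paper's):

```python
import random

def black_box(x):
    # Toy stand-in for one device's local model (e.g. an SVM or a small
    # neural net): predicts class 1 above a curved decision boundary.
    return 1 if x[1] > 1.0 - x[0] * x[0] else 0

def rule_set(x):
    # A small hand-written set of axis-aligned IF-THEN rules that
    # roughly approximates the curved boundary above.
    if x[1] > 1.0:
        return 1
    if abs(x[0]) > 0.8 and x[1] > 0.4:
        return 1
    return 0

def fidelity(model, rules, samples):
    # Fidelity = fraction of samples on which the rules reproduce the
    # model's own predictions (no ground-truth labels are needed).
    return sum(model(x) == rules(x) for x in samples) / len(samples)

random.seed(0)
samples = [(random.uniform(-1, 1), random.uniform(0, 2)) for _ in range(5000)]
print(f"fidelity = {fidelity(black_box, rule_set, samples):.3f}")
```

A rule set extracted by a real method would be scored the same way; rule sets whose fidelity approaches 1.0 can act as a model-agnostic proxy for heterogeneous local models during fusion.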



Author information

Correspondence to Haibo Zhang.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Pang, Y., Zhang, H., Deng, J.D., Peng, L., Teng, F. (2022). Rule-Based Collaborative Learning with Heterogeneous Local Learning Models. In: Gama, J., Li, T., Yu, Y., Chen, E., Zheng, Y., Teng, F. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2022. Lecture Notes in Computer Science (LNAI), vol 13280. Springer, Cham. https://doi.org/10.1007/978-3-031-05933-9_50


  • DOI: https://doi.org/10.1007/978-3-031-05933-9_50

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05932-2

  • Online ISBN: 978-3-031-05933-9

  • eBook Packages: Computer Science (R0)
