
Fourier Enhanced MLP with Adaptive Model Pruning for Efficient Federated Recommendation

  • Conference paper
  • First Online:
Knowledge Science, Engineering and Management (KSEM 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13370)


Abstract

Federated learning (FL) is gradually gaining traction as the de facto standard for distributed recommendation model training, taking advantage of on-device user data while reducing server costs. However, the computation resources of user devices in FL are usually far more limited than those of servers in a datacenter, which hinders the application of advanced recommendation models (e.g., Transformer-based models) in FL. In addition, models with better recommendation performance tend to have more parameters, which increases the communication cost between servers and user devices. Existing federated recommendation methods therefore struggle to achieve a good trade-off between recommendation accuracy and computation and communication costs. In response, we propose a novel federated recommendation framework for efficient recommendation. First, we propose an all-MLP model that replaces the self-attention sublayer in the Transformer encoder with a Fourier sublayer, in which noise in the user interaction data is effectively attenuated using the Fast Fourier Transform and learnable filters. Second, we adopt an adaptive model pruning technique in the FL framework, which significantly reduces the model size without hurting recommendation performance. Extensive experiments on four real-world datasets demonstrate that our method outperforms existing federated recommendation methods and strikes a good trade-off between recommendation performance and model size.
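The two ideas summarized above can be sketched in code. The PyTorch snippet below is a minimal illustration, not the authors' implementation: it assumes interaction sequences are padded to a fixed `max_len`, the layer sizes and names are illustrative, and `prune_by_magnitude` is a generic magnitude-based stand-in for the paper's adaptive pruning scheme.

```python
import torch
import torch.nn as nn

class FourierSublayer(nn.Module):
    """Sketch of a Fourier sublayer replacing Transformer self-attention:
    FFT along the sequence axis, elementwise product with a learnable
    complex filter (attenuating noisy frequency components), inverse FFT,
    then the usual residual connection and LayerNorm."""

    def __init__(self, max_len: int = 50, hidden_size: int = 64, dropout: float = 0.2):
        super().__init__()
        # One learnable complex coefficient per (frequency bin, hidden dim),
        # stored as a real tensor with a trailing dimension of 2 (re, im).
        self.filter = nn.Parameter(torch.randn(1, max_len // 2 + 1, hidden_size, 2) * 0.02)
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, max_len, hidden_size); sequences padded/truncated to max_len.
        freq = torch.fft.rfft(x, dim=1, norm="ortho")                  # time -> frequency
        freq = freq * torch.view_as_complex(self.filter)               # learnable filtering
        out = torch.fft.irfft(freq, n=x.size(1), dim=1, norm="ortho")  # frequency -> time
        return self.norm(x + self.dropout(out))


def prune_by_magnitude(state_dict: dict, keep_ratio: float = 0.5) -> dict:
    """Generic magnitude pruning of floating-point weights before
    client/server communication (a simplified stand-in for the paper's
    adaptive scheme): keep the largest-magnitude entries of each tensor
    and zero out the rest."""
    pruned = {}
    for name, w in state_dict.items():
        flat = w.abs().flatten()
        k = max(1, int(keep_ratio * flat.numel()))
        threshold = torch.topk(flat, k).values.min()
        pruned[name] = torch.where(w.abs() >= threshold, w, torch.zeros_like(w))
    return pruned
```

In a setup like the one described, the filtered output of the Fourier sublayer would feed the standard position-wise feed-forward sublayer of the encoder block, and a pruned state dict of this kind is what would be exchanged between the server and clients to cut communication cost.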


Notes

  1. https://gdpr-info.eu.

  2. https://oag.ca.gov/privacy/ccpa.

  3. http://jmcauley.ucsd.edu/data/amazon/.

  4. https://grouplens.org/datasets/movielens/.

  5. https://cseweb.ucsd.edu/~jmcauley/datasets.html#steam_data.


Acknowledgements

This work is supported by the National Key Research and Development Program of China under Grant 2021YFB3101503 and by the National Natural Science Foundation of China under Grant 61931019.

Author information


Corresponding author

Correspondence to Guangjun Wu.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ai, Z., Wu, G., Li, B., Wang, Y., Chen, C. (2022). Fourier Enhanced MLP with Adaptive Model Pruning for Efficient Federated Recommendation. In: Memmi, G., Yang, B., Kong, L., Zhang, T., Qiu, M. (eds) Knowledge Science, Engineering and Management. KSEM 2022. Lecture Notes in Computer Science (LNAI), vol 13370. Springer, Cham. https://doi.org/10.1007/978-3-031-10989-8_28


  • DOI: https://doi.org/10.1007/978-3-031-10989-8_28

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-10988-1

  • Online ISBN: 978-3-031-10989-8

  • eBook Packages: Computer Science, Computer Science (R0)
