Transformer-Based Cache Replacement Policy Learning

  • Conference paper

Web Information Systems Engineering – WISE 2022 (WISE 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13724)

Abstract

The cache replacement policy is critical in computer systems: it determines which data to evict from the cache when new data arrives. A good replacement policy significantly increases the cache hit rate and reduces system latency. A few heuristic methods are designed for specific access patterns, but they perform poorly on diverse and complex access patterns. To handle such patterns, we formulate cache replacement as a question-answering matching problem and design a Transformer-based cache replacement (TBCR) model. TBCR learns access patterns with a Transformer encoder, an architecture that performs well even on complex access patterns. We evaluate TBCR on six memory-intensive Standard Performance Evaluation Corporation (SPEC) applications, where it increases cache hit rates by 3% over the state-of-the-art.
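To make the problem setting concrete, the sketch below (not the authors' code; all names such as `Cache`, `lru`, and `hit_rate` are illustrative assumptions) simulates a small fully-associative cache whose eviction decision is delegated to a pluggable policy function. A learned model like TBCR would occupy the same slot as the baseline heuristic here, scoring candidate victims from the access history.

```python
# Minimal cache-replacement simulator: the eviction policy is a callable
# that receives the cached lines and the access history and returns the
# victim line. This is the interface a learned policy would implement.

class Cache:
    def __init__(self, capacity, evict):
        self.capacity = capacity
        self.evict = evict      # callable: (lines, history) -> victim line
        self.lines = []         # kept in recency order; lines[0] is LRU
        self.history = []       # full access trace observed so far
        self.hits = 0
        self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        self.history.append(addr)
        if addr in self.lines:
            self.hits += 1
            self.lines.remove(addr)
            self.lines.append(addr)   # move to most-recently-used position
            return True
        if len(self.lines) >= self.capacity:
            victim = self.evict(self.lines, self.history)
            self.lines.remove(victim)
        self.lines.append(addr)
        return False


def lru(lines, history):
    """Heuristic baseline: evict the least-recently-used line."""
    return lines[0]


def hit_rate(trace, capacity, evict):
    cache = Cache(capacity, evict)
    for addr in trace:
        cache.access(addr)
    return cache.hits / cache.accesses
```

A cyclic trace larger than the cache (e.g. `1, 2, 3` repeated with capacity 2) drives LRU's hit rate to zero, while even a simple opposite heuristic scores hits — the kind of pattern-dependence that motivates learning the policy from the trace instead of fixing one heuristic.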



Acknowledgements

This work is supported by the National Natural Science Foundation of China (No. 61832001) and Shenzhen Science and Technology Program (No. JCYJ20210324121213037).

Author information

Correspondence to Jie Shao.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Yang, M., Yang, C., Shao, J. (2022). Transformer-Based Cache Replacement Policy Learning. In: Chbeir, R., Huang, H., Silvestri, F., Manolopoulos, Y., Zhang, Y. (eds) Web Information Systems Engineering – WISE 2022. WISE 2022. Lecture Notes in Computer Science, vol 13724. Springer, Cham. https://doi.org/10.1007/978-3-031-20891-1_35

  • DOI: https://doi.org/10.1007/978-3-031-20891-1_35

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20890-4

  • Online ISBN: 978-3-031-20891-1

  • eBook Packages: Computer Science, Computer Science (R0)
