A Hardware Scheme for Data Prefetching

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1823)

Abstract

Prefetching brings data into the cache before the processor needs it, thereby eliminating potential cache misses. There are two major prefetching schemes. In a software scheme, the compiler predicts memory access patterns and places prefetch instructions in the code. In a hardware scheme, hardware predicts memory access patterns at runtime and brings data into the cache before the processor requires it.
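
For concreteness, the sketch below shows what a software-scheme prefetch can look like in practice: it hand-places GCC/Clang's __builtin_prefetch in a simple array loop, requesting data a fixed distance ahead of the current index. The prefetch distance DIST and the function name are illustrative assumptions rather than anything specified in the paper.

```c
/* A hand-written example of the software approach described above,
 * using the GCC/Clang builtin __builtin_prefetch. In a compiler-based
 * scheme these instructions would be inserted automatically; the
 * prefetch distance DIST is an assumed tuning parameter. */
#include <stddef.h>

#define DIST 16  /* assumed prefetch distance, in array elements */

double sum_with_prefetch(const double *a, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        if (i + DIST < n)
            __builtin_prefetch(&a[i + DIST], 0, 1);  /* read, low temporal locality */
        sum += a[i];
    }
    return sum;
}
```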

This paper proposes a hardware prefetching scheme in which a second processor prefetches data for the primary processor. The scheme does not predict memory access patterns; instead, the second processor runs ahead of the primary processor, detects future memory accesses, and prefetches the referenced data.
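
To make the run-ahead idea concrete, the following sketch emulates it in software under the assumption of a pointer-chasing workload: a helper thread stands in for the second processor, walking the same linked list a fixed distance ahead of the primary traversal and issuing prefetches for the nodes it visits. The thread-based setup, node layout, and AHEAD distance are illustrative assumptions, not details of the paper's hardware design.

```c
/* A minimal software analogue of the run-ahead idea. The paper's scheme
 * uses a second processor in hardware; here a helper thread merely warms
 * the cache for the main traversal, with no synchronization between the two. */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

#define AHEAD 8  /* assumed run-ahead distance, in nodes */

struct node {
    long value;
    struct node *next;
};

/* Helper ("second processor"): skip AHEAD nodes, then prefetch the rest. */
static void *run_ahead(void *arg)
{
    struct node *p = arg;
    for (int i = 0; i < AHEAD && p != NULL; i++)
        p = p->next;
    while (p != NULL) {
        __builtin_prefetch(p, 0, 1);  /* read, low temporal locality */
        p = p->next;
    }
    return NULL;
}

/* Primary traversal: sums the list while the helper runs ahead of it. */
static long sum_list(struct node *head)
{
    pthread_t helper;
    pthread_create(&helper, NULL, run_ahead, head);

    long sum = 0;
    for (struct node *p = head; p != NULL; p = p->next)
        sum += p->value;

    pthread_join(&helper, NULL);
    return sum;
}

int main(void)
{
    enum { N = 1000000 };
    struct node *nodes = malloc(N * sizeof *nodes);
    for (long i = 0; i < N; i++) {
        nodes[i].value = i;
        nodes[i].next = (i + 1 < N) ? &nodes[i + 1] : NULL;
    }
    printf("sum = %ld\n", sum_list(&nodes[0]));
    free(nodes);
    return 0;
}
```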

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Manoharan, S., Kim, SM. (2000). A Hardware Scheme for Data Prefetching. In: Bubak, M., Afsarmanesh, H., Hertzberger, B., Williams, R. (eds) High Performance Computing and Networking. HPCN-Europe 2000. Lecture Notes in Computer Science, vol 1823. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45492-6_35

  • DOI: https://doi.org/10.1007/3-540-45492-6_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67553-2

  • Online ISBN: 978-3-540-45492-2

  • eBook Packages: Springer Book Archive
