Cited By
Dave H., Kotak N. (2024). An analysis of cache configuration's impacts on the miss rate of big data applications using gem5. Serbian Journal of Electrical Engineering, 21(2), 217–234. https://doi.org/10.2298/SJEE2402217D