Abstract:
Distributed key-value stores employ large main memory caches to mitigate the high costs of disk access. A challenge for such caches is that large scale distributed stores simultaneously face multiple workloads, often with drastically different characteristics. Interference between such competing workloads leads to performance degradation through inefficient use of the main memory cache. This paper diagnoses the cache interference seen for representative workloads and then develops A-Cache, an adaptive set of main memory caching methods for distributed key-value stores. Focused on read performance for common workload patterns, A-Cache leads to throughput improvements of up to 150% for competing data-intensive applications running on server class machines.
Date of Conference: 23-27 September 2013
Date Added to IEEE Xplore: 09 January 2014
Electronic ISBN: 978-1-4799-0898-1