Abstract:
Emerging network applications require packet classification at line speed on multiple header fields. Fast packet classification requires careful attention to memory resources due to the size and speed limitations of the SRAM and DRAM used to implement the function. In this paper, we investigate memory architectures that can be used to implement a wide range of packet classification caches. In particular, we examine their performance under real network traces in order to identify the features that have the greatest impact. Through experiments, we show that a cache's associativity, replacement policy, and hash function all contribute, in varying degrees, to the cache's overall performance. Specifically, we show that small amounts of associativity can yield large performance gains, that replacement policies give modest improvements for under-provisioned caches, and that faster, less complex hash functions can improve overall cache performance.
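For illustration only (not code from the paper), the following minimal Python sketch shows the kind of design the abstract discusses: a set-associative cache of packet-classification results, keyed by a hash of the flow's 5-tuple, with LRU replacement within each set. The class name, CRC32 hash, and parameters are assumptions chosen to make the three factors the abstract highlights (associativity, replacement policy, hash function) explicit.

# Hypothetical sketch of a set-associative packet-classification cache.
# Associativity, the replacement policy (LRU here), and the hash function
# are the three tunable factors discussed in the abstract.
from collections import OrderedDict
import zlib

class SetAssociativeFlowCache:
    def __init__(self, num_sets=1024, associativity=4):
        self.num_sets = num_sets
        self.associativity = associativity
        # Each set is an OrderedDict used as an LRU list (oldest entry first).
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def _index(self, key):
        # A fast, simple hash (CRC32 here, as an assumed placeholder) spreads
        # flows across sets; cheaper hashes can improve overall lookup speed.
        return zlib.crc32(repr(key).encode()) % self.num_sets

    def lookup(self, key):
        s = self.sets[self._index(key)]
        if key in s:
            s.move_to_end(key)      # refresh LRU position on a hit
            return s[key]
        return None                 # miss: fall back to full classification

    def insert(self, key, classification):
        s = self.sets[self._index(key)]
        if key in s:
            s.move_to_end(key)
        elif len(s) >= self.associativity:
            s.popitem(last=False)   # evict the least-recently-used entry
        s[key] = classification

# Example usage: cache the classification result for one TCP flow's 5-tuple.
cache = SetAssociativeFlowCache()
flow = ("10.0.0.1", "10.0.0.2", 1234, 80, "TCP")
if cache.lookup(flow) is None:
    cache.insert(flow, "rule_42")   # hypothetical result of the slow classifier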
Date of Conference: 01-01 October 2003
Date Added to IEEE Xplore: 26 February 2004
Print ISBN: 0-7803-7788-5
Print ISSN: 1531-2216