Abstract:
The k-nearest neighbor (kNN) algorithm is one of the most popular algorithms in machine learning owing to its simplicity, versatility, and implementation viability without any assumptions about the data. However, for large-scale data it incurs a large amount of memory access and computation, resulting in long latency and high power consumption. In this paper, we present a kNN hardware accelerator in 65nm CMOS. The accelerator combines an in-memory computing SRAM, recently developed for binarized deep neural networks, with digital hardware that performs top-k sorting. We designed and simulated the kNN accelerator, which processes up to 17.9 million query vectors per second while consuming 11.8 mW, demonstrating a >4.8X energy improvement over prior works.
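To make the described data flow concrete, the following is a minimal software sketch of the kNN query that the accelerator performs in hardware; it is not the authors' design. It assumes the in-memory computing SRAM evaluates Hamming distances between a binarized query and binarized stored vectors (an assumption based on its binarized-DNN origin), and it models the digital top-k sorter as a partial selection of the k smallest distances. The function name knn_query and the NumPy-based modelling are illustrative only.

import numpy as np

def knn_query(query_bits, stored_bits, k):
    # stored_bits: (N, D) binary database; query_bits: (D,) binary query.
    # The in-memory-computing SRAM is modelled here as a Hamming-distance
    # computation over the binarized vectors (assumption, see above).
    distances = np.count_nonzero(stored_bits != query_bits, axis=1)
    # The digital top-k sorter is modelled as a partial selection that
    # returns the indices of the k nearest stored vectors (unordered).
    return np.argpartition(distances, k)[:k]

# Illustrative usage: 1,000 stored 256-bit vectors, one random query, k = 8.
rng = np.random.default_rng(0)
database = rng.integers(0, 2, size=(1000, 256), dtype=np.uint8)
query = rng.integers(0, 2, size=256, dtype=np.uint8)
print(knn_query(query, database, k=8))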
Date of Conference: 29-31 July 2019
Date Added to IEEE Xplore: 05 September 2019